Sep 30 14:22:40 localhost kernel: Linux version 5.14.0-617.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Mon Sep 15 21:46:13 UTC 2025
Sep 30 14:22:40 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Sep 30 14:22:40 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 14:22:40 localhost kernel: BIOS-provided physical RAM map:
Sep 30 14:22:40 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 30 14:22:40 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 30 14:22:40 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 30 14:22:40 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Sep 30 14:22:40 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Sep 30 14:22:40 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 30 14:22:40 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 30 14:22:40 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Sep 30 14:22:40 localhost kernel: NX (Execute Disable) protection: active
Sep 30 14:22:40 localhost kernel: APIC: Static calls initialized
Sep 30 14:22:40 localhost kernel: SMBIOS 2.8 present.
Sep 30 14:22:40 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Sep 30 14:22:40 localhost kernel: Hypervisor detected: KVM
Sep 30 14:22:40 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 30 14:22:40 localhost kernel: kvm-clock: using sched offset of 5984099911 cycles
Sep 30 14:22:40 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 30 14:22:40 localhost kernel: tsc: Detected 2800.000 MHz processor
Sep 30 14:22:40 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 30 14:22:40 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 30 14:22:40 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Sep 30 14:22:40 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 30 14:22:40 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Sep 30 14:22:40 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Sep 30 14:22:40 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Sep 30 14:22:40 localhost kernel: Using GB pages for direct mapping
Sep 30 14:22:40 localhost kernel: RAMDISK: [mem 0x2d7d0000-0x32bdffff]
Sep 30 14:22:40 localhost kernel: ACPI: Early table checksum verification disabled
Sep 30 14:22:40 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Sep 30 14:22:40 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 14:22:40 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 14:22:40 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 14:22:40 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Sep 30 14:22:40 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 14:22:40 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 14:22:40 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Sep 30 14:22:40 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Sep 30 14:22:40 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Sep 30 14:22:40 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Sep 30 14:22:40 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Sep 30 14:22:40 localhost kernel: No NUMA configuration found
Sep 30 14:22:40 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Sep 30 14:22:40 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Sep 30 14:22:40 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Sep 30 14:22:40 localhost kernel: Zone ranges:
Sep 30 14:22:40 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Sep 30 14:22:40 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Sep 30 14:22:40 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Sep 30 14:22:40 localhost kernel:   Device   empty
Sep 30 14:22:40 localhost kernel: Movable zone start for each node
Sep 30 14:22:40 localhost kernel: Early memory node ranges
Sep 30 14:22:40 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Sep 30 14:22:40 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Sep 30 14:22:40 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Sep 30 14:22:40 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Sep 30 14:22:40 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 30 14:22:40 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 30 14:22:40 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Sep 30 14:22:40 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Sep 30 14:22:40 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 30 14:22:40 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 30 14:22:40 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 30 14:22:40 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 30 14:22:40 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 30 14:22:40 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 30 14:22:40 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 30 14:22:40 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 30 14:22:40 localhost kernel: TSC deadline timer available
Sep 30 14:22:40 localhost kernel: CPU topo: Max. logical packages:   8
Sep 30 14:22:40 localhost kernel: CPU topo: Max. logical dies:       8
Sep 30 14:22:40 localhost kernel: CPU topo: Max. dies per package:   1
Sep 30 14:22:40 localhost kernel: CPU topo: Max. threads per core:   1
Sep 30 14:22:40 localhost kernel: CPU topo: Num. cores per package:     1
Sep 30 14:22:40 localhost kernel: CPU topo: Num. threads per package:   1
Sep 30 14:22:40 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Sep 30 14:22:40 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 30 14:22:40 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Sep 30 14:22:40 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Sep 30 14:22:40 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Sep 30 14:22:40 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Sep 30 14:22:40 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Sep 30 14:22:40 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Sep 30 14:22:40 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Sep 30 14:22:40 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Sep 30 14:22:40 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Sep 30 14:22:40 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Sep 30 14:22:40 localhost kernel: Booting paravirtualized kernel on KVM
Sep 30 14:22:40 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 30 14:22:40 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Sep 30 14:22:40 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Sep 30 14:22:40 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Sep 30 14:22:40 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Sep 30 14:22:40 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Sep 30 14:22:40 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 14:22:40 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64", will be passed to user space.
Sep 30 14:22:40 localhost kernel: random: crng init done
Sep 30 14:22:40 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Sep 30 14:22:40 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 30 14:22:40 localhost kernel: Fallback order for Node 0: 0 
Sep 30 14:22:40 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Sep 30 14:22:40 localhost kernel: Policy zone: Normal
Sep 30 14:22:40 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 30 14:22:40 localhost kernel: software IO TLB: area num 8.
Sep 30 14:22:40 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Sep 30 14:22:40 localhost kernel: ftrace: allocating 49329 entries in 193 pages
Sep 30 14:22:40 localhost kernel: ftrace: allocated 193 pages with 3 groups
Sep 30 14:22:40 localhost kernel: Dynamic Preempt: voluntary
Sep 30 14:22:40 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 30 14:22:40 localhost kernel: rcu:         RCU event tracing is enabled.
Sep 30 14:22:40 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Sep 30 14:22:40 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Sep 30 14:22:40 localhost kernel:         Rude variant of Tasks RCU enabled.
Sep 30 14:22:40 localhost kernel:         Tracing variant of Tasks RCU enabled.
Sep 30 14:22:40 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 30 14:22:40 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Sep 30 14:22:40 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 14:22:40 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 14:22:40 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 14:22:40 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Sep 30 14:22:40 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 30 14:22:40 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Sep 30 14:22:40 localhost kernel: Console: colour VGA+ 80x25
Sep 30 14:22:40 localhost kernel: printk: console [ttyS0] enabled
Sep 30 14:22:40 localhost kernel: ACPI: Core revision 20230331
Sep 30 14:22:40 localhost kernel: APIC: Switch to symmetric I/O mode setup
Sep 30 14:22:40 localhost kernel: x2apic enabled
Sep 30 14:22:40 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Sep 30 14:22:40 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Sep 30 14:22:40 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Sep 30 14:22:40 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 30 14:22:40 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 30 14:22:40 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 30 14:22:40 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 30 14:22:40 localhost kernel: Spectre V2 : Mitigation: Retpolines
Sep 30 14:22:40 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 30 14:22:40 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 30 14:22:40 localhost kernel: RETBleed: Mitigation: untrained return thunk
Sep 30 14:22:40 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 30 14:22:40 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 30 14:22:40 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Sep 30 14:22:40 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Sep 30 14:22:40 localhost kernel: x86/bugs: return thunk changed
Sep 30 14:22:40 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Sep 30 14:22:40 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 30 14:22:40 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 30 14:22:40 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 30 14:22:40 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Sep 30 14:22:40 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 30 14:22:40 localhost kernel: Freeing SMP alternatives memory: 40K
Sep 30 14:22:40 localhost kernel: pid_max: default: 32768 minimum: 301
Sep 30 14:22:40 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Sep 30 14:22:40 localhost kernel: landlock: Up and running.
Sep 30 14:22:40 localhost kernel: Yama: becoming mindful.
Sep 30 14:22:40 localhost kernel: SELinux:  Initializing.
Sep 30 14:22:40 localhost kernel: LSM support for eBPF active
Sep 30 14:22:40 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 30 14:22:40 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 30 14:22:40 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 30 14:22:40 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 30 14:22:40 localhost kernel: ... version:                0
Sep 30 14:22:40 localhost kernel: ... bit width:              48
Sep 30 14:22:40 localhost kernel: ... generic registers:      6
Sep 30 14:22:40 localhost kernel: ... value mask:             0000ffffffffffff
Sep 30 14:22:40 localhost kernel: ... max period:             00007fffffffffff
Sep 30 14:22:40 localhost kernel: ... fixed-purpose events:   0
Sep 30 14:22:40 localhost kernel: ... event mask:             000000000000003f
Sep 30 14:22:40 localhost kernel: signal: max sigframe size: 1776
Sep 30 14:22:40 localhost kernel: rcu: Hierarchical SRCU implementation.
Sep 30 14:22:40 localhost kernel: rcu:         Max phase no-delay instances is 400.
Sep 30 14:22:40 localhost kernel: smp: Bringing up secondary CPUs ...
Sep 30 14:22:40 localhost kernel: smpboot: x86: Booting SMP configuration:
Sep 30 14:22:40 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Sep 30 14:22:40 localhost kernel: smp: Brought up 1 node, 8 CPUs
Sep 30 14:22:40 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Sep 30 14:22:40 localhost kernel: node 0 deferred pages initialised in 26ms
Sep 30 14:22:40 localhost kernel: Memory: 7765432K/8388068K available (16384K kernel code, 5784K rwdata, 13988K rodata, 4072K init, 7304K bss, 616480K reserved, 0K cma-reserved)
Sep 30 14:22:40 localhost kernel: devtmpfs: initialized
Sep 30 14:22:40 localhost kernel: x86/mm: Memory block size: 128MB
Sep 30 14:22:40 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 30 14:22:40 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Sep 30 14:22:40 localhost kernel: pinctrl core: initialized pinctrl subsystem
Sep 30 14:22:40 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 30 14:22:40 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Sep 30 14:22:40 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 30 14:22:40 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 30 14:22:40 localhost kernel: audit: initializing netlink subsys (disabled)
Sep 30 14:22:40 localhost kernel: audit: type=2000 audit(1759242158.047:1): state=initialized audit_enabled=0 res=1
Sep 30 14:22:40 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Sep 30 14:22:40 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 30 14:22:40 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 30 14:22:40 localhost kernel: cpuidle: using governor menu
Sep 30 14:22:40 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 30 14:22:40 localhost kernel: PCI: Using configuration type 1 for base access
Sep 30 14:22:40 localhost kernel: PCI: Using configuration type 1 for extended access
Sep 30 14:22:40 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 30 14:22:40 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 30 14:22:40 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 30 14:22:40 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 30 14:22:40 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 30 14:22:40 localhost kernel: Demotion targets for Node 0: null
Sep 30 14:22:40 localhost kernel: cryptd: max_cpu_qlen set to 1000
Sep 30 14:22:40 localhost kernel: ACPI: Added _OSI(Module Device)
Sep 30 14:22:40 localhost kernel: ACPI: Added _OSI(Processor Device)
Sep 30 14:22:40 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Sep 30 14:22:40 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 30 14:22:40 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 30 14:22:40 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Sep 30 14:22:40 localhost kernel: ACPI: Interpreter enabled
Sep 30 14:22:40 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Sep 30 14:22:40 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Sep 30 14:22:40 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 30 14:22:40 localhost kernel: PCI: Using E820 reservations for host bridge windows
Sep 30 14:22:40 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Sep 30 14:22:40 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 30 14:22:40 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [3] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [4] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [5] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [6] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [7] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [8] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [9] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [10] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [11] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [12] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [13] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [14] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [15] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [16] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [17] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [18] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [19] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [20] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [21] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [22] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [23] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [24] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [25] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [26] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [27] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [28] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [29] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [30] registered
Sep 30 14:22:40 localhost kernel: acpiphp: Slot [31] registered
Sep 30 14:22:40 localhost kernel: PCI host bridge to bus 0000:00
Sep 30 14:22:40 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Sep 30 14:22:40 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Sep 30 14:22:40 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 30 14:22:40 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 30 14:22:40 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Sep 30 14:22:40 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 30 14:22:40 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Sep 30 14:22:40 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Sep 30 14:22:40 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Sep 30 14:22:40 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Sep 30 14:22:40 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Sep 30 14:22:40 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Sep 30 14:22:40 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 30 14:22:40 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 30 14:22:40 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Sep 30 14:22:40 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Sep 30 14:22:40 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Sep 30 14:22:40 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Sep 30 14:22:40 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Sep 30 14:22:40 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Sep 30 14:22:40 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Sep 30 14:22:40 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Sep 30 14:22:40 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Sep 30 14:22:40 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Sep 30 14:22:40 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Sep 30 14:22:40 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 30 14:22:40 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Sep 30 14:22:40 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Sep 30 14:22:40 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 30 14:22:40 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 30 14:22:40 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 30 14:22:40 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 30 14:22:40 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Sep 30 14:22:40 localhost kernel: iommu: Default domain type: Translated
Sep 30 14:22:40 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 30 14:22:40 localhost kernel: SCSI subsystem initialized
Sep 30 14:22:40 localhost kernel: ACPI: bus type USB registered
Sep 30 14:22:40 localhost kernel: usbcore: registered new interface driver usbfs
Sep 30 14:22:40 localhost kernel: usbcore: registered new interface driver hub
Sep 30 14:22:40 localhost kernel: usbcore: registered new device driver usb
Sep 30 14:22:40 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Sep 30 14:22:40 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Sep 30 14:22:40 localhost kernel: PTP clock support registered
Sep 30 14:22:40 localhost kernel: EDAC MC: Ver: 3.0.0
Sep 30 14:22:40 localhost kernel: NetLabel: Initializing
Sep 30 14:22:40 localhost kernel: NetLabel:  domain hash size = 128
Sep 30 14:22:40 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Sep 30 14:22:40 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Sep 30 14:22:40 localhost kernel: PCI: Using ACPI for IRQ routing
Sep 30 14:22:40 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 30 14:22:40 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 30 14:22:40 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Sep 30 14:22:40 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Sep 30 14:22:40 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Sep 30 14:22:40 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 30 14:22:40 localhost kernel: vgaarb: loaded
Sep 30 14:22:40 localhost kernel: clocksource: Switched to clocksource kvm-clock
Sep 30 14:22:40 localhost kernel: VFS: Disk quotas dquot_6.6.0
Sep 30 14:22:40 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 30 14:22:40 localhost kernel: pnp: PnP ACPI init
Sep 30 14:22:40 localhost kernel: pnp 00:03: [dma 2]
Sep 30 14:22:40 localhost kernel: pnp: PnP ACPI: found 5 devices
Sep 30 14:22:40 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 30 14:22:40 localhost kernel: NET: Registered PF_INET protocol family
Sep 30 14:22:40 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 30 14:22:40 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Sep 30 14:22:40 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 30 14:22:40 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 30 14:22:40 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Sep 30 14:22:40 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Sep 30 14:22:40 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Sep 30 14:22:40 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 30 14:22:40 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 30 14:22:40 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 30 14:22:40 localhost kernel: NET: Registered PF_XDP protocol family
Sep 30 14:22:40 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Sep 30 14:22:40 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Sep 30 14:22:40 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 30 14:22:40 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Sep 30 14:22:40 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Sep 30 14:22:40 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Sep 30 14:22:40 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Sep 30 14:22:40 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 72547 usecs
Sep 30 14:22:40 localhost kernel: PCI: CLS 0 bytes, default 64
Sep 30 14:22:40 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Sep 30 14:22:40 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Sep 30 14:22:40 localhost kernel: ACPI: bus type thunderbolt registered
Sep 30 14:22:40 localhost kernel: Trying to unpack rootfs image as initramfs...
Sep 30 14:22:40 localhost kernel: Initialise system trusted keyrings
Sep 30 14:22:40 localhost kernel: Key type blacklist registered
Sep 30 14:22:40 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Sep 30 14:22:40 localhost kernel: zbud: loaded
Sep 30 14:22:40 localhost kernel: integrity: Platform Keyring initialized
Sep 30 14:22:40 localhost kernel: integrity: Machine keyring initialized
Sep 30 14:22:40 localhost kernel: Freeing initrd memory: 86080K
Sep 30 14:22:40 localhost kernel: NET: Registered PF_ALG protocol family
Sep 30 14:22:40 localhost kernel: xor: automatically using best checksumming function   avx       
Sep 30 14:22:40 localhost kernel: Key type asymmetric registered
Sep 30 14:22:40 localhost kernel: Asymmetric key parser 'x509' registered
Sep 30 14:22:40 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Sep 30 14:22:40 localhost kernel: io scheduler mq-deadline registered
Sep 30 14:22:40 localhost kernel: io scheduler kyber registered
Sep 30 14:22:40 localhost kernel: io scheduler bfq registered
Sep 30 14:22:40 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Sep 30 14:22:40 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Sep 30 14:22:40 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Sep 30 14:22:40 localhost kernel: ACPI: button: Power Button [PWRF]
Sep 30 14:22:40 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Sep 30 14:22:40 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Sep 30 14:22:40 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Sep 30 14:22:40 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 30 14:22:40 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 30 14:22:40 localhost kernel: Non-volatile memory driver v1.3
Sep 30 14:22:40 localhost kernel: rdac: device handler registered
Sep 30 14:22:40 localhost kernel: hp_sw: device handler registered
Sep 30 14:22:40 localhost kernel: emc: device handler registered
Sep 30 14:22:40 localhost kernel: alua: device handler registered
Sep 30 14:22:40 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Sep 30 14:22:40 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Sep 30 14:22:40 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Sep 30 14:22:40 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Sep 30 14:22:40 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Sep 30 14:22:40 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Sep 30 14:22:40 localhost kernel: usb usb1: Product: UHCI Host Controller
Sep 30 14:22:40 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-617.el9.x86_64 uhci_hcd
Sep 30 14:22:40 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Sep 30 14:22:40 localhost kernel: hub 1-0:1.0: USB hub found
Sep 30 14:22:40 localhost kernel: hub 1-0:1.0: 2 ports detected
Sep 30 14:22:40 localhost kernel: usbcore: registered new interface driver usbserial_generic
Sep 30 14:22:40 localhost kernel: usbserial: USB Serial support registered for generic
Sep 30 14:22:40 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 30 14:22:40 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 30 14:22:40 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 30 14:22:40 localhost kernel: mousedev: PS/2 mouse device common for all mice
Sep 30 14:22:40 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Sep 30 14:22:40 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Sep 30 14:22:40 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Sep 30 14:22:40 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Sep 30 14:22:40 localhost kernel: rtc_cmos 00:04: registered as rtc0
Sep 30 14:22:40 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-09-30T14:22:39 UTC (1759242159)
Sep 30 14:22:40 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Sep 30 14:22:40 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 30 14:22:40 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 30 14:22:40 localhost kernel: usbcore: registered new interface driver usbhid
Sep 30 14:22:40 localhost kernel: usbhid: USB HID core driver
Sep 30 14:22:40 localhost kernel: drop_monitor: Initializing network drop monitor service
Sep 30 14:22:40 localhost kernel: Initializing XFRM netlink socket
Sep 30 14:22:40 localhost kernel: NET: Registered PF_INET6 protocol family
Sep 30 14:22:40 localhost kernel: Segment Routing with IPv6
Sep 30 14:22:40 localhost kernel: NET: Registered PF_PACKET protocol family
Sep 30 14:22:40 localhost kernel: mpls_gso: MPLS GSO support
Sep 30 14:22:40 localhost kernel: IPI shorthand broadcast: enabled
Sep 30 14:22:40 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Sep 30 14:22:40 localhost kernel: AES CTR mode by8 optimization enabled
Sep 30 14:22:40 localhost kernel: sched_clock: Marking stable (1174010339, 165110950)->(1461724299, -122603010)
Sep 30 14:22:40 localhost kernel: registered taskstats version 1
Sep 30 14:22:40 localhost kernel: Loading compiled-in X.509 certificates
Sep 30 14:22:40 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bb2966091bafcba340f8183756023c985dcc8fe9'
Sep 30 14:22:40 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Sep 30 14:22:40 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Sep 30 14:22:40 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Sep 30 14:22:40 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Sep 30 14:22:40 localhost kernel: Demotion targets for Node 0: null
Sep 30 14:22:40 localhost kernel: page_owner is disabled
Sep 30 14:22:40 localhost kernel: Key type .fscrypt registered
Sep 30 14:22:40 localhost kernel: Key type fscrypt-provisioning registered
Sep 30 14:22:40 localhost kernel: Key type big_key registered
Sep 30 14:22:40 localhost kernel: Key type encrypted registered
Sep 30 14:22:40 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 30 14:22:40 localhost kernel: Loading compiled-in module X.509 certificates
Sep 30 14:22:40 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bb2966091bafcba340f8183756023c985dcc8fe9'
Sep 30 14:22:40 localhost kernel: ima: Allocated hash algorithm: sha256
Sep 30 14:22:40 localhost kernel: ima: No architecture policies found
Sep 30 14:22:40 localhost kernel: evm: Initialising EVM extended attributes:
Sep 30 14:22:40 localhost kernel: evm: security.selinux
Sep 30 14:22:40 localhost kernel: evm: security.SMACK64 (disabled)
Sep 30 14:22:40 localhost kernel: evm: security.SMACK64EXEC (disabled)
Sep 30 14:22:40 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Sep 30 14:22:40 localhost kernel: evm: security.SMACK64MMAP (disabled)
Sep 30 14:22:40 localhost kernel: evm: security.apparmor (disabled)
Sep 30 14:22:40 localhost kernel: evm: security.ima
Sep 30 14:22:40 localhost kernel: evm: security.capability
Sep 30 14:22:40 localhost kernel: evm: HMAC attrs: 0x1
Sep 30 14:22:40 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Sep 30 14:22:40 localhost kernel: Running certificate verification RSA selftest
Sep 30 14:22:40 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Sep 30 14:22:40 localhost kernel: Running certificate verification ECDSA selftest
Sep 30 14:22:40 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Sep 30 14:22:40 localhost kernel: clk: Disabling unused clocks
Sep 30 14:22:40 localhost kernel: Freeing unused decrypted memory: 2028K
Sep 30 14:22:40 localhost kernel: Freeing unused kernel image (initmem) memory: 4072K
Sep 30 14:22:40 localhost kernel: Write protecting the kernel read-only data: 30720k
Sep 30 14:22:40 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 348K
Sep 30 14:22:40 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Sep 30 14:22:40 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Sep 30 14:22:40 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Sep 30 14:22:40 localhost kernel: usb 1-1: Manufacturer: QEMU
Sep 30 14:22:40 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Sep 30 14:22:40 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Sep 30 14:22:40 localhost kernel: Run /init as init process
Sep 30 14:22:40 localhost kernel:   with arguments:
Sep 30 14:22:40 localhost kernel:     /init
Sep 30 14:22:40 localhost kernel:   with environment:
Sep 30 14:22:40 localhost kernel:     HOME=/
Sep 30 14:22:40 localhost kernel:     TERM=linux
Sep 30 14:22:40 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64
Sep 30 14:22:40 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Sep 30 14:22:40 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Sep 30 14:22:40 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Sep 30 14:22:40 localhost systemd[1]: Detected virtualization kvm.
Sep 30 14:22:40 localhost systemd[1]: Detected architecture x86-64.
Sep 30 14:22:40 localhost systemd[1]: Running in initrd.
Sep 30 14:22:40 localhost systemd[1]: No hostname configured, using default hostname.
Sep 30 14:22:40 localhost systemd[1]: Hostname set to <localhost>.
Sep 30 14:22:40 localhost systemd[1]: Initializing machine ID from VM UUID.
Sep 30 14:22:40 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Sep 30 14:22:40 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Sep 30 14:22:40 localhost systemd[1]: Reached target Local Encrypted Volumes.
Sep 30 14:22:40 localhost systemd[1]: Reached target Initrd /usr File System.
Sep 30 14:22:40 localhost systemd[1]: Reached target Local File Systems.
Sep 30 14:22:40 localhost systemd[1]: Reached target Path Units.
Sep 30 14:22:40 localhost systemd[1]: Reached target Slice Units.
Sep 30 14:22:40 localhost systemd[1]: Reached target Swaps.
Sep 30 14:22:40 localhost systemd[1]: Reached target Timer Units.
Sep 30 14:22:40 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Sep 30 14:22:40 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Sep 30 14:22:40 localhost systemd[1]: Listening on Journal Socket.
Sep 30 14:22:40 localhost systemd[1]: Listening on udev Control Socket.
Sep 30 14:22:40 localhost systemd[1]: Listening on udev Kernel Socket.
Sep 30 14:22:40 localhost systemd[1]: Reached target Socket Units.
Sep 30 14:22:40 localhost systemd[1]: Starting Create List of Static Device Nodes...
Sep 30 14:22:40 localhost systemd[1]: Starting Journal Service...
Sep 30 14:22:40 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Sep 30 14:22:40 localhost systemd[1]: Starting Apply Kernel Variables...
Sep 30 14:22:40 localhost systemd[1]: Starting Create System Users...
Sep 30 14:22:40 localhost systemd[1]: Starting Setup Virtual Console...
Sep 30 14:22:40 localhost systemd[1]: Finished Create List of Static Device Nodes.
Sep 30 14:22:40 localhost systemd[1]: Finished Apply Kernel Variables.
Sep 30 14:22:40 localhost systemd[1]: Finished Create System Users.
Sep 30 14:22:40 localhost systemd-journald[308]: Journal started
Sep 30 14:22:40 localhost systemd-journald[308]: Runtime Journal (/run/log/journal/12ce99dadb914763aecd1e4b4dea5907) is 8.0M, max 153.5M, 145.5M free.
Sep 30 14:22:40 localhost systemd-sysusers[312]: Creating group 'users' with GID 100.
Sep 30 14:22:40 localhost systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Sep 30 14:22:40 localhost systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Sep 30 14:22:40 localhost systemd[1]: Started Journal Service.
Sep 30 14:22:40 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Sep 30 14:22:40 localhost systemd[1]: Starting Create Volatile Files and Directories...
Sep 30 14:22:40 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Sep 30 14:22:40 localhost systemd[1]: Finished Create Volatile Files and Directories.
Sep 30 14:22:40 localhost systemd[1]: Finished Setup Virtual Console.
Sep 30 14:22:40 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Sep 30 14:22:40 localhost systemd[1]: Starting dracut cmdline hook...
Sep 30 14:22:40 localhost dracut-cmdline[328]: dracut-9 dracut-057-102.git20250818.el9
Sep 30 14:22:40 localhost dracut-cmdline[328]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 14:22:40 localhost systemd[1]: Finished dracut cmdline hook.
Sep 30 14:22:40 localhost systemd[1]: Starting dracut pre-udev hook...
Sep 30 14:22:40 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 30 14:22:40 localhost kernel: device-mapper: uevent: version 1.0.3
Sep 30 14:22:40 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Sep 30 14:22:40 localhost kernel: RPC: Registered named UNIX socket transport module.
Sep 30 14:22:40 localhost kernel: RPC: Registered udp transport module.
Sep 30 14:22:40 localhost kernel: RPC: Registered tcp transport module.
Sep 30 14:22:40 localhost kernel: RPC: Registered tcp-with-tls transport module.
Sep 30 14:22:40 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Sep 30 14:22:40 localhost rpc.statd[444]: Version 2.5.4 starting
Sep 30 14:22:40 localhost rpc.statd[444]: Initializing NSM state
Sep 30 14:22:40 localhost rpc.idmapd[449]: Setting log level to 0
Sep 30 14:22:40 localhost systemd[1]: Finished dracut pre-udev hook.
Sep 30 14:22:40 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Sep 30 14:22:40 localhost systemd-udevd[462]: Using default interface naming scheme 'rhel-9.0'.
Sep 30 14:22:40 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Sep 30 14:22:40 localhost systemd[1]: Starting dracut pre-trigger hook...
Sep 30 14:22:40 localhost systemd[1]: Finished dracut pre-trigger hook.
Sep 30 14:22:40 localhost systemd[1]: Starting Coldplug All udev Devices...
Sep 30 14:22:40 localhost systemd[1]: Created slice Slice /system/modprobe.
Sep 30 14:22:40 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 14:22:40 localhost systemd[1]: Finished Coldplug All udev Devices.
Sep 30 14:22:40 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 14:22:40 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 14:22:40 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Sep 30 14:22:40 localhost systemd[1]: Reached target Network.
Sep 30 14:22:40 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Sep 30 14:22:40 localhost systemd[1]: Starting dracut initqueue hook...
Sep 30 14:22:40 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Sep 30 14:22:40 localhost kernel: libata version 3.00 loaded.
Sep 30 14:22:41 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Sep 30 14:22:41 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Sep 30 14:22:41 localhost systemd-udevd[499]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 14:22:41 localhost kernel: scsi host0: ata_piix
Sep 30 14:22:41 localhost kernel: scsi host1: ata_piix
Sep 30 14:22:41 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Sep 30 14:22:41 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Sep 30 14:22:41 localhost kernel:  vda: vda1
Sep 30 14:22:41 localhost systemd[1]: Mounting Kernel Configuration File System...
Sep 30 14:22:41 localhost kernel: ata1: found unknown device (class 0)
Sep 30 14:22:41 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Sep 30 14:22:41 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Sep 30 14:22:41 localhost systemd[1]: Mounted Kernel Configuration File System.
Sep 30 14:22:41 localhost systemd[1]: Reached target System Initialization.
Sep 30 14:22:41 localhost systemd[1]: Reached target Basic System.
Sep 30 14:22:41 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Sep 30 14:22:41 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Sep 30 14:22:41 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 30 14:22:41 localhost systemd[1]: Found device /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8.
Sep 30 14:22:41 localhost systemd[1]: Reached target Initrd Root Device.
Sep 30 14:22:41 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Sep 30 14:22:41 localhost systemd[1]: Finished dracut initqueue hook.
Sep 30 14:22:41 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Sep 30 14:22:41 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Sep 30 14:22:41 localhost systemd[1]: Reached target Remote File Systems.
Sep 30 14:22:41 localhost systemd[1]: Starting dracut pre-mount hook...
Sep 30 14:22:41 localhost systemd[1]: Finished dracut pre-mount hook.
Sep 30 14:22:41 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8...
Sep 30 14:22:41 localhost systemd-fsck[555]: /usr/sbin/fsck.xfs: XFS file system.
Sep 30 14:22:41 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8.
Sep 30 14:22:41 localhost systemd[1]: Mounting /sysroot...
Sep 30 14:22:41 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Sep 30 14:22:41 localhost kernel: XFS (vda1): Mounting V5 Filesystem d6a81468-b74c-4055-b485-def635ab40f8
Sep 30 14:22:41 localhost kernel: XFS (vda1): Ending clean mount
Sep 30 14:22:41 localhost systemd[1]: Mounted /sysroot.
Sep 30 14:22:41 localhost systemd[1]: Reached target Initrd Root File System.
Sep 30 14:22:41 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Sep 30 14:22:41 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 30 14:22:41 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Sep 30 14:22:42 localhost systemd[1]: Reached target Initrd File Systems.
Sep 30 14:22:42 localhost systemd[1]: Reached target Initrd Default Target.
Sep 30 14:22:42 localhost systemd[1]: Starting dracut mount hook...
Sep 30 14:22:42 localhost systemd[1]: Finished dracut mount hook.
Sep 30 14:22:42 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Sep 30 14:22:42 localhost rpc.idmapd[449]: exiting on signal 15
Sep 30 14:22:42 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Sep 30 14:22:42 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Sep 30 14:22:42 localhost systemd[1]: Stopped target Network.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Timer Units.
Sep 30 14:22:42 localhost systemd[1]: dbus.socket: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Sep 30 14:22:42 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Initrd Default Target.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Basic System.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Initrd Root Device.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Initrd /usr File System.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Path Units.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Remote File Systems.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Slice Units.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Socket Units.
Sep 30 14:22:42 localhost systemd[1]: Stopped target System Initialization.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Local File Systems.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Swaps.
Sep 30 14:22:42 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped dracut mount hook.
Sep 30 14:22:42 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped dracut pre-mount hook.
Sep 30 14:22:42 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Sep 30 14:22:42 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Sep 30 14:22:42 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped dracut initqueue hook.
Sep 30 14:22:42 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped Apply Kernel Variables.
Sep 30 14:22:42 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Sep 30 14:22:42 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped Coldplug All udev Devices.
Sep 30 14:22:42 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped dracut pre-trigger hook.
Sep 30 14:22:42 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Sep 30 14:22:42 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped Setup Virtual Console.
Sep 30 14:22:42 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Sep 30 14:22:42 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Sep 30 14:22:42 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Closed udev Control Socket.
Sep 30 14:22:42 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Closed udev Kernel Socket.
Sep 30 14:22:42 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped dracut pre-udev hook.
Sep 30 14:22:42 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped dracut cmdline hook.
Sep 30 14:22:42 localhost systemd[1]: Starting Cleanup udev Database...
Sep 30 14:22:42 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Sep 30 14:22:42 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Sep 30 14:22:42 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Stopped Create System Users.
Sep 30 14:22:42 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 30 14:22:42 localhost systemd[1]: Finished Cleanup udev Database.
Sep 30 14:22:42 localhost systemd[1]: Reached target Switch Root.
Sep 30 14:22:42 localhost systemd[1]: Starting Switch Root...
Sep 30 14:22:42 localhost systemd[1]: Switching root.
Sep 30 14:22:42 localhost systemd-journald[308]: Journal stopped
Sep 30 14:22:44 localhost systemd-journald[308]: Received SIGTERM from PID 1 (systemd).
Sep 30 14:22:44 localhost kernel: audit: type=1404 audit(1759242162.640:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Sep 30 14:22:44 localhost kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 14:22:44 localhost kernel: SELinux:  policy capability open_perms=1
Sep 30 14:22:44 localhost kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 14:22:44 localhost kernel: SELinux:  policy capability always_check_network=0
Sep 30 14:22:44 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 14:22:44 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 14:22:44 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 14:22:44 localhost kernel: audit: type=1403 audit(1759242162.851:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 30 14:22:44 localhost systemd[1]: Successfully loaded SELinux policy in 215.604ms.
Sep 30 14:22:44 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 29.960ms.
Sep 30 14:22:44 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Sep 30 14:22:44 localhost systemd[1]: Detected virtualization kvm.
Sep 30 14:22:44 localhost systemd[1]: Detected architecture x86-64.
Sep 30 14:22:44 localhost systemd-rc-local-generator[637]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 14:22:44 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 30 14:22:44 localhost systemd[1]: Stopped Switch Root.
Sep 30 14:22:44 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 30 14:22:44 localhost systemd[1]: Created slice Slice /system/getty.
Sep 30 14:22:44 localhost systemd[1]: Created slice Slice /system/serial-getty.
Sep 30 14:22:44 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Sep 30 14:22:44 localhost systemd[1]: Created slice User and Session Slice.
Sep 30 14:22:44 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Sep 30 14:22:44 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Sep 30 14:22:44 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Sep 30 14:22:44 localhost systemd[1]: Reached target Local Encrypted Volumes.
Sep 30 14:22:44 localhost systemd[1]: Stopped target Switch Root.
Sep 30 14:22:44 localhost systemd[1]: Stopped target Initrd File Systems.
Sep 30 14:22:44 localhost systemd[1]: Stopped target Initrd Root File System.
Sep 30 14:22:44 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Sep 30 14:22:44 localhost systemd[1]: Reached target Path Units.
Sep 30 14:22:44 localhost systemd[1]: Reached target rpc_pipefs.target.
Sep 30 14:22:44 localhost systemd[1]: Reached target Slice Units.
Sep 30 14:22:44 localhost systemd[1]: Reached target Swaps.
Sep 30 14:22:44 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Sep 30 14:22:44 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Sep 30 14:22:44 localhost systemd[1]: Reached target RPC Port Mapper.
Sep 30 14:22:44 localhost systemd[1]: Listening on Process Core Dump Socket.
Sep 30 14:22:44 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Sep 30 14:22:44 localhost systemd[1]: Listening on udev Control Socket.
Sep 30 14:22:44 localhost systemd[1]: Listening on udev Kernel Socket.
Sep 30 14:22:44 localhost systemd[1]: Mounting Huge Pages File System...
Sep 30 14:22:44 localhost systemd[1]: Mounting POSIX Message Queue File System...
Sep 30 14:22:44 localhost systemd[1]: Mounting Kernel Debug File System...
Sep 30 14:22:44 localhost systemd[1]: Mounting Kernel Trace File System...
Sep 30 14:22:44 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Sep 30 14:22:44 localhost systemd[1]: Starting Create List of Static Device Nodes...
Sep 30 14:22:44 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 14:22:44 localhost systemd[1]: Starting Load Kernel Module drm...
Sep 30 14:22:44 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Sep 30 14:22:44 localhost systemd[1]: Starting Load Kernel Module fuse...
Sep 30 14:22:44 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Sep 30 14:22:44 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 30 14:22:44 localhost systemd[1]: Stopped File System Check on Root Device.
Sep 30 14:22:44 localhost systemd[1]: Stopped Journal Service.
Sep 30 14:22:44 localhost systemd[1]: Starting Journal Service...
Sep 30 14:22:44 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Sep 30 14:22:44 localhost systemd[1]: Starting Generate network units from Kernel command line...
Sep 30 14:22:44 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 14:22:44 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Sep 30 14:22:44 localhost kernel: fuse: init (API version 7.37)
Sep 30 14:22:44 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 30 14:22:44 localhost systemd[1]: Starting Apply Kernel Variables...
Sep 30 14:22:44 localhost systemd[1]: Starting Coldplug All udev Devices...
Sep 30 14:22:44 localhost systemd-journald[679]: Journal started
Sep 30 14:22:44 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/21983c68f36a73745cc172a394ebc51d) is 8.0M, max 153.5M, 145.5M free.
Sep 30 14:22:43 localhost systemd[1]: Queued start job for default target Multi-User System.
Sep 30 14:22:43 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 30 14:22:44 localhost systemd[1]: Started Journal Service.
Sep 30 14:22:44 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Sep 30 14:22:44 localhost systemd[1]: Mounted Huge Pages File System.
Sep 30 14:22:44 localhost systemd[1]: Mounted POSIX Message Queue File System.
Sep 30 14:22:44 localhost systemd[1]: Mounted Kernel Debug File System.
Sep 30 14:22:44 localhost systemd[1]: Mounted Kernel Trace File System.
Sep 30 14:22:44 localhost systemd[1]: Finished Create List of Static Device Nodes.
Sep 30 14:22:44 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 14:22:44 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 14:22:44 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 30 14:22:44 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Sep 30 14:22:44 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 30 14:22:44 localhost systemd[1]: Finished Load Kernel Module fuse.
Sep 30 14:22:44 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Sep 30 14:22:44 localhost systemd[1]: Finished Generate network units from Kernel command line.
Sep 30 14:22:44 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Sep 30 14:22:44 localhost systemd[1]: Finished Apply Kernel Variables.
Sep 30 14:22:44 localhost kernel: ACPI: bus type drm_connector registered
Sep 30 14:22:44 localhost systemd[1]: Mounting FUSE Control File System...
Sep 30 14:22:44 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Sep 30 14:22:44 localhost systemd[1]: Starting Rebuild Hardware Database...
Sep 30 14:22:44 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Sep 30 14:22:44 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 30 14:22:44 localhost systemd[1]: Starting Load/Save OS Random Seed...
Sep 30 14:22:44 localhost systemd[1]: Starting Create System Users...
Sep 30 14:22:44 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 30 14:22:44 localhost systemd[1]: Finished Load Kernel Module drm.
Sep 30 14:22:44 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/21983c68f36a73745cc172a394ebc51d) is 8.0M, max 153.5M, 145.5M free.
Sep 30 14:22:44 localhost systemd-journald[679]: Received client request to flush runtime journal.
Sep 30 14:22:44 localhost systemd[1]: Mounted FUSE Control File System.
Sep 30 14:22:44 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Sep 30 14:22:44 localhost systemd[1]: Finished Load/Save OS Random Seed.
Sep 30 14:22:44 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Sep 30 14:22:44 localhost systemd[1]: Finished Create System Users.
Sep 30 14:22:44 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Sep 30 14:22:44 localhost systemd[1]: Finished Coldplug All udev Devices.
Sep 30 14:22:44 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Sep 30 14:22:44 localhost systemd[1]: Reached target Preparation for Local File Systems.
Sep 30 14:22:44 localhost systemd[1]: Reached target Local File Systems.
Sep 30 14:22:44 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Sep 30 14:22:44 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Sep 30 14:22:44 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 30 14:22:44 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Sep 30 14:22:44 localhost systemd[1]: Starting Automatic Boot Loader Update...
Sep 30 14:22:44 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Sep 30 14:22:44 localhost systemd[1]: Starting Create Volatile Files and Directories...
Sep 30 14:22:44 localhost bootctl[698]: Couldn't find EFI system partition, skipping.
Sep 30 14:22:44 localhost systemd[1]: Finished Automatic Boot Loader Update.
Sep 30 14:22:44 localhost systemd[1]: Finished Create Volatile Files and Directories.
Sep 30 14:22:44 localhost systemd[1]: Starting Security Auditing Service...
Sep 30 14:22:44 localhost systemd[1]: Starting RPC Bind...
Sep 30 14:22:44 localhost systemd[1]: Starting Rebuild Journal Catalog...
Sep 30 14:22:44 localhost auditd[704]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Sep 30 14:22:44 localhost auditd[704]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Sep 30 14:22:44 localhost systemd[1]: Finished Rebuild Journal Catalog.
Sep 30 14:22:44 localhost systemd[1]: Started RPC Bind.
Sep 30 14:22:44 localhost augenrules[709]: /sbin/augenrules: No change
Sep 30 14:22:44 localhost augenrules[724]: No rules
Sep 30 14:22:44 localhost augenrules[724]: enabled 1
Sep 30 14:22:44 localhost augenrules[724]: failure 1
Sep 30 14:22:44 localhost augenrules[724]: pid 704
Sep 30 14:22:44 localhost augenrules[724]: rate_limit 0
Sep 30 14:22:44 localhost augenrules[724]: backlog_limit 8192
Sep 30 14:22:44 localhost augenrules[724]: lost 0
Sep 30 14:22:44 localhost augenrules[724]: backlog 3
Sep 30 14:22:44 localhost augenrules[724]: backlog_wait_time 60000
Sep 30 14:22:44 localhost augenrules[724]: backlog_wait_time_actual 0
Sep 30 14:22:44 localhost augenrules[724]: enabled 1
Sep 30 14:22:44 localhost augenrules[724]: failure 1
Sep 30 14:22:44 localhost augenrules[724]: pid 704
Sep 30 14:22:44 localhost augenrules[724]: rate_limit 0
Sep 30 14:22:44 localhost augenrules[724]: backlog_limit 8192
Sep 30 14:22:44 localhost augenrules[724]: lost 0
Sep 30 14:22:44 localhost augenrules[724]: backlog 4
Sep 30 14:22:44 localhost augenrules[724]: backlog_wait_time 60000
Sep 30 14:22:44 localhost augenrules[724]: backlog_wait_time_actual 0
Sep 30 14:22:44 localhost augenrules[724]: enabled 1
Sep 30 14:22:44 localhost augenrules[724]: failure 1
Sep 30 14:22:44 localhost augenrules[724]: pid 704
Sep 30 14:22:44 localhost augenrules[724]: rate_limit 0
Sep 30 14:22:44 localhost augenrules[724]: backlog_limit 8192
Sep 30 14:22:44 localhost augenrules[724]: lost 0
Sep 30 14:22:44 localhost augenrules[724]: backlog 4
Sep 30 14:22:44 localhost augenrules[724]: backlog_wait_time 60000
Sep 30 14:22:44 localhost augenrules[724]: backlog_wait_time_actual 0
Sep 30 14:22:44 localhost systemd[1]: Started Security Auditing Service.
Sep 30 14:22:44 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Sep 30 14:22:44 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Sep 30 14:22:44 localhost systemd[1]: Finished Rebuild Hardware Database.
Sep 30 14:22:44 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Sep 30 14:22:44 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Sep 30 14:22:44 localhost systemd[1]: Starting Update is Completed...
Sep 30 14:22:44 localhost systemd-udevd[732]: Using default interface naming scheme 'rhel-9.0'.
Sep 30 14:22:44 localhost systemd[1]: Finished Update is Completed.
Sep 30 14:22:44 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Sep 30 14:22:44 localhost systemd[1]: Reached target System Initialization.
Sep 30 14:22:44 localhost systemd[1]: Started dnf makecache --timer.
Sep 30 14:22:44 localhost systemd[1]: Started Daily rotation of log files.
Sep 30 14:22:44 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Sep 30 14:22:44 localhost systemd[1]: Reached target Timer Units.
Sep 30 14:22:44 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Sep 30 14:22:44 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Sep 30 14:22:44 localhost systemd[1]: Reached target Socket Units.
Sep 30 14:22:44 localhost systemd[1]: Starting D-Bus System Message Bus...
Sep 30 14:22:44 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 14:22:44 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Sep 30 14:22:44 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 14:22:44 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 14:22:44 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 14:22:44 localhost systemd-udevd[736]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 14:22:44 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Sep 30 14:22:44 localhost systemd[1]: Started D-Bus System Message Bus.
Sep 30 14:22:44 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Sep 30 14:22:44 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Sep 30 14:22:44 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 30 14:22:44 localhost systemd[1]: Reached target Basic System.
Sep 30 14:22:44 localhost dbus-broker-lau[766]: Ready
Sep 30 14:22:44 localhost systemd[1]: Starting NTP client/server...
Sep 30 14:22:44 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Sep 30 14:22:44 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Sep 30 14:22:44 localhost systemd[1]: Starting IPv4 firewall with iptables...
Sep 30 14:22:44 localhost systemd[1]: Started irqbalance daemon.
Sep 30 14:22:44 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Sep 30 14:22:44 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 14:22:44 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 14:22:44 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 14:22:44 localhost systemd[1]: Reached target sshd-keygen.target.
Sep 30 14:22:44 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Sep 30 14:22:44 localhost systemd[1]: Reached target User and Group Name Lookups.
Sep 30 14:22:45 localhost systemd[1]: Starting User Login Management...
Sep 30 14:22:45 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Sep 30 14:22:45 localhost chronyd[803]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Sep 30 14:22:45 localhost chronyd[803]: Loaded 0 symmetric keys
Sep 30 14:22:45 localhost systemd-logind[789]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 30 14:22:45 localhost systemd-logind[789]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Sep 30 14:22:45 localhost systemd-logind[789]: New seat seat0.
Sep 30 14:22:45 localhost systemd[1]: Started User Login Management.
Sep 30 14:22:45 localhost chronyd[803]: Using right/UTC timezone to obtain leap second data
Sep 30 14:22:45 localhost chronyd[803]: Loaded seccomp filter (level 2)
Sep 30 14:22:45 localhost systemd[1]: Started NTP client/server.
Sep 30 14:22:45 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Sep 30 14:22:45 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Sep 30 14:22:45 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Sep 30 14:22:45 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Sep 30 14:22:45 localhost kernel: Console: switching to colour dummy device 80x25
Sep 30 14:22:45 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Sep 30 14:22:45 localhost kernel: [drm] features: -context_init
Sep 30 14:22:45 localhost kernel: [drm] number of scanouts: 1
Sep 30 14:22:45 localhost kernel: [drm] number of cap sets: 0
Sep 30 14:22:45 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Sep 30 14:22:45 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Sep 30 14:22:45 localhost kernel: Console: switching to colour frame buffer device 128x48
Sep 30 14:22:45 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Sep 30 14:22:45 localhost kernel: kvm_amd: TSC scaling supported
Sep 30 14:22:45 localhost kernel: kvm_amd: Nested Virtualization enabled
Sep 30 14:22:45 localhost kernel: kvm_amd: Nested Paging enabled
Sep 30 14:22:45 localhost kernel: kvm_amd: LBR virtualization supported
Sep 30 14:22:45 localhost iptables.init[781]: iptables: Applying firewall rules: [  OK  ]
Sep 30 14:22:45 localhost systemd[1]: Finished IPv4 firewall with iptables.
Sep 30 14:22:46 localhost cloud-init[841]: Cloud-init v. 24.4-7.el9 running 'init-local' at Tue, 30 Sep 2025 14:22:45 +0000. Up 7.62 seconds.
Sep 30 14:22:46 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Sep 30 14:22:46 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Sep 30 14:22:46 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp5frsgdma.mount: Deactivated successfully.
Sep 30 14:22:46 localhost systemd[1]: Starting Hostname Service...
Sep 30 14:22:46 localhost systemd[1]: Started Hostname Service.
Sep 30 14:22:46 np0005463148.novalocal systemd-hostnamed[855]: Hostname set to <np0005463148.novalocal> (static)
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Reached target Preparation for Network.
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Starting Network Manager...
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.5697] NetworkManager (version 1.54.1-1.el9) is starting... (boot:cf2a7137-0e0f-4f1a-866e-63b8011fce6c)
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.5701] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.5834] manager[0x564577536080]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.5873] hostname: hostname: using hostnamed
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.5874] hostname: static hostname changed from (none) to "np0005463148.novalocal"
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.5879] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6026] manager[0x564577536080]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6027] manager[0x564577536080]: rfkill: WWAN hardware radio set enabled
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6129] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6130] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6130] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6131] manager: Networking is enabled by state file
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6133] settings: Loaded settings plugin: keyfile (internal)
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6165] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6189] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6214] dhcp: init: Using DHCP client 'internal'
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6216] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6229] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6239] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6245] device (lo): Activation: starting connection 'lo' (9129f00f-203c-42c0-b87c-17b7d284cfa5)
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6253] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6255] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6284] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6288] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6290] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6293] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6295] device (eth0): carrier: link connected
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6297] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6302] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6309] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6312] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6312] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6314] manager: NetworkManager state is now CONNECTING
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6315] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6320] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6323] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Started Network Manager.
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Reached target Network.
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Starting Network Manager Wait Online...
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6550] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6552] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.6559] device (lo): Activation: successful, device activated.
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Reached target NFS client services.
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Reached target Remote File Systems.
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.9664] dhcp4 (eth0): state changed new lease, address=38.102.83.102
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.9676] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.9698] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.9731] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.9733] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.9735] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.9738] device (eth0): Activation: successful, device activated.
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.9744] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 14:22:46 np0005463148.novalocal NetworkManager[859]: <info>  [1759242166.9748] manager: startup complete
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Finished Network Manager Wait Online.
Sep 30 14:22:46 np0005463148.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Tue, 30 Sep 2025 14:22:47 +0000. Up 8.92 seconds.
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: |  eth0  | True |        38.102.83.102         | 255.255.255.0 | global | fa:16:3e:ac:cc:c9 |
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:feac:ccc9/64 |       .       |  link  | fa:16:3e:ac:cc:c9 |
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Sep 30 14:22:47 np0005463148.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 14:22:48 np0005463148.novalocal useradd[989]: new group: name=cloud-user, GID=1001
Sep 30 14:22:48 np0005463148.novalocal useradd[989]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Sep 30 14:22:48 np0005463148.novalocal useradd[989]: add 'cloud-user' to group 'adm'
Sep 30 14:22:48 np0005463148.novalocal useradd[989]: add 'cloud-user' to group 'systemd-journal'
Sep 30 14:22:48 np0005463148.novalocal useradd[989]: add 'cloud-user' to shadow group 'adm'
Sep 30 14:22:48 np0005463148.novalocal useradd[989]: add 'cloud-user' to shadow group 'systemd-journal'
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: Generating public/private rsa key pair.
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: The key fingerprint is:
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: SHA256:nJJP9hX3xVRKYlXqoQ5YuLaStrgy/gRqp3ztVehf5dM root@np0005463148.novalocal
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: The key's randomart image is:
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: +---[RSA 3072]----+
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |            o.o.=|
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |         . . o = |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |        . . . = o|
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |       o.=   = o.|
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |  .   o.S.. o.. .|
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: | . .  .B.o +o .  |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |.. .o +oo ...o E |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |o =o +.o. .   .  |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: | +o+=o.  .       |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: +----[SHA256]-----+
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: Generating public/private ecdsa key pair.
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: The key fingerprint is:
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: SHA256:yyBe5l7aB8UyOOIFNseSrc2qucOxok1c0i1zeKHrbY0 root@np0005463148.novalocal
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: The key's randomart image is:
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: +---[ECDSA 256]---+
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |     +           |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |    * +          |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |   . O.. .       |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |   .o+*.o o      |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |  ..O+B.S+       |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: | ..+o@ o..       |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: | .o=o .o+.       |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |.o*. oE+. .      |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |o.oo..+ ..       |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: +----[SHA256]-----+
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: Generating public/private ed25519 key pair.
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: The key fingerprint is:
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: SHA256:gl0VKYMEisXOu4aVsgQz15V+VKvhEo4KKuaN+eGTU08 root@np0005463148.novalocal
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: The key's randomart image is:
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: +--[ED25519 256]--+
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: | .. .o.o .+o     |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: | o..  + +...     |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |.o.. o..oo.      |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |+ + .=.+.o       |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |o+ oo =.S        |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |oo+. . E         |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |+*.oo o          |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: |* B+.  .         |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: | =.+o            |
Sep 30 14:22:48 np0005463148.novalocal cloud-init[922]: +----[SHA256]-----+
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Reached target Cloud-config availability.
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Reached target Network is Online.
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Starting System Logging Service...
Sep 30 14:22:48 np0005463148.novalocal sm-notify[1005]: Version 2.5.4 starting
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Starting OpenSSH server daemon...
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Starting Permit User Sessions...
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Started Notify NFS peers of a restart.
Sep 30 14:22:48 np0005463148.novalocal sshd[1007]: Server listening on 0.0.0.0 port 22.
Sep 30 14:22:48 np0005463148.novalocal sshd[1007]: Server listening on :: port 22.
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Started OpenSSH server daemon.
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Finished Permit User Sessions.
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Started Command Scheduler.
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Started Getty on tty1.
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Started Serial Getty on ttyS0.
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Reached target Login Prompts.
Sep 30 14:22:48 np0005463148.novalocal crond[1009]: (CRON) STARTUP (1.5.7)
Sep 30 14:22:48 np0005463148.novalocal crond[1009]: (CRON) INFO (Syslog will be used instead of sendmail.)
Sep 30 14:22:48 np0005463148.novalocal crond[1009]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 81% if used.)
Sep 30 14:22:48 np0005463148.novalocal crond[1009]: (CRON) INFO (running with inotify support)
Sep 30 14:22:48 np0005463148.novalocal rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Sep 30 14:22:48 np0005463148.novalocal rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Started System Logging Service.
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Reached target Multi-User System.
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Sep 30 14:22:48 np0005463148.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Sep 30 14:22:49 np0005463148.novalocal rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 14:22:49 np0005463148.novalocal cloud-init[1018]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Tue, 30 Sep 2025 14:22:49 +0000. Up 10.80 seconds.
Sep 30 14:22:49 np0005463148.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Sep 30 14:22:49 np0005463148.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Sep 30 14:22:49 np0005463148.novalocal cloud-init[1022]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Tue, 30 Sep 2025 14:22:49 +0000. Up 11.23 seconds.
Sep 30 14:22:49 np0005463148.novalocal cloud-init[1024]: #############################################################
Sep 30 14:22:49 np0005463148.novalocal cloud-init[1025]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Sep 30 14:22:49 np0005463148.novalocal cloud-init[1027]: 256 SHA256:yyBe5l7aB8UyOOIFNseSrc2qucOxok1c0i1zeKHrbY0 root@np0005463148.novalocal (ECDSA)
Sep 30 14:22:49 np0005463148.novalocal cloud-init[1029]: 256 SHA256:gl0VKYMEisXOu4aVsgQz15V+VKvhEo4KKuaN+eGTU08 root@np0005463148.novalocal (ED25519)
Sep 30 14:22:49 np0005463148.novalocal cloud-init[1031]: 3072 SHA256:nJJP9hX3xVRKYlXqoQ5YuLaStrgy/gRqp3ztVehf5dM root@np0005463148.novalocal (RSA)
Sep 30 14:22:49 np0005463148.novalocal cloud-init[1032]: -----END SSH HOST KEY FINGERPRINTS-----
Sep 30 14:22:49 np0005463148.novalocal cloud-init[1033]: #############################################################
Sep 30 14:22:49 np0005463148.novalocal cloud-init[1022]: Cloud-init v. 24.4-7.el9 finished at Tue, 30 Sep 2025 14:22:49 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.40 seconds
Sep 30 14:22:49 np0005463148.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Sep 30 14:22:49 np0005463148.novalocal systemd[1]: Reached target Cloud-init target.
Sep 30 14:22:49 np0005463148.novalocal systemd[1]: Startup finished in 1.530s (kernel) + 2.713s (initrd) + 7.218s (userspace) = 11.462s.
Sep 30 14:22:51 np0005463148.novalocal sshd-session[1039]: Unable to negotiate with 38.102.83.114 port 60108: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Sep 30 14:22:51 np0005463148.novalocal sshd-session[1043]: Unable to negotiate with 38.102.83.114 port 60116: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Sep 30 14:22:51 np0005463148.novalocal sshd-session[1045]: Unable to negotiate with 38.102.83.114 port 60118: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Sep 30 14:22:51 np0005463148.novalocal sshd-session[1047]: Connection reset by 38.102.83.114 port 60130 [preauth]
Sep 30 14:22:51 np0005463148.novalocal sshd-session[1037]: Connection closed by 38.102.83.114 port 60098 [preauth]
Sep 30 14:22:51 np0005463148.novalocal sshd-session[1049]: Connection reset by 38.102.83.114 port 60144 [preauth]
Sep 30 14:22:51 np0005463148.novalocal sshd-session[1051]: Unable to negotiate with 38.102.83.114 port 60156: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Sep 30 14:22:51 np0005463148.novalocal sshd-session[1041]: Connection closed by 38.102.83.114 port 60112 [preauth]
Sep 30 14:22:51 np0005463148.novalocal sshd-session[1053]: Unable to negotiate with 38.102.83.114 port 60158: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Sep 30 14:22:51 np0005463148.novalocal chronyd[803]: Selected source 142.4.192.253 (2.centos.pool.ntp.org)
Sep 30 14:22:51 np0005463148.novalocal chronyd[803]: System clock TAI offset set to 37 seconds
Sep 30 14:22:51 np0005463148.novalocal chronyd[803]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Sep 30 14:22:55 np0005463148.novalocal irqbalance[784]: Cannot change IRQ 25 affinity: Operation not permitted
Sep 30 14:22:55 np0005463148.novalocal irqbalance[784]: IRQ 25 affinity is now unmanaged
Sep 30 14:22:55 np0005463148.novalocal irqbalance[784]: Cannot change IRQ 31 affinity: Operation not permitted
Sep 30 14:22:55 np0005463148.novalocal irqbalance[784]: IRQ 31 affinity is now unmanaged
Sep 30 14:22:55 np0005463148.novalocal irqbalance[784]: Cannot change IRQ 28 affinity: Operation not permitted
Sep 30 14:22:55 np0005463148.novalocal irqbalance[784]: IRQ 28 affinity is now unmanaged
Sep 30 14:22:55 np0005463148.novalocal irqbalance[784]: Cannot change IRQ 32 affinity: Operation not permitted
Sep 30 14:22:55 np0005463148.novalocal irqbalance[784]: IRQ 32 affinity is now unmanaged
Sep 30 14:22:55 np0005463148.novalocal irqbalance[784]: Cannot change IRQ 30 affinity: Operation not permitted
Sep 30 14:22:55 np0005463148.novalocal irqbalance[784]: IRQ 30 affinity is now unmanaged
Sep 30 14:22:55 np0005463148.novalocal irqbalance[784]: Cannot change IRQ 29 affinity: Operation not permitted
Sep 30 14:22:55 np0005463148.novalocal irqbalance[784]: IRQ 29 affinity is now unmanaged
Sep 30 14:22:57 np0005463148.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 14:23:16 np0005463148.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 14:25:54 np0005463148.novalocal sshd-session[1057]: Invalid user  from 64.62.156.223 port 21995
Sep 30 14:25:57 np0005463148.novalocal sshd-session[1057]: Connection closed by invalid user  64.62.156.223 port 21995 [preauth]
Sep 30 14:28:04 np0005463148.novalocal sshd-session[1059]: Invalid user kevin from 185.156.73.233 port 41014
Sep 30 14:28:04 np0005463148.novalocal sshd-session[1059]: Connection closed by invalid user kevin 185.156.73.233 port 41014 [preauth]
Sep 30 14:37:16 np0005463148.novalocal sshd-session[1065]: Invalid user guest from 80.94.95.115 port 59442
Sep 30 14:37:17 np0005463148.novalocal sshd-session[1065]: Connection closed by invalid user guest 80.94.95.115 port 59442 [preauth]
Sep 30 14:38:00 np0005463148.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Sep 30 14:38:00 np0005463148.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Sep 30 14:38:00 np0005463148.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Sep 30 14:38:00 np0005463148.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Sep 30 14:38:05 np0005463148.novalocal sshd-session[1069]: Received disconnect from 103.26.136.173 port 48496:11: Bye Bye [preauth]
Sep 30 14:38:05 np0005463148.novalocal sshd-session[1069]: Disconnected from authenticating user root 103.26.136.173 port 48496 [preauth]
Sep 30 14:40:47 np0005463148.novalocal sshd[1007]: Timeout before authentication for connection from 49.65.102.127 to 38.102.83.102, pid = 1071
Sep 30 14:41:12 np0005463148.novalocal sshd-session[1073]: Invalid user victor from 45.61.187.220 port 40142
Sep 30 14:41:12 np0005463148.novalocal sshd-session[1073]: Received disconnect from 45.61.187.220 port 40142:11: Bye Bye [preauth]
Sep 30 14:41:12 np0005463148.novalocal sshd-session[1073]: Disconnected from invalid user victor 45.61.187.220 port 40142 [preauth]
Sep 30 14:42:13 np0005463148.novalocal sshd-session[1075]: error: maximum authentication attempts exceeded for root from 176.235.182.73 port 57399 ssh2 [preauth]
Sep 30 14:42:13 np0005463148.novalocal sshd-session[1075]: Disconnecting authenticating user root 176.235.182.73 port 57399: Too many authentication failures [preauth]
Sep 30 14:42:15 np0005463148.novalocal sshd-session[1077]: error: maximum authentication attempts exceeded for root from 176.235.182.73 port 57933 ssh2 [preauth]
Sep 30 14:42:15 np0005463148.novalocal sshd-session[1077]: Disconnecting authenticating user root 176.235.182.73 port 57933: Too many authentication failures [preauth]
Sep 30 14:42:17 np0005463148.novalocal sshd-session[1079]: error: maximum authentication attempts exceeded for root from 176.235.182.73 port 58442 ssh2 [preauth]
Sep 30 14:42:17 np0005463148.novalocal sshd-session[1079]: Disconnecting authenticating user root 176.235.182.73 port 58442: Too many authentication failures [preauth]
Sep 30 14:42:19 np0005463148.novalocal sshd-session[1081]: Received disconnect from 176.235.182.73 port 58895:11: disconnected by user [preauth]
Sep 30 14:42:19 np0005463148.novalocal sshd-session[1081]: Disconnected from authenticating user root 176.235.182.73 port 58895 [preauth]
Sep 30 14:42:20 np0005463148.novalocal sshd-session[1083]: Invalid user admin from 176.235.182.73 port 59224
Sep 30 14:42:20 np0005463148.novalocal sshd-session[1083]: error: maximum authentication attempts exceeded for invalid user admin from 176.235.182.73 port 59224 ssh2 [preauth]
Sep 30 14:42:20 np0005463148.novalocal sshd-session[1083]: Disconnecting invalid user admin 176.235.182.73 port 59224: Too many authentication failures [preauth]
Sep 30 14:42:21 np0005463148.novalocal sshd-session[1085]: Invalid user jak from 103.26.136.173 port 47706
Sep 30 14:42:21 np0005463148.novalocal sshd-session[1085]: Received disconnect from 103.26.136.173 port 47706:11: Bye Bye [preauth]
Sep 30 14:42:21 np0005463148.novalocal sshd-session[1085]: Disconnected from invalid user jak 103.26.136.173 port 47706 [preauth]
Sep 30 14:42:21 np0005463148.novalocal sshd-session[1087]: Invalid user admin from 176.235.182.73 port 59647
Sep 30 14:42:22 np0005463148.novalocal sshd-session[1087]: error: maximum authentication attempts exceeded for invalid user admin from 176.235.182.73 port 59647 ssh2 [preauth]
Sep 30 14:42:22 np0005463148.novalocal sshd-session[1087]: Disconnecting invalid user admin 176.235.182.73 port 59647: Too many authentication failures [preauth]
Sep 30 14:42:23 np0005463148.novalocal sshd-session[1089]: Invalid user admin from 176.235.182.73 port 60155
Sep 30 14:42:24 np0005463148.novalocal sshd-session[1089]: Received disconnect from 176.235.182.73 port 60155:11: disconnected by user [preauth]
Sep 30 14:42:24 np0005463148.novalocal sshd-session[1089]: Disconnected from invalid user admin 176.235.182.73 port 60155 [preauth]
Sep 30 14:42:26 np0005463148.novalocal sshd-session[1091]: Invalid user oracle from 176.235.182.73 port 60639
Sep 30 14:42:26 np0005463148.novalocal sshd-session[1091]: error: maximum authentication attempts exceeded for invalid user oracle from 176.235.182.73 port 60639 ssh2 [preauth]
Sep 30 14:42:26 np0005463148.novalocal sshd-session[1091]: Disconnecting invalid user oracle 176.235.182.73 port 60639: Too many authentication failures [preauth]
Sep 30 14:42:28 np0005463148.novalocal sshd-session[1093]: Invalid user oracle from 176.235.182.73 port 32996
Sep 30 14:42:28 np0005463148.novalocal sshd-session[1093]: error: maximum authentication attempts exceeded for invalid user oracle from 176.235.182.73 port 32996 ssh2 [preauth]
Sep 30 14:42:28 np0005463148.novalocal sshd-session[1093]: Disconnecting invalid user oracle 176.235.182.73 port 32996: Too many authentication failures [preauth]
Sep 30 14:42:30 np0005463148.novalocal sshd-session[1095]: Invalid user oracle from 176.235.182.73 port 33485
Sep 30 14:42:30 np0005463148.novalocal sshd-session[1095]: Received disconnect from 176.235.182.73 port 33485:11: disconnected by user [preauth]
Sep 30 14:42:30 np0005463148.novalocal sshd-session[1095]: Disconnected from invalid user oracle 176.235.182.73 port 33485 [preauth]
Sep 30 14:42:31 np0005463148.novalocal sshd-session[1097]: Invalid user usuario from 176.235.182.73 port 33869
Sep 30 14:42:32 np0005463148.novalocal sshd-session[1097]: error: maximum authentication attempts exceeded for invalid user usuario from 176.235.182.73 port 33869 ssh2 [preauth]
Sep 30 14:42:32 np0005463148.novalocal sshd-session[1097]: Disconnecting invalid user usuario 176.235.182.73 port 33869: Too many authentication failures [preauth]
Sep 30 14:42:33 np0005463148.novalocal sshd-session[1099]: Invalid user usuario from 176.235.182.73 port 34364
Sep 30 14:42:34 np0005463148.novalocal sshd-session[1099]: error: maximum authentication attempts exceeded for invalid user usuario from 176.235.182.73 port 34364 ssh2 [preauth]
Sep 30 14:42:34 np0005463148.novalocal sshd-session[1099]: Disconnecting invalid user usuario 176.235.182.73 port 34364: Too many authentication failures [preauth]
Sep 30 14:42:35 np0005463148.novalocal sshd-session[1101]: Invalid user usuario from 176.235.182.73 port 34914
Sep 30 14:42:35 np0005463148.novalocal sshd-session[1101]: Received disconnect from 176.235.182.73 port 34914:11: disconnected by user [preauth]
Sep 30 14:42:35 np0005463148.novalocal sshd-session[1101]: Disconnected from invalid user usuario 176.235.182.73 port 34914 [preauth]
Sep 30 14:42:36 np0005463148.novalocal sshd-session[1103]: Invalid user test from 176.235.182.73 port 35213
Sep 30 14:42:37 np0005463148.novalocal sshd-session[1103]: error: maximum authentication attempts exceeded for invalid user test from 176.235.182.73 port 35213 ssh2 [preauth]
Sep 30 14:42:37 np0005463148.novalocal sshd-session[1103]: Disconnecting invalid user test 176.235.182.73 port 35213: Too many authentication failures [preauth]
Sep 30 14:42:39 np0005463148.novalocal sshd-session[1105]: Invalid user test from 176.235.182.73 port 35774
Sep 30 14:42:39 np0005463148.novalocal sshd-session[1105]: error: maximum authentication attempts exceeded for invalid user test from 176.235.182.73 port 35774 ssh2 [preauth]
Sep 30 14:42:39 np0005463148.novalocal sshd-session[1105]: Disconnecting invalid user test 176.235.182.73 port 35774: Too many authentication failures [preauth]
Sep 30 14:42:41 np0005463148.novalocal sshd-session[1107]: Invalid user test from 176.235.182.73 port 36335
Sep 30 14:42:41 np0005463148.novalocal sshd-session[1107]: Received disconnect from 176.235.182.73 port 36335:11: disconnected by user [preauth]
Sep 30 14:42:41 np0005463148.novalocal sshd-session[1107]: Disconnected from invalid user test 176.235.182.73 port 36335 [preauth]
Sep 30 14:42:42 np0005463148.novalocal sshd-session[1109]: Invalid user user from 176.235.182.73 port 36691
Sep 30 14:42:43 np0005463148.novalocal sshd-session[1109]: error: maximum authentication attempts exceeded for invalid user user from 176.235.182.73 port 36691 ssh2 [preauth]
Sep 30 14:42:43 np0005463148.novalocal sshd-session[1109]: Disconnecting invalid user user 176.235.182.73 port 36691: Too many authentication failures [preauth]
Sep 30 14:42:44 np0005463148.novalocal sshd-session[1111]: Invalid user user from 176.235.182.73 port 37312
Sep 30 14:42:45 np0005463148.novalocal sshd-session[1111]: error: maximum authentication attempts exceeded for invalid user user from 176.235.182.73 port 37312 ssh2 [preauth]
Sep 30 14:42:45 np0005463148.novalocal sshd-session[1111]: Disconnecting invalid user user 176.235.182.73 port 37312: Too many authentication failures [preauth]
Sep 30 14:42:46 np0005463148.novalocal sshd-session[1113]: Invalid user user from 176.235.182.73 port 37815
Sep 30 14:42:47 np0005463148.novalocal sshd-session[1113]: Received disconnect from 176.235.182.73 port 37815:11: disconnected by user [preauth]
Sep 30 14:42:47 np0005463148.novalocal sshd-session[1113]: Disconnected from invalid user user 176.235.182.73 port 37815 [preauth]
Sep 30 14:42:49 np0005463148.novalocal sshd-session[1115]: Invalid user ftpuser from 176.235.182.73 port 38220
Sep 30 14:42:50 np0005463148.novalocal sshd-session[1115]: error: maximum authentication attempts exceeded for invalid user ftpuser from 176.235.182.73 port 38220 ssh2 [preauth]
Sep 30 14:42:50 np0005463148.novalocal sshd-session[1115]: Disconnecting invalid user ftpuser 176.235.182.73 port 38220: Too many authentication failures [preauth]
Sep 30 14:42:51 np0005463148.novalocal sshd-session[1117]: Invalid user ftpuser from 176.235.182.73 port 39085
Sep 30 14:42:52 np0005463148.novalocal sshd-session[1117]: error: maximum authentication attempts exceeded for invalid user ftpuser from 176.235.182.73 port 39085 ssh2 [preauth]
Sep 30 14:42:52 np0005463148.novalocal sshd-session[1117]: Disconnecting invalid user ftpuser 176.235.182.73 port 39085: Too many authentication failures [preauth]
Sep 30 14:42:53 np0005463148.novalocal sshd-session[1119]: Invalid user ftpuser from 176.235.182.73 port 39599
Sep 30 14:42:54 np0005463148.novalocal sshd-session[1119]: Received disconnect from 176.235.182.73 port 39599:11: disconnected by user [preauth]
Sep 30 14:42:54 np0005463148.novalocal sshd-session[1119]: Disconnected from invalid user ftpuser 176.235.182.73 port 39599 [preauth]
Sep 30 14:42:55 np0005463148.novalocal sshd-session[1121]: Invalid user test1 from 176.235.182.73 port 40068
Sep 30 14:42:56 np0005463148.novalocal sshd-session[1121]: error: maximum authentication attempts exceeded for invalid user test1 from 176.235.182.73 port 40068 ssh2 [preauth]
Sep 30 14:42:56 np0005463148.novalocal sshd-session[1121]: Disconnecting invalid user test1 176.235.182.73 port 40068: Too many authentication failures [preauth]
Sep 30 14:42:57 np0005463148.novalocal sshd-session[1123]: Invalid user test1 from 176.235.182.73 port 40542
Sep 30 14:42:58 np0005463148.novalocal sshd-session[1123]: error: maximum authentication attempts exceeded for invalid user test1 from 176.235.182.73 port 40542 ssh2 [preauth]
Sep 30 14:42:58 np0005463148.novalocal sshd-session[1123]: Disconnecting invalid user test1 176.235.182.73 port 40542: Too many authentication failures [preauth]
Sep 30 14:42:59 np0005463148.novalocal sshd-session[1125]: Invalid user test1 from 176.235.182.73 port 41058
Sep 30 14:42:59 np0005463148.novalocal sshd-session[1125]: Received disconnect from 176.235.182.73 port 41058:11: disconnected by user [preauth]
Sep 30 14:42:59 np0005463148.novalocal sshd-session[1125]: Disconnected from invalid user test1 176.235.182.73 port 41058 [preauth]
Sep 30 14:43:00 np0005463148.novalocal sshd-session[1127]: Invalid user test2 from 176.235.182.73 port 41334
Sep 30 14:43:01 np0005463148.novalocal sshd-session[1127]: error: maximum authentication attempts exceeded for invalid user test2 from 176.235.182.73 port 41334 ssh2 [preauth]
Sep 30 14:43:01 np0005463148.novalocal sshd-session[1127]: Disconnecting invalid user test2 176.235.182.73 port 41334: Too many authentication failures [preauth]
Sep 30 14:43:02 np0005463148.novalocal sshd-session[1129]: Invalid user test2 from 176.235.182.73 port 41930
Sep 30 14:43:03 np0005463148.novalocal sshd-session[1129]: error: maximum authentication attempts exceeded for invalid user test2 from 176.235.182.73 port 41930 ssh2 [preauth]
Sep 30 14:43:03 np0005463148.novalocal sshd-session[1129]: Disconnecting invalid user test2 176.235.182.73 port 41930: Too many authentication failures [preauth]
Sep 30 14:43:04 np0005463148.novalocal sshd-session[1131]: Invalid user test2 from 176.235.182.73 port 42404
Sep 30 14:43:04 np0005463148.novalocal sshd-session[1131]: Received disconnect from 176.235.182.73 port 42404:11: disconnected by user [preauth]
Sep 30 14:43:04 np0005463148.novalocal sshd-session[1131]: Disconnected from invalid user test2 176.235.182.73 port 42404 [preauth]
Sep 30 14:43:06 np0005463148.novalocal sshd-session[1133]: Invalid user ubuntu from 176.235.182.73 port 42819
Sep 30 14:43:07 np0005463148.novalocal sshd-session[1133]: error: maximum authentication attempts exceeded for invalid user ubuntu from 176.235.182.73 port 42819 ssh2 [preauth]
Sep 30 14:43:07 np0005463148.novalocal sshd-session[1133]: Disconnecting invalid user ubuntu 176.235.182.73 port 42819: Too many authentication failures [preauth]
Sep 30 14:43:08 np0005463148.novalocal sshd-session[1135]: Invalid user ubuntu from 176.235.182.73 port 43387
Sep 30 14:43:09 np0005463148.novalocal sshd-session[1135]: error: maximum authentication attempts exceeded for invalid user ubuntu from 176.235.182.73 port 43387 ssh2 [preauth]
Sep 30 14:43:09 np0005463148.novalocal sshd-session[1135]: Disconnecting invalid user ubuntu 176.235.182.73 port 43387: Too many authentication failures [preauth]
Sep 30 14:43:10 np0005463148.novalocal sshd-session[1137]: Invalid user ubuntu from 176.235.182.73 port 43920
Sep 30 14:43:11 np0005463148.novalocal sshd-session[1137]: Received disconnect from 176.235.182.73 port 43920:11: disconnected by user [preauth]
Sep 30 14:43:11 np0005463148.novalocal sshd-session[1137]: Disconnected from invalid user ubuntu 176.235.182.73 port 43920 [preauth]
Sep 30 14:43:12 np0005463148.novalocal sshd-session[1139]: Invalid user pi from 176.235.182.73 port 44325
Sep 30 14:43:12 np0005463148.novalocal sshd-session[1139]: Received disconnect from 176.235.182.73 port 44325:11: disconnected by user [preauth]
Sep 30 14:43:12 np0005463148.novalocal sshd-session[1139]: Disconnected from invalid user pi 176.235.182.73 port 44325 [preauth]
Sep 30 14:43:13 np0005463148.novalocal sshd-session[1141]: Invalid user baikal from 176.235.182.73 port 44774
Sep 30 14:43:13 np0005463148.novalocal sshd-session[1141]: Received disconnect from 176.235.182.73 port 44774:11: disconnected by user [preauth]
Sep 30 14:43:13 np0005463148.novalocal sshd-session[1141]: Disconnected from invalid user baikal 176.235.182.73 port 44774 [preauth]
Sep 30 14:43:45 np0005463148.novalocal sshd-session[1143]: Invalid user test1 from 103.26.136.173 port 43528
Sep 30 14:43:45 np0005463148.novalocal sshd-session[1143]: Received disconnect from 103.26.136.173 port 43528:11: Bye Bye [preauth]
Sep 30 14:43:45 np0005463148.novalocal sshd-session[1143]: Disconnected from invalid user test1 103.26.136.173 port 43528 [preauth]
Sep 30 14:44:08 np0005463148.novalocal sshd-session[1145]: Invalid user superadmin from 45.61.187.220 port 56978
Sep 30 14:44:08 np0005463148.novalocal sshd-session[1145]: Received disconnect from 45.61.187.220 port 56978:11: Bye Bye [preauth]
Sep 30 14:44:08 np0005463148.novalocal sshd-session[1145]: Disconnected from invalid user superadmin 45.61.187.220 port 56978 [preauth]
Sep 30 14:45:03 np0005463148.novalocal sshd-session[1148]: Invalid user fabio from 103.26.136.173 port 39346
Sep 30 14:45:04 np0005463148.novalocal sshd-session[1148]: Received disconnect from 103.26.136.173 port 39346:11: Bye Bye [preauth]
Sep 30 14:45:04 np0005463148.novalocal sshd-session[1148]: Disconnected from invalid user fabio 103.26.136.173 port 39346 [preauth]
Sep 30 14:45:20 np0005463148.novalocal sshd-session[1150]: Invalid user odoo from 45.61.187.220 port 51480
Sep 30 14:45:20 np0005463148.novalocal sshd-session[1150]: Received disconnect from 45.61.187.220 port 51480:11: Bye Bye [preauth]
Sep 30 14:45:20 np0005463148.novalocal sshd-session[1150]: Disconnected from invalid user odoo 45.61.187.220 port 51480 [preauth]
Sep 30 14:46:23 np0005463148.novalocal sshd-session[1153]: Invalid user personal from 103.26.136.173 port 35168
Sep 30 14:46:23 np0005463148.novalocal sshd-session[1153]: Received disconnect from 103.26.136.173 port 35168:11: Bye Bye [preauth]
Sep 30 14:46:23 np0005463148.novalocal sshd-session[1153]: Disconnected from invalid user personal 103.26.136.173 port 35168 [preauth]
Sep 30 14:46:28 np0005463148.novalocal sshd-session[1155]: Received disconnect from 45.61.187.220 port 45982:11: Bye Bye [preauth]
Sep 30 14:46:28 np0005463148.novalocal sshd-session[1155]: Disconnected from authenticating user root 45.61.187.220 port 45982 [preauth]
Sep 30 14:46:34 np0005463148.novalocal sshd-session[1157]: Connection closed by authenticating user root 185.156.73.233 port 15788 [preauth]
Sep 30 14:47:31 np0005463148.novalocal sshd-session[1159]: Invalid user aryan from 45.61.187.220 port 40484
Sep 30 14:47:31 np0005463148.novalocal sshd-session[1159]: Received disconnect from 45.61.187.220 port 40484:11: Bye Bye [preauth]
Sep 30 14:47:31 np0005463148.novalocal sshd-session[1159]: Disconnected from invalid user aryan 45.61.187.220 port 40484 [preauth]
Sep 30 14:47:38 np0005463148.novalocal sshd-session[1161]: Invalid user test from 103.26.136.173 port 59218
Sep 30 14:47:39 np0005463148.novalocal sshd-session[1161]: Received disconnect from 103.26.136.173 port 59218:11: Bye Bye [preauth]
Sep 30 14:47:39 np0005463148.novalocal sshd-session[1161]: Disconnected from invalid user test 103.26.136.173 port 59218 [preauth]
Sep 30 14:48:34 np0005463148.novalocal sshd-session[1163]: Invalid user fauzi from 45.61.187.220 port 34986
Sep 30 14:48:34 np0005463148.novalocal sshd-session[1163]: Received disconnect from 45.61.187.220 port 34986:11: Bye Bye [preauth]
Sep 30 14:48:34 np0005463148.novalocal sshd-session[1163]: Disconnected from invalid user fauzi 45.61.187.220 port 34986 [preauth]
Sep 30 14:48:51 np0005463148.novalocal sshd-session[1165]: Received disconnect from 103.26.136.173 port 55026:11: Bye Bye [preauth]
Sep 30 14:48:51 np0005463148.novalocal sshd-session[1165]: Disconnected from authenticating user root 103.26.136.173 port 55026 [preauth]
Sep 30 14:49:37 np0005463148.novalocal sshd-session[1167]: Invalid user luis from 45.61.187.220 port 57722
Sep 30 14:49:37 np0005463148.novalocal sshd-session[1167]: Received disconnect from 45.61.187.220 port 57722:11: Bye Bye [preauth]
Sep 30 14:49:37 np0005463148.novalocal sshd-session[1167]: Disconnected from invalid user luis 45.61.187.220 port 57722 [preauth]
Sep 30 14:50:05 np0005463148.novalocal sshd-session[1170]: Received disconnect from 103.26.136.173 port 50834:11: Bye Bye [preauth]
Sep 30 14:50:05 np0005463148.novalocal sshd-session[1170]: Disconnected from authenticating user root 103.26.136.173 port 50834 [preauth]
Sep 30 14:50:44 np0005463148.novalocal sshd-session[1172]: Received disconnect from 45.61.187.220 port 52224:11: Bye Bye [preauth]
Sep 30 14:50:44 np0005463148.novalocal sshd-session[1172]: Disconnected from authenticating user root 45.61.187.220 port 52224 [preauth]
Sep 30 14:51:21 np0005463148.novalocal sshd-session[1175]: Invalid user odoo from 103.26.136.173 port 46648
Sep 30 14:51:21 np0005463148.novalocal sshd-session[1175]: Received disconnect from 103.26.136.173 port 46648:11: Bye Bye [preauth]
Sep 30 14:51:21 np0005463148.novalocal sshd-session[1175]: Disconnected from invalid user odoo 103.26.136.173 port 46648 [preauth]
Sep 30 14:51:52 np0005463148.novalocal sshd-session[1177]: Invalid user ha from 45.61.187.220 port 46726
Sep 30 14:51:52 np0005463148.novalocal sshd-session[1177]: Received disconnect from 45.61.187.220 port 46726:11: Bye Bye [preauth]
Sep 30 14:51:52 np0005463148.novalocal sshd-session[1177]: Disconnected from invalid user ha 45.61.187.220 port 46726 [preauth]
Sep 30 14:52:28 np0005463148.novalocal sshd-session[1180]: Invalid user pi from 77.181.207.70 port 63277
Sep 30 14:52:29 np0005463148.novalocal sshd-session[1180]: Connection closed by invalid user pi 77.181.207.70 port 63277 [preauth]
Sep 30 14:52:29 np0005463148.novalocal sshd-session[1182]: Invalid user pi from 77.181.207.70 port 62425
Sep 30 14:52:29 np0005463148.novalocal sshd-session[1182]: Connection closed by invalid user pi 77.181.207.70 port 62425 [preauth]
Sep 30 14:52:40 np0005463148.novalocal sshd-session[1184]: Invalid user mapr from 103.26.136.173 port 42466
Sep 30 14:52:40 np0005463148.novalocal sshd-session[1184]: Received disconnect from 103.26.136.173 port 42466:11: Bye Bye [preauth]
Sep 30 14:52:40 np0005463148.novalocal sshd-session[1184]: Disconnected from invalid user mapr 103.26.136.173 port 42466 [preauth]
Sep 30 14:52:56 np0005463148.novalocal sshd-session[1186]: Invalid user vignesh from 45.61.187.220 port 41230
Sep 30 14:52:56 np0005463148.novalocal sshd-session[1186]: Received disconnect from 45.61.187.220 port 41230:11: Bye Bye [preauth]
Sep 30 14:52:56 np0005463148.novalocal sshd-session[1186]: Disconnected from invalid user vignesh 45.61.187.220 port 41230 [preauth]
Sep 30 14:53:55 np0005463148.novalocal sshd-session[1188]: Invalid user contabilidad from 103.26.136.173 port 38280
Sep 30 14:53:55 np0005463148.novalocal sshd-session[1188]: Received disconnect from 103.26.136.173 port 38280:11: Bye Bye [preauth]
Sep 30 14:53:55 np0005463148.novalocal sshd-session[1188]: Disconnected from invalid user contabilidad 103.26.136.173 port 38280 [preauth]
Sep 30 14:53:57 np0005463148.novalocal sshd[1007]: Timeout before authentication for connection from 123.56.220.219 to 38.102.83.102, pid = 1179
Sep 30 14:53:59 np0005463148.novalocal sshd-session[1190]: Received disconnect from 45.61.187.220 port 35732:11: Bye Bye [preauth]
Sep 30 14:53:59 np0005463148.novalocal sshd-session[1190]: Disconnected from authenticating user root 45.61.187.220 port 35732 [preauth]
Sep 30 14:55:04 np0005463148.novalocal sshd-session[1192]: Invalid user pwserver from 45.61.187.220 port 58466
Sep 30 14:55:04 np0005463148.novalocal sshd-session[1192]: Received disconnect from 45.61.187.220 port 58466:11: Bye Bye [preauth]
Sep 30 14:55:04 np0005463148.novalocal sshd-session[1192]: Disconnected from invalid user pwserver 45.61.187.220 port 58466 [preauth]
Sep 30 14:55:12 np0005463148.novalocal sshd-session[1194]: Invalid user wiki from 103.26.136.173 port 34092
Sep 30 14:55:12 np0005463148.novalocal sshd-session[1194]: Received disconnect from 103.26.136.173 port 34092:11: Bye Bye [preauth]
Sep 30 14:55:12 np0005463148.novalocal sshd-session[1194]: Disconnected from invalid user wiki 103.26.136.173 port 34092 [preauth]
Sep 30 14:56:11 np0005463148.novalocal sshd-session[1197]: Invalid user test1 from 45.61.187.220 port 52968
Sep 30 14:56:11 np0005463148.novalocal sshd-session[1197]: Received disconnect from 45.61.187.220 port 52968:11: Bye Bye [preauth]
Sep 30 14:56:11 np0005463148.novalocal sshd-session[1197]: Disconnected from invalid user test1 45.61.187.220 port 52968 [preauth]
Sep 30 14:56:31 np0005463148.novalocal sshd-session[1199]: Invalid user user33 from 103.26.136.173 port 58144
Sep 30 14:56:32 np0005463148.novalocal sshd-session[1199]: Received disconnect from 103.26.136.173 port 58144:11: Bye Bye [preauth]
Sep 30 14:56:32 np0005463148.novalocal sshd-session[1199]: Disconnected from invalid user user33 103.26.136.173 port 58144 [preauth]
Sep 30 14:56:44 np0005463148.novalocal sshd-session[1201]: Invalid user ubnt from 80.94.95.115 port 23844
Sep 30 14:56:44 np0005463148.novalocal sshd-session[1201]: Connection closed by invalid user ubnt 80.94.95.115 port 23844 [preauth]
Sep 30 14:57:19 np0005463148.novalocal sshd-session[1205]: Invalid user yoyo from 45.61.187.220 port 47470
Sep 30 14:57:20 np0005463148.novalocal sshd-session[1205]: Received disconnect from 45.61.187.220 port 47470:11: Bye Bye [preauth]
Sep 30 14:57:20 np0005463148.novalocal sshd-session[1205]: Disconnected from invalid user yoyo 45.61.187.220 port 47470 [preauth]
Sep 30 14:57:54 np0005463148.novalocal sshd-session[1207]: Invalid user samir from 103.26.136.173 port 53968
Sep 30 14:57:54 np0005463148.novalocal sshd-session[1207]: Received disconnect from 103.26.136.173 port 53968:11: Bye Bye [preauth]
Sep 30 14:57:54 np0005463148.novalocal sshd-session[1207]: Disconnected from invalid user samir 103.26.136.173 port 53968 [preauth]
Sep 30 14:58:32 np0005463148.novalocal sshd-session[1209]: Invalid user pre from 45.61.187.220 port 41972
Sep 30 14:58:32 np0005463148.novalocal sshd-session[1209]: Received disconnect from 45.61.187.220 port 41972:11: Bye Bye [preauth]
Sep 30 14:58:32 np0005463148.novalocal sshd-session[1209]: Disconnected from invalid user pre 45.61.187.220 port 41972 [preauth]
Sep 30 14:59:12 np0005463148.novalocal sshd-session[1211]: Received disconnect from 103.26.136.173 port 49784:11: Bye Bye [preauth]
Sep 30 14:59:12 np0005463148.novalocal sshd-session[1211]: Disconnected from authenticating user root 103.26.136.173 port 49784 [preauth]
Sep 30 14:59:44 np0005463148.novalocal sshd-session[1213]: Invalid user test from 45.61.187.220 port 36474
Sep 30 14:59:44 np0005463148.novalocal sshd-session[1213]: Received disconnect from 45.61.187.220 port 36474:11: Bye Bye [preauth]
Sep 30 14:59:44 np0005463148.novalocal sshd-session[1213]: Disconnected from invalid user test 45.61.187.220 port 36474 [preauth]
Sep 30 15:00:29 np0005463148.novalocal sshd-session[1215]: Invalid user forest from 103.26.136.173 port 45596
Sep 30 15:00:29 np0005463148.novalocal sshd-session[1215]: Received disconnect from 103.26.136.173 port 45596:11: Bye Bye [preauth]
Sep 30 15:00:29 np0005463148.novalocal sshd-session[1215]: Disconnected from invalid user forest 103.26.136.173 port 45596 [preauth]
Sep 30 15:00:48 np0005463148.novalocal sshd-session[1217]: Invalid user jinhan from 45.61.187.220 port 59208
Sep 30 15:00:48 np0005463148.novalocal sshd-session[1217]: Received disconnect from 45.61.187.220 port 59208:11: Bye Bye [preauth]
Sep 30 15:00:48 np0005463148.novalocal sshd-session[1217]: Disconnected from invalid user jinhan 45.61.187.220 port 59208 [preauth]
Sep 30 15:01:01 np0005463148.novalocal CROND[1221]: (root) CMD (run-parts /etc/cron.hourly)
Sep 30 15:01:01 np0005463148.novalocal run-parts[1224]: (/etc/cron.hourly) starting 0anacron
Sep 30 15:01:01 np0005463148.novalocal anacron[1232]: Anacron started on 2025-09-30
Sep 30 15:01:01 np0005463148.novalocal anacron[1232]: Will run job `cron.daily' in 42 min.
Sep 30 15:01:01 np0005463148.novalocal anacron[1232]: Will run job `cron.weekly' in 62 min.
Sep 30 15:01:01 np0005463148.novalocal anacron[1232]: Will run job `cron.monthly' in 82 min.
Sep 30 15:01:01 np0005463148.novalocal anacron[1232]: Jobs will be executed sequentially
Sep 30 15:01:02 np0005463148.novalocal run-parts[1234]: (/etc/cron.hourly) finished 0anacron
Sep 30 15:01:02 np0005463148.novalocal CROND[1220]: (root) CMDEND (run-parts /etc/cron.hourly)
Sep 30 15:01:44 np0005463148.novalocal sshd-session[1235]: Invalid user notes from 103.26.136.173 port 41412
Sep 30 15:01:44 np0005463148.novalocal sshd-session[1235]: Received disconnect from 103.26.136.173 port 41412:11: Bye Bye [preauth]
Sep 30 15:01:44 np0005463148.novalocal sshd-session[1235]: Disconnected from invalid user notes 103.26.136.173 port 41412 [preauth]
Sep 30 15:01:50 np0005463148.novalocal sshd-session[1237]: Received disconnect from 45.61.187.220 port 53710:11: Bye Bye [preauth]
Sep 30 15:01:50 np0005463148.novalocal sshd-session[1237]: Disconnected from authenticating user root 45.61.187.220 port 53710 [preauth]
Sep 30 15:02:56 np0005463148.novalocal sshd-session[1239]: Invalid user ecc from 45.61.187.220 port 48212
Sep 30 15:02:56 np0005463148.novalocal sshd-session[1239]: Received disconnect from 45.61.187.220 port 48212:11: Bye Bye [preauth]
Sep 30 15:02:56 np0005463148.novalocal sshd-session[1239]: Disconnected from invalid user ecc 45.61.187.220 port 48212 [preauth]
Sep 30 15:02:59 np0005463148.novalocal sshd-session[1241]: Received disconnect from 103.26.136.173 port 37228:11: Bye Bye [preauth]
Sep 30 15:02:59 np0005463148.novalocal sshd-session[1241]: Disconnected from authenticating user root 103.26.136.173 port 37228 [preauth]
Sep 30 15:03:04 np0005463148.novalocal sshd-session[1243]: Connection closed by authenticating user root 185.156.73.233 port 23868 [preauth]
Sep 30 15:04:07 np0005463148.novalocal sshd-session[1245]: Invalid user choi from 45.61.187.220 port 42714
Sep 30 15:04:07 np0005463148.novalocal sshd-session[1245]: Received disconnect from 45.61.187.220 port 42714:11: Bye Bye [preauth]
Sep 30 15:04:07 np0005463148.novalocal sshd-session[1245]: Disconnected from invalid user choi 45.61.187.220 port 42714 [preauth]
Sep 30 15:04:21 np0005463148.novalocal sshd-session[1247]: Invalid user foundry from 103.26.136.173 port 33046
Sep 30 15:04:21 np0005463148.novalocal sshd-session[1247]: Received disconnect from 103.26.136.173 port 33046:11: Bye Bye [preauth]
Sep 30 15:04:21 np0005463148.novalocal sshd-session[1247]: Disconnected from invalid user foundry 103.26.136.173 port 33046 [preauth]
Sep 30 15:05:10 np0005463148.novalocal sshd-session[1249]: Invalid user user1 from 45.61.187.220 port 37216
Sep 30 15:05:10 np0005463148.novalocal sshd-session[1249]: Received disconnect from 45.61.187.220 port 37216:11: Bye Bye [preauth]
Sep 30 15:05:10 np0005463148.novalocal sshd-session[1249]: Disconnected from invalid user user1 45.61.187.220 port 37216 [preauth]
Sep 30 15:05:36 np0005463148.novalocal sshd-session[1251]: Received disconnect from 103.26.136.173 port 57086:11: Bye Bye [preauth]
Sep 30 15:05:36 np0005463148.novalocal sshd-session[1251]: Disconnected from authenticating user root 103.26.136.173 port 57086 [preauth]
Sep 30 15:06:14 np0005463148.novalocal sshd-session[1253]: Invalid user minecraft from 45.61.187.220 port 59950
Sep 30 15:06:14 np0005463148.novalocal sshd-session[1253]: Received disconnect from 45.61.187.220 port 59950:11: Bye Bye [preauth]
Sep 30 15:06:14 np0005463148.novalocal sshd-session[1253]: Disconnected from invalid user minecraft 45.61.187.220 port 59950 [preauth]
Sep 30 15:06:50 np0005463148.novalocal sshd-session[1256]: Invalid user test3 from 103.26.136.173 port 52900
Sep 30 15:06:50 np0005463148.novalocal sshd-session[1256]: Received disconnect from 103.26.136.173 port 52900:11: Bye Bye [preauth]
Sep 30 15:06:50 np0005463148.novalocal sshd-session[1256]: Disconnected from invalid user test3 103.26.136.173 port 52900 [preauth]
Sep 30 15:07:16 np0005463148.novalocal sshd-session[1258]: Invalid user jhall from 45.61.187.220 port 54452
Sep 30 15:07:16 np0005463148.novalocal sshd-session[1258]: Received disconnect from 45.61.187.220 port 54452:11: Bye Bye [preauth]
Sep 30 15:07:16 np0005463148.novalocal sshd-session[1258]: Disconnected from invalid user jhall 45.61.187.220 port 54452 [preauth]
Sep 30 15:08:03 np0005463148.novalocal sshd-session[1260]: Invalid user minecraft from 103.26.136.173 port 48714
Sep 30 15:08:03 np0005463148.novalocal sshd-session[1260]: Received disconnect from 103.26.136.173 port 48714:11: Bye Bye [preauth]
Sep 30 15:08:03 np0005463148.novalocal sshd-session[1260]: Disconnected from invalid user minecraft 103.26.136.173 port 48714 [preauth]
Sep 30 15:08:19 np0005463148.novalocal sshd-session[1265]: Invalid user ubuntu from 45.61.187.220 port 48954
Sep 30 15:08:19 np0005463148.novalocal sshd-session[1265]: Received disconnect from 45.61.187.220 port 48954:11: Bye Bye [preauth]
Sep 30 15:08:19 np0005463148.novalocal sshd-session[1265]: Disconnected from invalid user ubuntu 45.61.187.220 port 48954 [preauth]
Sep 30 15:08:34 np0005463148.novalocal sshd-session[1263]: error: kex_exchange_identification: read: Connection reset by peer
Sep 30 15:08:34 np0005463148.novalocal sshd-session[1263]: Connection reset by 45.140.17.97 port 4734
Sep 30 15:09:23 np0005463148.novalocal sshd-session[1268]: Invalid user sammy from 103.26.136.173 port 44530
Sep 30 15:09:23 np0005463148.novalocal sshd-session[1268]: Received disconnect from 103.26.136.173 port 44530:11: Bye Bye [preauth]
Sep 30 15:09:23 np0005463148.novalocal sshd-session[1268]: Disconnected from invalid user sammy 103.26.136.173 port 44530 [preauth]
Sep 30 15:09:25 np0005463148.novalocal sshd-session[1270]: Invalid user foundry from 45.61.187.220 port 43456
Sep 30 15:09:25 np0005463148.novalocal sshd-session[1270]: Received disconnect from 45.61.187.220 port 43456:11: Bye Bye [preauth]
Sep 30 15:09:25 np0005463148.novalocal sshd-session[1270]: Disconnected from invalid user foundry 45.61.187.220 port 43456 [preauth]
Sep 30 15:10:36 np0005463148.novalocal sshd-session[1272]: Invalid user calibre from 45.61.187.220 port 37958
Sep 30 15:10:36 np0005463148.novalocal sshd-session[1272]: Received disconnect from 45.61.187.220 port 37958:11: Bye Bye [preauth]
Sep 30 15:10:36 np0005463148.novalocal sshd-session[1272]: Disconnected from invalid user calibre 45.61.187.220 port 37958 [preauth]
Sep 30 15:10:39 np0005463148.novalocal sshd-session[1274]: Invalid user robot from 103.26.136.173 port 40348
Sep 30 15:10:40 np0005463148.novalocal sshd-session[1274]: Received disconnect from 103.26.136.173 port 40348:11: Bye Bye [preauth]
Sep 30 15:10:40 np0005463148.novalocal sshd-session[1274]: Disconnected from invalid user robot 103.26.136.173 port 40348 [preauth]
Sep 30 15:11:46 np0005463148.novalocal sshd-session[1276]: Received disconnect from 45.61.187.220 port 60692:11: Bye Bye [preauth]
Sep 30 15:11:46 np0005463148.novalocal sshd-session[1276]: Disconnected from authenticating user root 45.61.187.220 port 60692 [preauth]
Sep 30 15:11:57 np0005463148.novalocal sshd-session[1279]: Received disconnect from 103.26.136.173 port 36164:11: Bye Bye [preauth]
Sep 30 15:11:57 np0005463148.novalocal sshd-session[1279]: Disconnected from authenticating user root 103.26.136.173 port 36164 [preauth]
Sep 30 15:12:50 np0005463148.novalocal systemd[1]: Starting dnf makecache...
Sep 30 15:12:50 np0005463148.novalocal dnf[1282]: Failed determining last makecache time.
Sep 30 15:12:51 np0005463148.novalocal dnf[1282]: CentOS Stream 9 - BaseOS                         35 kB/s | 7.0 kB     00:00
Sep 30 15:12:51 np0005463148.novalocal dnf[1282]: CentOS Stream 9 - AppStream                      26 kB/s | 7.1 kB     00:00
Sep 30 15:12:52 np0005463148.novalocal dnf[1282]: CentOS Stream 9 - CRB                            28 kB/s | 6.9 kB     00:00
Sep 30 15:12:52 np0005463148.novalocal dnf[1282]: CentOS Stream 9 - Extras packages                50 kB/s | 8.0 kB     00:00
Sep 30 15:12:52 np0005463148.novalocal dnf[1282]: Metadata cache created.
Sep 30 15:12:52 np0005463148.novalocal systemd[1]: dnf-makecache.service: Deactivated successfully.
Sep 30 15:12:52 np0005463148.novalocal systemd[1]: Finished dnf makecache.
Sep 30 15:12:55 np0005463148.novalocal sshd-session[1291]: Invalid user qadmin from 45.61.187.220 port 55194
Sep 30 15:12:55 np0005463148.novalocal sshd-session[1291]: Received disconnect from 45.61.187.220 port 55194:11: Bye Bye [preauth]
Sep 30 15:12:55 np0005463148.novalocal sshd-session[1291]: Disconnected from invalid user qadmin 45.61.187.220 port 55194 [preauth]
Sep 30 15:13:06 np0005463148.novalocal sshd-session[1293]: Received disconnect from 23.137.255.140 port 6354:11: Bye Bye [preauth]
Sep 30 15:13:06 np0005463148.novalocal sshd-session[1293]: Disconnected from authenticating user root 23.137.255.140 port 6354 [preauth]
Sep 30 15:13:06 np0005463148.novalocal sshd-session[1294]: Connection closed by authenticating user root 185.156.73.233 port 27966 [preauth]
Sep 30 15:13:13 np0005463148.novalocal sshd-session[1297]: Received disconnect from 103.26.136.173 port 60210:11: Bye Bye [preauth]
Sep 30 15:13:13 np0005463148.novalocal sshd-session[1297]: Disconnected from authenticating user root 103.26.136.173 port 60210 [preauth]
Sep 30 15:13:58 np0005463148.novalocal sshd-session[1299]: Received disconnect from 45.61.187.220 port 49696:11: Bye Bye [preauth]
Sep 30 15:13:58 np0005463148.novalocal sshd-session[1299]: Disconnected from authenticating user root 45.61.187.220 port 49696 [preauth]
Sep 30 15:14:27 np0005463148.novalocal sshd-session[1302]: Invalid user rajeev from 103.26.136.173 port 56020
Sep 30 15:14:27 np0005463148.novalocal sshd-session[1302]: Received disconnect from 103.26.136.173 port 56020:11: Bye Bye [preauth]
Sep 30 15:14:27 np0005463148.novalocal sshd-session[1302]: Disconnected from invalid user rajeev 103.26.136.173 port 56020 [preauth]
Sep 30 15:15:01 np0005463148.novalocal sshd-session[1305]: Invalid user applmgr from 45.61.187.220 port 44198
Sep 30 15:15:01 np0005463148.novalocal sshd-session[1305]: Received disconnect from 45.61.187.220 port 44198:11: Bye Bye [preauth]
Sep 30 15:15:01 np0005463148.novalocal sshd-session[1305]: Disconnected from invalid user applmgr 45.61.187.220 port 44198 [preauth]
Sep 30 15:15:44 np0005463148.novalocal sshd-session[1307]: Invalid user amit from 103.26.136.173 port 51836
Sep 30 15:15:44 np0005463148.novalocal sshd-session[1307]: Received disconnect from 103.26.136.173 port 51836:11: Bye Bye [preauth]
Sep 30 15:15:44 np0005463148.novalocal sshd-session[1307]: Disconnected from invalid user amit 103.26.136.173 port 51836 [preauth]
Sep 30 15:16:07 np0005463148.novalocal sshd-session[1309]: Received disconnect from 45.61.187.220 port 38700:11: Bye Bye [preauth]
Sep 30 15:16:07 np0005463148.novalocal sshd-session[1309]: Disconnected from authenticating user root 45.61.187.220 port 38700 [preauth]
Sep 30 15:17:02 np0005463148.novalocal sshd-session[1311]: Invalid user tv from 103.26.136.173 port 47658
Sep 30 15:17:03 np0005463148.novalocal sshd-session[1311]: Received disconnect from 103.26.136.173 port 47658:11: Bye Bye [preauth]
Sep 30 15:17:03 np0005463148.novalocal sshd-session[1311]: Disconnected from invalid user tv 103.26.136.173 port 47658 [preauth]
Sep 30 15:17:15 np0005463148.novalocal sshd-session[1313]: Received disconnect from 45.61.187.220 port 33202:11: Bye Bye [preauth]
Sep 30 15:17:15 np0005463148.novalocal sshd-session[1313]: Disconnected from authenticating user root 45.61.187.220 port 33202 [preauth]
Sep 30 15:18:19 np0005463148.novalocal sshd-session[1316]: Invalid user webserver from 45.61.187.220 port 55936
Sep 30 15:18:19 np0005463148.novalocal sshd-session[1316]: Received disconnect from 45.61.187.220 port 55936:11: Bye Bye [preauth]
Sep 30 15:18:19 np0005463148.novalocal sshd-session[1316]: Disconnected from invalid user webserver 45.61.187.220 port 55936 [preauth]
Sep 30 15:18:21 np0005463148.novalocal sshd-session[1318]: Invalid user supports from 103.26.136.173 port 43474
Sep 30 15:18:22 np0005463148.novalocal sshd-session[1318]: Received disconnect from 103.26.136.173 port 43474:11: Bye Bye [preauth]
Sep 30 15:18:22 np0005463148.novalocal sshd-session[1318]: Disconnected from invalid user supports 103.26.136.173 port 43474 [preauth]
Sep 30 15:19:26 np0005463148.novalocal sshd-session[1321]: Invalid user send from 45.61.187.220 port 50442
Sep 30 15:19:26 np0005463148.novalocal sshd-session[1321]: Received disconnect from 45.61.187.220 port 50442:11: Bye Bye [preauth]
Sep 30 15:19:26 np0005463148.novalocal sshd-session[1321]: Disconnected from invalid user send 45.61.187.220 port 50442 [preauth]
Sep 30 15:19:36 np0005463148.novalocal sshd-session[1323]: Invalid user x from 103.26.136.173 port 39292
Sep 30 15:19:37 np0005463148.novalocal sshd-session[1323]: Received disconnect from 103.26.136.173 port 39292:11: Bye Bye [preauth]
Sep 30 15:19:37 np0005463148.novalocal sshd-session[1323]: Disconnected from invalid user x 103.26.136.173 port 39292 [preauth]
Sep 30 15:20:28 np0005463148.novalocal sshd-session[1325]: Invalid user oracle from 45.61.187.220 port 44944
Sep 30 15:20:28 np0005463148.novalocal sshd-session[1325]: Received disconnect from 45.61.187.220 port 44944:11: Bye Bye [preauth]
Sep 30 15:20:28 np0005463148.novalocal sshd-session[1325]: Disconnected from invalid user oracle 45.61.187.220 port 44944 [preauth]
Sep 30 15:20:52 np0005463148.novalocal sshd-session[1327]: Received disconnect from 103.26.136.173 port 35102:11: Bye Bye [preauth]
Sep 30 15:20:52 np0005463148.novalocal sshd-session[1327]: Disconnected from authenticating user root 103.26.136.173 port 35102 [preauth]
Sep 30 15:20:52 np0005463148.novalocal sshd-session[1329]: Received disconnect from 103.57.64.214 port 40038:11: Bye Bye [preauth]
Sep 30 15:20:52 np0005463148.novalocal sshd-session[1329]: Disconnected from authenticating user root 103.57.64.214 port 40038 [preauth]
Sep 30 15:21:33 np0005463148.novalocal sshd-session[1331]: Invalid user roger from 45.61.187.220 port 39446
Sep 30 15:21:33 np0005463148.novalocal sshd-session[1331]: Received disconnect from 45.61.187.220 port 39446:11: Bye Bye [preauth]
Sep 30 15:21:33 np0005463148.novalocal sshd-session[1331]: Disconnected from invalid user roger 45.61.187.220 port 39446 [preauth]
Sep 30 15:22:07 np0005463148.novalocal sshd-session[1333]: Invalid user sysadmin from 78.39.48.166 port 16741
Sep 30 15:22:07 np0005463148.novalocal sshd-session[1333]: Received disconnect from 78.39.48.166 port 16741:11: Bye Bye [preauth]
Sep 30 15:22:07 np0005463148.novalocal sshd-session[1333]: Disconnected from invalid user sysadmin 78.39.48.166 port 16741 [preauth]
Sep 30 15:22:08 np0005463148.novalocal sshd-session[1335]: Invalid user eren from 103.26.136.173 port 59154
Sep 30 15:22:09 np0005463148.novalocal sshd-session[1335]: Received disconnect from 103.26.136.173 port 59154:11: Bye Bye [preauth]
Sep 30 15:22:09 np0005463148.novalocal sshd-session[1335]: Disconnected from invalid user eren 103.26.136.173 port 59154 [preauth]
Sep 30 15:22:45 np0005463148.novalocal sshd-session[1338]: Invalid user rob from 45.61.187.220 port 33948
Sep 30 15:22:45 np0005463148.novalocal sshd-session[1338]: Received disconnect from 45.61.187.220 port 33948:11: Bye Bye [preauth]
Sep 30 15:22:45 np0005463148.novalocal sshd-session[1338]: Disconnected from invalid user rob 45.61.187.220 port 33948 [preauth]
Sep 30 15:22:58 np0005463148.novalocal sshd-session[1341]: Connection closed by authenticating user root 80.94.95.116 port 26272 [preauth]
Sep 30 15:23:27 np0005463148.novalocal sshd-session[1344]: Invalid user superadmin from 103.26.136.173 port 54968
Sep 30 15:23:27 np0005463148.novalocal sshd-session[1344]: Received disconnect from 103.26.136.173 port 54968:11: Bye Bye [preauth]
Sep 30 15:23:27 np0005463148.novalocal sshd-session[1344]: Disconnected from invalid user superadmin 103.26.136.173 port 54968 [preauth]
Sep 30 15:24:44 np0005463148.novalocal sshd-session[1346]: Invalid user rick from 103.26.136.173 port 50790
Sep 30 15:24:45 np0005463148.novalocal sshd-session[1346]: Received disconnect from 103.26.136.173 port 50790:11: Bye Bye [preauth]
Sep 30 15:24:45 np0005463148.novalocal sshd-session[1346]: Disconnected from invalid user rick 103.26.136.173 port 50790 [preauth]
Sep 30 15:25:00 np0005463148.novalocal sshd-session[1348]: Received disconnect from 78.39.48.166 port 19656:11: Bye Bye [preauth]
Sep 30 15:25:00 np0005463148.novalocal sshd-session[1348]: Disconnected from authenticating user root 78.39.48.166 port 19656 [preauth]
Sep 30 15:25:16 np0005463148.novalocal sshd[1007]: Timeout before authentication for connection from 153.101.132.65 to 38.102.83.102, pid = 1343
Sep 30 15:25:20 np0005463148.novalocal sshd-session[1350]: Invalid user vncuser from 103.57.64.214 port 38370
Sep 30 15:25:20 np0005463148.novalocal sshd-session[1350]: Received disconnect from 103.57.64.214 port 38370:11: Bye Bye [preauth]
Sep 30 15:25:20 np0005463148.novalocal sshd-session[1350]: Disconnected from invalid user vncuser 103.57.64.214 port 38370 [preauth]
Sep 30 15:26:02 np0005463148.novalocal sshd-session[1352]: Invalid user ftpuser1 from 103.26.136.173 port 46606
Sep 30 15:26:02 np0005463148.novalocal sshd-session[1352]: Received disconnect from 103.26.136.173 port 46606:11: Bye Bye [preauth]
Sep 30 15:26:02 np0005463148.novalocal sshd-session[1352]: Disconnected from invalid user ftpuser1 103.26.136.173 port 46606 [preauth]
Sep 30 15:26:04 np0005463148.novalocal sshd-session[1354]: Invalid user wangyao from 78.39.48.166 port 26321
Sep 30 15:26:04 np0005463148.novalocal sshd-session[1354]: Received disconnect from 78.39.48.166 port 26321:11: Bye Bye [preauth]
Sep 30 15:26:04 np0005463148.novalocal sshd-session[1354]: Disconnected from invalid user wangyao 78.39.48.166 port 26321 [preauth]
Sep 30 15:26:40 np0005463148.novalocal sshd-session[1356]: Invalid user zc from 103.57.64.214 port 40622
Sep 30 15:26:40 np0005463148.novalocal sshd-session[1356]: Received disconnect from 103.57.64.214 port 40622:11: Bye Bye [preauth]
Sep 30 15:26:40 np0005463148.novalocal sshd-session[1356]: Disconnected from invalid user zc 103.57.64.214 port 40622 [preauth]
Sep 30 15:27:04 np0005463148.novalocal sshd-session[1358]: Invalid user wow from 78.39.48.166 port 58611
Sep 30 15:27:04 np0005463148.novalocal sshd-session[1358]: Received disconnect from 78.39.48.166 port 58611:11: Bye Bye [preauth]
Sep 30 15:27:04 np0005463148.novalocal sshd-session[1358]: Disconnected from invalid user wow 78.39.48.166 port 58611 [preauth]
Sep 30 15:27:16 np0005463148.novalocal sshd-session[1360]: Received disconnect from 103.26.136.173 port 42424:11: Bye Bye [preauth]
Sep 30 15:27:16 np0005463148.novalocal sshd-session[1360]: Disconnected from authenticating user root 103.26.136.173 port 42424 [preauth]
Sep 30 15:27:54 np0005463148.novalocal sshd-session[1362]: Received disconnect from 103.57.64.214 port 49010:11: Bye Bye [preauth]
Sep 30 15:27:54 np0005463148.novalocal sshd-session[1362]: Disconnected from authenticating user root 103.57.64.214 port 49010 [preauth]
Sep 30 15:28:03 np0005463148.novalocal sshd-session[1364]: Received disconnect from 78.39.48.166 port 6641:11: Bye Bye [preauth]
Sep 30 15:28:03 np0005463148.novalocal sshd-session[1364]: Disconnected from authenticating user root 78.39.48.166 port 6641 [preauth]
Sep 30 15:29:04 np0005463148.novalocal sshd-session[1367]: Received disconnect from 78.39.48.166 port 42394:11: Bye Bye [preauth]
Sep 30 15:29:04 np0005463148.novalocal sshd-session[1367]: Disconnected from authenticating user root 78.39.48.166 port 42394 [preauth]
Sep 30 15:29:09 np0005463148.novalocal sshd-session[1369]: Invalid user zak from 103.57.64.214 port 42402
Sep 30 15:29:09 np0005463148.novalocal sshd-session[1369]: Received disconnect from 103.57.64.214 port 42402:11: Bye Bye [preauth]
Sep 30 15:29:09 np0005463148.novalocal sshd-session[1369]: Disconnected from invalid user zak 103.57.64.214 port 42402 [preauth]
Sep 30 15:30:05 np0005463148.novalocal sshd-session[1374]: Invalid user dmdba from 78.39.48.166 port 36274
Sep 30 15:30:05 np0005463148.novalocal sshd-session[1374]: Received disconnect from 78.39.48.166 port 36274:11: Bye Bye [preauth]
Sep 30 15:30:05 np0005463148.novalocal sshd-session[1374]: Disconnected from invalid user dmdba 78.39.48.166 port 36274 [preauth]
Sep 30 15:30:27 np0005463148.novalocal sshd-session[1376]: Received disconnect from 103.57.64.214 port 50230:11: Bye Bye [preauth]
Sep 30 15:30:27 np0005463148.novalocal sshd-session[1376]: Disconnected from authenticating user root 103.57.64.214 port 50230 [preauth]
Sep 30 15:31:04 np0005463148.novalocal sshd-session[1378]: Invalid user es from 78.39.48.166 port 55383
Sep 30 15:31:04 np0005463148.novalocal sshd-session[1378]: Received disconnect from 78.39.48.166 port 55383:11: Bye Bye [preauth]
Sep 30 15:31:04 np0005463148.novalocal sshd-session[1378]: Disconnected from invalid user es 78.39.48.166 port 55383 [preauth]
Sep 30 15:31:38 np0005463148.novalocal sshd-session[1380]: Connection closed by authenticating user root 185.156.73.233 port 56720 [preauth]
Sep 30 15:31:41 np0005463148.novalocal sshd-session[1382]: Invalid user wangchangyou from 103.57.64.214 port 58578
Sep 30 15:31:41 np0005463148.novalocal sshd-session[1382]: Received disconnect from 103.57.64.214 port 58578:11: Bye Bye [preauth]
Sep 30 15:31:41 np0005463148.novalocal sshd-session[1382]: Disconnected from invalid user wangchangyou 103.57.64.214 port 58578 [preauth]
Sep 30 15:32:02 np0005463148.novalocal sshd-session[1384]: Invalid user sopuser from 78.39.48.166 port 41160
Sep 30 15:32:02 np0005463148.novalocal sshd-session[1384]: Received disconnect from 78.39.48.166 port 41160:11: Bye Bye [preauth]
Sep 30 15:32:02 np0005463148.novalocal sshd-session[1384]: Disconnected from invalid user sopuser 78.39.48.166 port 41160 [preauth]
Sep 30 15:32:56 np0005463148.novalocal sshd-session[1386]: Invalid user bacon from 103.57.64.214 port 56890
Sep 30 15:32:56 np0005463148.novalocal sshd-session[1386]: Received disconnect from 103.57.64.214 port 56890:11: Bye Bye [preauth]
Sep 30 15:32:56 np0005463148.novalocal sshd-session[1386]: Disconnected from invalid user bacon 103.57.64.214 port 56890 [preauth]
Sep 30 15:33:00 np0005463148.novalocal sshd-session[1388]: Received disconnect from 78.39.48.166 port 61640:11: Bye Bye [preauth]
Sep 30 15:33:00 np0005463148.novalocal sshd-session[1388]: Disconnected from authenticating user root 78.39.48.166 port 61640 [preauth]
Sep 30 15:33:56 np0005463148.novalocal sshd-session[1391]: Invalid user wizard from 78.39.48.166 port 5292
Sep 30 15:33:56 np0005463148.novalocal sshd-session[1391]: Received disconnect from 78.39.48.166 port 5292:11: Bye Bye [preauth]
Sep 30 15:33:56 np0005463148.novalocal sshd-session[1391]: Disconnected from invalid user wizard 78.39.48.166 port 5292 [preauth]
Sep 30 15:34:09 np0005463148.novalocal sshd-session[1394]: Invalid user asa from 103.57.64.214 port 58090
Sep 30 15:34:09 np0005463148.novalocal sshd-session[1394]: Received disconnect from 103.57.64.214 port 58090:11: Bye Bye [preauth]
Sep 30 15:34:09 np0005463148.novalocal sshd-session[1394]: Disconnected from invalid user asa 103.57.64.214 port 58090 [preauth]
Sep 30 15:34:54 np0005463148.novalocal sshd-session[1396]: Invalid user dot from 78.39.48.166 port 63233
Sep 30 15:34:54 np0005463148.novalocal sshd-session[1396]: Received disconnect from 78.39.48.166 port 63233:11: Bye Bye [preauth]
Sep 30 15:34:54 np0005463148.novalocal sshd-session[1396]: Disconnected from invalid user dot 78.39.48.166 port 63233 [preauth]
Sep 30 15:35:24 np0005463148.novalocal sshd-session[1399]: Invalid user nfsuser from 103.57.64.214 port 52340
Sep 30 15:35:24 np0005463148.novalocal sshd-session[1399]: Received disconnect from 103.57.64.214 port 52340:11: Bye Bye [preauth]
Sep 30 15:35:24 np0005463148.novalocal sshd-session[1399]: Disconnected from invalid user nfsuser 103.57.64.214 port 52340 [preauth]
Sep 30 15:35:53 np0005463148.novalocal sshd-session[1401]: Received disconnect from 78.39.48.166 port 50695:11: Bye Bye [preauth]
Sep 30 15:35:53 np0005463148.novalocal sshd-session[1401]: Disconnected from authenticating user root 78.39.48.166 port 50695 [preauth]
Sep 30 15:36:41 np0005463148.novalocal sshd-session[1403]: Received disconnect from 103.57.64.214 port 58720:11: Bye Bye [preauth]
Sep 30 15:36:41 np0005463148.novalocal sshd-session[1403]: Disconnected from authenticating user root 103.57.64.214 port 58720 [preauth]
Sep 30 15:36:55 np0005463148.novalocal sshd-session[1405]: Invalid user group1 from 78.39.48.166 port 1214
Sep 30 15:36:56 np0005463148.novalocal sshd-session[1405]: Received disconnect from 78.39.48.166 port 1214:11: Bye Bye [preauth]
Sep 30 15:36:56 np0005463148.novalocal sshd-session[1405]: Disconnected from invalid user group1 78.39.48.166 port 1214 [preauth]
Sep 30 15:37:56 np0005463148.novalocal sshd-session[1407]: Invalid user pivpn from 78.39.48.166 port 58329
Sep 30 15:37:56 np0005463148.novalocal sshd-session[1407]: Received disconnect from 78.39.48.166 port 58329:11: Bye Bye [preauth]
Sep 30 15:37:56 np0005463148.novalocal sshd-session[1407]: Disconnected from invalid user pivpn 78.39.48.166 port 58329 [preauth]
Sep 30 15:37:57 np0005463148.novalocal sshd-session[1409]: Invalid user harry from 103.57.64.214 port 43698
Sep 30 15:37:57 np0005463148.novalocal sshd-session[1409]: Received disconnect from 103.57.64.214 port 43698:11: Bye Bye [preauth]
Sep 30 15:37:57 np0005463148.novalocal sshd-session[1409]: Disconnected from invalid user harry 103.57.64.214 port 43698 [preauth]
Sep 30 15:38:55 np0005463148.novalocal sshd-session[1411]: Invalid user test123 from 78.39.48.166 port 50285
Sep 30 15:38:55 np0005463148.novalocal sshd-session[1411]: Received disconnect from 78.39.48.166 port 50285:11: Bye Bye [preauth]
Sep 30 15:38:55 np0005463148.novalocal sshd-session[1411]: Disconnected from invalid user test123 78.39.48.166 port 50285 [preauth]
Sep 30 15:39:15 np0005463148.novalocal sshd-session[1414]: Invalid user superadmin from 103.57.64.214 port 54244
Sep 30 15:39:16 np0005463148.novalocal sshd-session[1414]: Received disconnect from 103.57.64.214 port 54244:11: Bye Bye [preauth]
Sep 30 15:39:16 np0005463148.novalocal sshd-session[1414]: Disconnected from invalid user superadmin 103.57.64.214 port 54244 [preauth]
Sep 30 15:39:53 np0005463148.novalocal sshd-session[1416]: Invalid user foundry from 78.39.48.166 port 8861
Sep 30 15:39:53 np0005463148.novalocal sshd-session[1416]: Received disconnect from 78.39.48.166 port 8861:11: Bye Bye [preauth]
Sep 30 15:39:53 np0005463148.novalocal sshd-session[1416]: Disconnected from invalid user foundry 78.39.48.166 port 8861 [preauth]
Sep 30 15:40:37 np0005463148.novalocal sshd-session[1418]: Invalid user ftpuser1 from 103.57.64.214 port 34704
Sep 30 15:40:37 np0005463148.novalocal sshd-session[1418]: Received disconnect from 103.57.64.214 port 34704:11: Bye Bye [preauth]
Sep 30 15:40:37 np0005463148.novalocal sshd-session[1418]: Disconnected from invalid user ftpuser1 103.57.64.214 port 34704 [preauth]
Sep 30 15:40:52 np0005463148.novalocal sshd-session[1421]: Invalid user cdp from 78.39.48.166 port 39774
Sep 30 15:40:52 np0005463148.novalocal sshd-session[1421]: Received disconnect from 78.39.48.166 port 39774:11: Bye Bye [preauth]
Sep 30 15:40:52 np0005463148.novalocal sshd-session[1421]: Disconnected from invalid user cdp 78.39.48.166 port 39774 [preauth]
Sep 30 15:41:52 np0005463148.novalocal sshd-session[1423]: Invalid user tianyu from 78.39.48.166 port 47575
Sep 30 15:41:52 np0005463148.novalocal sshd-session[1423]: Received disconnect from 78.39.48.166 port 47575:11: Bye Bye [preauth]
Sep 30 15:41:52 np0005463148.novalocal sshd-session[1423]: Disconnected from invalid user tianyu 78.39.48.166 port 47575 [preauth]
Sep 30 15:41:56 np0005463148.novalocal sshd-session[1425]: Invalid user system from 185.156.73.233 port 40528
Sep 30 15:41:56 np0005463148.novalocal sshd-session[1426]: Invalid user zu from 103.57.64.214 port 53442
Sep 30 15:41:56 np0005463148.novalocal sshd-session[1426]: Received disconnect from 103.57.64.214 port 53442:11: Bye Bye [preauth]
Sep 30 15:41:56 np0005463148.novalocal sshd-session[1426]: Disconnected from invalid user zu 103.57.64.214 port 53442 [preauth]
Sep 30 15:41:56 np0005463148.novalocal sshd-session[1425]: Connection closed by invalid user system 185.156.73.233 port 40528 [preauth]
Sep 30 15:42:53 np0005463148.novalocal sshd-session[1430]: Invalid user vintagestory from 78.39.48.166 port 41344
Sep 30 15:42:54 np0005463148.novalocal sshd-session[1430]: Received disconnect from 78.39.48.166 port 41344:11: Bye Bye [preauth]
Sep 30 15:42:54 np0005463148.novalocal sshd-session[1430]: Disconnected from invalid user vintagestory 78.39.48.166 port 41344 [preauth]
Sep 30 15:43:02 np0005463148.novalocal anacron[1232]: Job `cron.daily' started
Sep 30 15:43:02 np0005463148.novalocal anacron[1232]: Job `cron.daily' terminated
Sep 30 15:43:14 np0005463148.novalocal sshd-session[1434]: Invalid user deploy from 103.57.64.214 port 38948
Sep 30 15:43:15 np0005463148.novalocal sshd-session[1434]: Received disconnect from 103.57.64.214 port 38948:11: Bye Bye [preauth]
Sep 30 15:43:15 np0005463148.novalocal sshd-session[1434]: Disconnected from invalid user deploy 103.57.64.214 port 38948 [preauth]
Sep 30 15:43:53 np0005463148.novalocal sshd-session[1436]: Invalid user test from 78.39.48.166 port 49607
Sep 30 15:43:53 np0005463148.novalocal sshd-session[1436]: Received disconnect from 78.39.48.166 port 49607:11: Bye Bye [preauth]
Sep 30 15:43:53 np0005463148.novalocal sshd-session[1436]: Disconnected from invalid user test 78.39.48.166 port 49607 [preauth]
Sep 30 15:44:29 np0005463148.novalocal sshd-session[1438]: Invalid user ubuntu from 103.57.64.214 port 49022
Sep 30 15:44:29 np0005463148.novalocal sshd-session[1438]: Received disconnect from 103.57.64.214 port 49022:11: Bye Bye [preauth]
Sep 30 15:44:29 np0005463148.novalocal sshd-session[1438]: Disconnected from invalid user ubuntu 103.57.64.214 port 49022 [preauth]
Sep 30 15:44:50 np0005463148.novalocal sshd-session[1441]: Invalid user anne from 78.39.48.166 port 11068
Sep 30 15:44:50 np0005463148.novalocal sshd-session[1441]: Received disconnect from 78.39.48.166 port 11068:11: Bye Bye [preauth]
Sep 30 15:44:50 np0005463148.novalocal sshd-session[1441]: Disconnected from invalid user anne 78.39.48.166 port 11068 [preauth]
Sep 30 15:45:45 np0005463148.novalocal sshd-session[1444]: Invalid user ubuntu from 103.57.64.214 port 45278
Sep 30 15:45:45 np0005463148.novalocal sshd-session[1444]: Received disconnect from 103.57.64.214 port 45278:11: Bye Bye [preauth]
Sep 30 15:45:45 np0005463148.novalocal sshd-session[1444]: Disconnected from invalid user ubuntu 103.57.64.214 port 45278 [preauth]
Sep 30 15:45:49 np0005463148.novalocal sshd-session[1446]: Invalid user trial from 78.39.48.166 port 35948
Sep 30 15:45:49 np0005463148.novalocal sshd-session[1446]: Received disconnect from 78.39.48.166 port 35948:11: Bye Bye [preauth]
Sep 30 15:45:49 np0005463148.novalocal sshd-session[1446]: Disconnected from invalid user trial 78.39.48.166 port 35948 [preauth]
Sep 30 15:46:46 np0005463148.novalocal sshd-session[1450]: Received disconnect from 78.39.48.166 port 60198:11: Bye Bye [preauth]
Sep 30 15:46:46 np0005463148.novalocal sshd-session[1450]: Disconnected from authenticating user root 78.39.48.166 port 60198 [preauth]
Sep 30 15:46:58 np0005463148.novalocal sshd-session[1453]: Invalid user test from 103.57.64.214 port 50106
Sep 30 15:46:58 np0005463148.novalocal sshd-session[1453]: Received disconnect from 103.57.64.214 port 50106:11: Bye Bye [preauth]
Sep 30 15:46:58 np0005463148.novalocal sshd-session[1453]: Disconnected from invalid user test 103.57.64.214 port 50106 [preauth]
Sep 30 15:47:45 np0005463148.novalocal sshd-session[1456]: Invalid user support from 78.39.48.166 port 45256
Sep 30 15:47:45 np0005463148.novalocal sshd-session[1456]: Received disconnect from 78.39.48.166 port 45256:11: Bye Bye [preauth]
Sep 30 15:47:45 np0005463148.novalocal sshd-session[1456]: Disconnected from invalid user support 78.39.48.166 port 45256 [preauth]
Sep 30 15:48:13 np0005463148.novalocal sshd-session[1458]: Received disconnect from 103.57.64.214 port 50016:11: Bye Bye [preauth]
Sep 30 15:48:13 np0005463148.novalocal sshd-session[1458]: Disconnected from authenticating user root 103.57.64.214 port 50016 [preauth]
Sep 30 15:48:49 np0005463148.novalocal sshd-session[1460]: Invalid user minecraft from 78.39.48.166 port 62246
Sep 30 15:48:49 np0005463148.novalocal sshd-session[1460]: Received disconnect from 78.39.48.166 port 62246:11: Bye Bye [preauth]
Sep 30 15:48:49 np0005463148.novalocal sshd-session[1460]: Disconnected from invalid user minecraft 78.39.48.166 port 62246 [preauth]
Sep 30 15:49:31 np0005463148.novalocal sshd-session[1462]: Received disconnect from 103.57.64.214 port 41046:11: Bye Bye [preauth]
Sep 30 15:49:31 np0005463148.novalocal sshd-session[1462]: Disconnected from authenticating user root 103.57.64.214 port 41046 [preauth]
Sep 30 15:49:49 np0005463148.novalocal sshd-session[1464]: Invalid user erp from 78.39.48.166 port 61133
Sep 30 15:49:49 np0005463148.novalocal sshd-session[1464]: Received disconnect from 78.39.48.166 port 61133:11: Bye Bye [preauth]
Sep 30 15:49:49 np0005463148.novalocal sshd-session[1464]: Disconnected from invalid user erp 78.39.48.166 port 61133 [preauth]
Sep 30 15:50:47 np0005463148.novalocal sshd-session[1468]: Invalid user sysadmin from 103.57.64.214 port 40454
Sep 30 15:50:47 np0005463148.novalocal sshd-session[1468]: Received disconnect from 103.57.64.214 port 40454:11: Bye Bye [preauth]
Sep 30 15:50:47 np0005463148.novalocal sshd-session[1468]: Disconnected from invalid user sysadmin 103.57.64.214 port 40454 [preauth]
Sep 30 15:50:49 np0005463148.novalocal sshd-session[1470]: Received disconnect from 78.39.48.166 port 27921:11: Bye Bye [preauth]
Sep 30 15:50:49 np0005463148.novalocal sshd-session[1470]: Disconnected from authenticating user root 78.39.48.166 port 27921 [preauth]
Sep 30 15:51:14 np0005463148.novalocal sshd-session[1473]: Invalid user pi from 80.94.95.115 port 30624
Sep 30 15:51:14 np0005463148.novalocal sshd-session[1473]: Connection closed by invalid user pi 80.94.95.115 port 30624 [preauth]
Sep 30 15:51:48 np0005463148.novalocal sshd-session[1475]: Invalid user ksiegowosc from 78.39.48.166 port 11947
Sep 30 15:51:49 np0005463148.novalocal sshd-session[1475]: Received disconnect from 78.39.48.166 port 11947:11: Bye Bye [preauth]
Sep 30 15:51:49 np0005463148.novalocal sshd-session[1475]: Disconnected from invalid user ksiegowosc 78.39.48.166 port 11947 [preauth]
Sep 30 15:52:04 np0005463148.novalocal sshd-session[1477]: Invalid user ftptest from 103.57.64.214 port 55040
Sep 30 15:52:04 np0005463148.novalocal sshd-session[1477]: Received disconnect from 103.57.64.214 port 55040:11: Bye Bye [preauth]
Sep 30 15:52:04 np0005463148.novalocal sshd-session[1477]: Disconnected from invalid user ftptest 103.57.64.214 port 55040 [preauth]
Sep 30 15:52:47 np0005463148.novalocal sshd-session[1480]: Invalid user webuser from 78.39.48.166 port 34289
Sep 30 15:52:47 np0005463148.novalocal sshd-session[1480]: Received disconnect from 78.39.48.166 port 34289:11: Bye Bye [preauth]
Sep 30 15:52:47 np0005463148.novalocal sshd-session[1480]: Disconnected from invalid user webuser 78.39.48.166 port 34289 [preauth]
Sep 30 15:53:10 np0005463148.novalocal sshd-session[1482]: Connection closed by 103.29.70.204 port 44916 [preauth]
Sep 30 15:53:20 np0005463148.novalocal sshd-session[1484]: Received disconnect from 103.57.64.214 port 47518:11: Bye Bye [preauth]
Sep 30 15:53:20 np0005463148.novalocal sshd-session[1484]: Disconnected from authenticating user root 103.57.64.214 port 47518 [preauth]
Sep 30 15:53:44 np0005463148.novalocal sshd-session[1487]: Invalid user gateway from 78.39.48.166 port 56713
Sep 30 15:53:44 np0005463148.novalocal sshd-session[1487]: Received disconnect from 78.39.48.166 port 56713:11: Bye Bye [preauth]
Sep 30 15:53:44 np0005463148.novalocal sshd-session[1487]: Disconnected from invalid user gateway 78.39.48.166 port 56713 [preauth]
Sep 30 15:54:37 np0005463148.novalocal sshd-session[1489]: Invalid user staging from 103.57.64.214 port 45162
Sep 30 15:54:37 np0005463148.novalocal sshd-session[1489]: Received disconnect from 103.57.64.214 port 45162:11: Bye Bye [preauth]
Sep 30 15:54:37 np0005463148.novalocal sshd-session[1489]: Disconnected from invalid user staging 103.57.64.214 port 45162 [preauth]
Sep 30 15:54:44 np0005463148.novalocal sshd-session[1491]: Invalid user ftptest2 from 78.39.48.166 port 51580
Sep 30 15:54:44 np0005463148.novalocal sshd-session[1491]: Received disconnect from 78.39.48.166 port 51580:11: Bye Bye [preauth]
Sep 30 15:54:44 np0005463148.novalocal sshd-session[1491]: Disconnected from invalid user ftptest2 78.39.48.166 port 51580 [preauth]
Sep 30 15:55:44 np0005463148.novalocal sshd-session[1494]: Received disconnect from 78.39.48.166 port 31165:11: Bye Bye [preauth]
Sep 30 15:55:44 np0005463148.novalocal sshd-session[1494]: Disconnected from authenticating user root 78.39.48.166 port 31165 [preauth]
Sep 30 15:55:53 np0005463148.novalocal sshd-session[1496]: Invalid user minecraft from 103.57.64.214 port 52122
Sep 30 15:55:53 np0005463148.novalocal sshd-session[1496]: Received disconnect from 103.57.64.214 port 52122:11: Bye Bye [preauth]
Sep 30 15:55:53 np0005463148.novalocal sshd-session[1496]: Disconnected from invalid user minecraft 103.57.64.214 port 52122 [preauth]
Sep 30 15:56:43 np0005463148.novalocal sshd-session[1498]: Invalid user nikita from 78.39.48.166 port 9663
Sep 30 15:56:43 np0005463148.novalocal sshd-session[1498]: Received disconnect from 78.39.48.166 port 9663:11: Bye Bye [preauth]
Sep 30 15:56:43 np0005463148.novalocal sshd-session[1498]: Disconnected from invalid user nikita 78.39.48.166 port 9663 [preauth]
Sep 30 15:57:08 np0005463148.novalocal sshd-session[1500]: Invalid user celeryuser from 103.57.64.214 port 59072
Sep 30 15:57:08 np0005463148.novalocal sshd-session[1500]: Received disconnect from 103.57.64.214 port 59072:11: Bye Bye [preauth]
Sep 30 15:57:08 np0005463148.novalocal sshd-session[1500]: Disconnected from invalid user celeryuser 103.57.64.214 port 59072 [preauth]
Sep 30 15:57:41 np0005463148.novalocal sshd-session[1502]: Invalid user superadmin from 78.39.48.166 port 41074
Sep 30 15:57:41 np0005463148.novalocal sshd-session[1502]: Received disconnect from 78.39.48.166 port 41074:11: Bye Bye [preauth]
Sep 30 15:57:41 np0005463148.novalocal sshd-session[1502]: Disconnected from invalid user superadmin 78.39.48.166 port 41074 [preauth]
Sep 30 15:58:24 np0005463148.novalocal sshd-session[1504]: Received disconnect from 103.57.64.214 port 50574:11: Bye Bye [preauth]
Sep 30 15:58:24 np0005463148.novalocal sshd-session[1504]: Disconnected from authenticating user root 103.57.64.214 port 50574 [preauth]
Sep 30 15:58:41 np0005463148.novalocal sshd-session[1506]: Received disconnect from 78.39.48.166 port 2305:11: Bye Bye [preauth]
Sep 30 15:58:41 np0005463148.novalocal sshd-session[1506]: Disconnected from authenticating user root 78.39.48.166 port 2305 [preauth]
Sep 30 15:59:38 np0005463148.novalocal sshd-session[1508]: Invalid user admin from 80.94.95.112 port 25258
Sep 30 15:59:39 np0005463148.novalocal sshd-session[1509]: Invalid user dev from 103.57.64.214 port 53004
Sep 30 15:59:39 np0005463148.novalocal sshd-session[1509]: Received disconnect from 103.57.64.214 port 53004:11: Bye Bye [preauth]
Sep 30 15:59:39 np0005463148.novalocal sshd-session[1509]: Disconnected from invalid user dev 103.57.64.214 port 53004 [preauth]
Sep 30 15:59:39 np0005463148.novalocal sshd-session[1511]: Received disconnect from 78.39.48.166 port 56975:11: Bye Bye [preauth]
Sep 30 15:59:39 np0005463148.novalocal sshd-session[1511]: Disconnected from authenticating user root 78.39.48.166 port 56975 [preauth]
Sep 30 15:59:39 np0005463148.novalocal sshd-session[1508]: Received disconnect from 80.94.95.112 port 25258:11: Bye [preauth]
Sep 30 15:59:39 np0005463148.novalocal sshd-session[1508]: Disconnected from invalid user admin 80.94.95.112 port 25258 [preauth]
Sep 30 16:00:48 np0005463148.novalocal sshd-session[1515]: Connection closed by authenticating user root 185.156.73.233 port 19002 [preauth]
Sep 30 16:00:55 np0005463148.novalocal sshd-session[1517]: Invalid user rootadmin from 103.57.64.214 port 59352
Sep 30 16:00:55 np0005463148.novalocal sshd-session[1517]: Received disconnect from 103.57.64.214 port 59352:11: Bye Bye [preauth]
Sep 30 16:00:55 np0005463148.novalocal sshd-session[1517]: Disconnected from invalid user rootadmin 103.57.64.214 port 59352 [preauth]
Sep 30 16:01:01 np0005463148.novalocal CROND[1520]: (root) CMD (run-parts /etc/cron.hourly)
Sep 30 16:01:01 np0005463148.novalocal run-parts[1523]: (/etc/cron.hourly) starting 0anacron
Sep 30 16:01:01 np0005463148.novalocal run-parts[1529]: (/etc/cron.hourly) finished 0anacron
Sep 30 16:01:01 np0005463148.novalocal CROND[1519]: (root) CMDEND (run-parts /etc/cron.hourly)
Sep 30 16:02:08 np0005463148.novalocal sshd-session[1531]: Connection closed by authenticating user root 80.94.95.115 port 34056 [preauth]
Sep 30 16:02:13 np0005463148.novalocal sshd-session[1533]: Received disconnect from 103.57.64.214 port 45418:11: Bye Bye [preauth]
Sep 30 16:02:13 np0005463148.novalocal sshd-session[1533]: Disconnected from authenticating user root 103.57.64.214 port 45418 [preauth]
Sep 30 16:03:01 np0005463148.novalocal anacron[1232]: Job `cron.weekly' started
Sep 30 16:03:01 np0005463148.novalocal anacron[1232]: Job `cron.weekly' terminated
Sep 30 16:03:35 np0005463148.novalocal sshd-session[1538]: Invalid user bot2 from 103.57.64.214 port 52478
Sep 30 16:03:35 np0005463148.novalocal sshd-session[1538]: Received disconnect from 103.57.64.214 port 52478:11: Bye Bye [preauth]
Sep 30 16:03:35 np0005463148.novalocal sshd-session[1538]: Disconnected from invalid user bot2 103.57.64.214 port 52478 [preauth]
Sep 30 16:04:51 np0005463148.novalocal sshd-session[1540]: Received disconnect from 103.57.64.214 port 48096:11: Bye Bye [preauth]
Sep 30 16:04:51 np0005463148.novalocal sshd-session[1540]: Disconnected from authenticating user root 103.57.64.214 port 48096 [preauth]
Sep 30 16:06:10 np0005463148.novalocal sshd-session[1542]: Received disconnect from 103.57.64.214 port 45130:11: Bye Bye [preauth]
Sep 30 16:06:10 np0005463148.novalocal sshd-session[1542]: Disconnected from authenticating user root 103.57.64.214 port 45130 [preauth]
Sep 30 16:07:30 np0005463148.novalocal sshd-session[1546]: Received disconnect from 103.57.64.214 port 56476:11: Bye Bye [preauth]
Sep 30 16:07:30 np0005463148.novalocal sshd-session[1546]: Disconnected from authenticating user root 103.57.64.214 port 56476 [preauth]
Sep 30 16:08:39 np0005463148.novalocal sshd-session[1548]: Invalid user admin from 78.128.112.74 port 55816
Sep 30 16:08:39 np0005463148.novalocal sshd-session[1548]: Connection closed by invalid user admin 78.128.112.74 port 55816 [preauth]
Sep 30 16:08:48 np0005463148.novalocal sshd-session[1550]: Invalid user foundry from 103.57.64.214 port 50712
Sep 30 16:08:48 np0005463148.novalocal sshd-session[1550]: Received disconnect from 103.57.64.214 port 50712:11: Bye Bye [preauth]
Sep 30 16:08:48 np0005463148.novalocal sshd-session[1550]: Disconnected from invalid user foundry 103.57.64.214 port 50712 [preauth]
Sep 30 16:10:03 np0005463148.novalocal sshd-session[1553]: Received disconnect from 103.57.64.214 port 46618:11: Bye Bye [preauth]
Sep 30 16:10:03 np0005463148.novalocal sshd-session[1553]: Disconnected from authenticating user root 103.57.64.214 port 46618 [preauth]
Sep 30 16:11:32 np0005463148.novalocal sshd-session[1555]: Invalid user backups from 185.156.73.233 port 16424
Sep 30 16:11:32 np0005463148.novalocal sshd-session[1555]: Connection closed by invalid user backups 185.156.73.233 port 16424 [preauth]
Sep 30 16:18:00 np0005463148.novalocal systemd[1]: Starting dnf makecache...
Sep 30 16:18:00 np0005463148.novalocal dnf[1561]: Metadata cache refreshed recently.
Sep 30 16:18:00 np0005463148.novalocal systemd[1]: dnf-makecache.service: Deactivated successfully.
Sep 30 16:18:00 np0005463148.novalocal systemd[1]: Finished dnf makecache.
Sep 30 16:20:10 np0005463148.novalocal sshd-session[1564]: Invalid user  from 47.239.3.25 port 36240
Sep 30 16:20:17 np0005463148.novalocal sshd-session[1564]: Connection closed by invalid user  47.239.3.25 port 36240 [preauth]
Sep 30 16:21:26 np0005463148.novalocal sshd-session[1566]: Connection closed by authenticating user root 80.94.95.116 port 46514 [preauth]
Sep 30 16:23:01 np0005463148.novalocal anacron[1232]: Job `cron.monthly' started
Sep 30 16:23:01 np0005463148.novalocal anacron[1232]: Job `cron.monthly' terminated
Sep 30 16:23:01 np0005463148.novalocal anacron[1232]: Normal exit (3 jobs run)
Sep 30 16:26:56 np0005463148.novalocal sshd-session[1572]: Unable to negotiate with 54.183.87.63 port 42666: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Sep 30 16:26:57 np0005463148.novalocal sshd-session[1574]: Connection closed by 54.183.87.63 port 42676 [preauth]
Sep 30 16:26:57 np0005463148.novalocal sshd-session[1576]: Unable to negotiate with 54.183.87.63 port 42692: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Sep 30 16:26:58 np0005463148.novalocal sshd-session[1578]: Unable to negotiate with 54.183.87.63 port 42706: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Sep 30 16:26:58 np0005463148.novalocal sshd-session[1580]: Connection closed by 54.183.87.63 port 42708 [preauth]
Sep 30 16:26:59 np0005463148.novalocal sshd-session[1582]: Connection closed by 54.183.87.63 port 42722 [preauth]
Sep 30 16:27:00 np0005463148.novalocal sshd-session[1584]: Unable to negotiate with 54.183.87.63 port 42732: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Sep 30 16:27:00 np0005463148.novalocal sshd-session[1586]: Unable to negotiate with 54.183.87.63 port 42734: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Sep 30 16:31:59 np0005463148.novalocal sshd-session[1590]: Connection closed by authenticating user root 80.94.95.115 port 52890 [preauth]
Sep 30 16:34:48 np0005463148.novalocal sshd-session[1594]: error: kex_exchange_identification: read: Connection reset by peer
Sep 30 16:34:48 np0005463148.novalocal sshd-session[1594]: Connection reset by 45.140.17.97 port 36833
Sep 30 16:40:45 np0005463148.novalocal sshd-session[1597]: Connection closed by authenticating user root 80.94.95.116 port 37446 [preauth]
Sep 30 16:41:49 np0005463148.novalocal sshd-session[1600]: Connection closed by 45.148.10.240 port 40214
Sep 30 16:49:39 np0005463148.novalocal sshd-session[1603]: Connection closed by authenticating user root 80.94.95.115 port 37274 [preauth]
Sep 30 16:49:58 np0005463148.novalocal sshd-session[1605]: Invalid user git2 from 120.48.39.224 port 60938
Sep 30 16:49:58 np0005463148.novalocal sshd-session[1605]: Received disconnect from 120.48.39.224 port 60938:11: Bye Bye [preauth]
Sep 30 16:49:58 np0005463148.novalocal sshd-session[1605]: Disconnected from invalid user git2 120.48.39.224 port 60938 [preauth]
Sep 30 16:55:36 np0005463148.novalocal sshd-session[1610]: Invalid user solana from 45.148.10.240 port 43570
Sep 30 16:55:36 np0005463148.novalocal sshd-session[1610]: Connection closed by invalid user solana 45.148.10.240 port 43570 [preauth]
Sep 30 16:57:04 np0005463148.novalocal sshd-session[1616]: Connection reset by 198.235.24.100 port 58538 [preauth]
Sep 30 16:57:15 np0005463148.novalocal sshd-session[1618]: Accepted publickey for zuul from 38.102.83.114 port 43012 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Sep 30 16:57:15 np0005463148.novalocal systemd[1]: Created slice User Slice of UID 1000.
Sep 30 16:57:15 np0005463148.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Sep 30 16:57:15 np0005463148.novalocal systemd-logind[789]: New session 1 of user zuul.
Sep 30 16:57:15 np0005463148.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Sep 30 16:57:15 np0005463148.novalocal systemd[1]: Starting User Manager for UID 1000...
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Queued start job for default target Main User Target.
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Created slice User Application Slice.
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Reached target Paths.
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Reached target Timers.
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Starting D-Bus User Message Bus Socket...
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Starting Create User's Volatile Files and Directories...
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Listening on D-Bus User Message Bus Socket.
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Reached target Sockets.
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Finished Create User's Volatile Files and Directories.
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Reached target Basic System.
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Reached target Main User Target.
Sep 30 16:57:15 np0005463148.novalocal systemd[1622]: Startup finished in 172ms.
Sep 30 16:57:15 np0005463148.novalocal systemd[1]: Started User Manager for UID 1000.
Sep 30 16:57:15 np0005463148.novalocal systemd[1]: Started Session 1 of User zuul.
Sep 30 16:57:15 np0005463148.novalocal sshd-session[1618]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 16:57:16 np0005463148.novalocal python3[1707]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 16:57:19 np0005463148.novalocal python3[1735]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 16:57:25 np0005463148.novalocal python3[1793]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 16:57:26 np0005463148.novalocal python3[1833]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Sep 30 16:57:28 np0005463148.novalocal python3[1859]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDJZRnlw5lxA7UBiwhIWHL9fYC6NNPpslBJpmHQk/tT7YcFtPXtFCFVX5DH+DwaLlXqqjlCzUzh7V1c8ItxRgp2aE1CNn5dfciCNfCvpELE5UvrVEHMg+Yn9jZdXBENoj2Ph3y9cVfl6lDKws6pKufo8fW/z65wwOVxAiJyYhDb4BueCFuOA8UT9u+O3aB1TSXMJe9jxldV6kUwN5sJ2cJkm9SBDd++KtEKnG7yuw6SOhh3PCNwlzKOy4McnzAXF1P2vuYvJBS+53c221epEc5ZDxcTkCBndN/OSDxnL7pVGjWkNS1eplYJ03PmNPFNRsjyrhlShMEiKrNoTroSY1HLsSdFpCfK2roJTQHnzkl4QnsZXI76ZldD/rU370gz4wDHAJm7TrUTz0scMRAOgIIH5hD7XqiVcOH9+Y2KVHFvvKXZAWgvpuozlqalQbU3/Cnb6dA7NP6tinef3MvzTyxR5BhyowXb1gha8eX9XmFoQlc9ndpgKD1dSgHYy2pQxas= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:29 np0005463148.novalocal python3[1883]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:29 np0005463148.novalocal python3[1982]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 16:57:30 np0005463148.novalocal python3[2053]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759251449.2078838-230-191590398789636/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=9a7c1f9fca9042f6bc5bf6667fbd0ac7_id_rsa follow=False checksum=67b7651487d5caa423673a36898f80db9e6af5ef backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:30 np0005463148.novalocal python3[2176]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 16:57:31 np0005463148.novalocal python3[2247]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759251450.3370025-274-42336925964320/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=9a7c1f9fca9042f6bc5bf6667fbd0ac7_id_rsa.pub follow=False checksum=f21d6bceda5d43e7671c6bab343ac33d3c1e3c12 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:31 np0005463148.novalocal sshd-session[2272]: Connection closed by 167.71.248.239 port 42254
Sep 30 16:57:32 np0005463148.novalocal python3[2296]: ansible-ping Invoked with data=pong
Sep 30 16:57:33 np0005463148.novalocal python3[2320]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 16:57:35 np0005463148.novalocal python3[2378]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Sep 30 16:57:36 np0005463148.novalocal python3[2410]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:36 np0005463148.novalocal python3[2434]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:37 np0005463148.novalocal python3[2458]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:37 np0005463148.novalocal python3[2482]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:37 np0005463148.novalocal python3[2506]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:38 np0005463148.novalocal python3[2530]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:39 np0005463148.novalocal sudo[2554]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhknkfwbuheqwaznffgynwisfwvrbqjr ; /usr/bin/python3'
Sep 30 16:57:39 np0005463148.novalocal sudo[2554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:57:39 np0005463148.novalocal python3[2556]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:39 np0005463148.novalocal sudo[2554]: pam_unix(sudo:session): session closed for user root
Sep 30 16:57:40 np0005463148.novalocal sudo[2632]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpwfykgtmizdamiwlmynyskwegxvdwqp ; /usr/bin/python3'
Sep 30 16:57:40 np0005463148.novalocal sudo[2632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:57:40 np0005463148.novalocal python3[2634]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 16:57:40 np0005463148.novalocal sudo[2632]: pam_unix(sudo:session): session closed for user root
Sep 30 16:57:41 np0005463148.novalocal sudo[2705]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyidepizzvtodhfhvljreewmhcachjfu ; /usr/bin/python3'
Sep 30 16:57:41 np0005463148.novalocal sudo[2705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:57:41 np0005463148.novalocal python3[2707]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759251460.0619483-28-254280195982109/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:41 np0005463148.novalocal sudo[2705]: pam_unix(sudo:session): session closed for user root
Sep 30 16:57:41 np0005463148.novalocal python3[2755]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:42 np0005463148.novalocal python3[2779]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:42 np0005463148.novalocal python3[2803]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:42 np0005463148.novalocal python3[2827]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:42 np0005463148.novalocal python3[2851]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:43 np0005463148.novalocal python3[2875]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:43 np0005463148.novalocal python3[2899]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:43 np0005463148.novalocal python3[2923]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:44 np0005463148.novalocal python3[2947]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:44 np0005463148.novalocal python3[2971]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:44 np0005463148.novalocal python3[2995]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:44 np0005463148.novalocal python3[3019]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:45 np0005463148.novalocal python3[3043]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:45 np0005463148.novalocal python3[3067]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:45 np0005463148.novalocal python3[3091]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:45 np0005463148.novalocal python3[3115]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:46 np0005463148.novalocal python3[3139]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:46 np0005463148.novalocal python3[3163]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:46 np0005463148.novalocal python3[3187]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:46 np0005463148.novalocal python3[3211]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:47 np0005463148.novalocal python3[3235]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:47 np0005463148.novalocal python3[3259]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:47 np0005463148.novalocal python3[3283]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:47 np0005463148.novalocal python3[3307]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:48 np0005463148.novalocal python3[3331]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:48 np0005463148.novalocal python3[3355]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 16:57:50 np0005463148.novalocal sshd[1007]: Timeout before authentication for connection from 120.48.39.224 to 38.102.83.102, pid = 1613
Sep 30 16:57:51 np0005463148.novalocal sudo[3379]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qniezuhroqqyzalzjohiewygukccsspy ; /usr/bin/python3'
Sep 30 16:57:51 np0005463148.novalocal sudo[3379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:57:51 np0005463148.novalocal python3[3381]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Sep 30 16:57:51 np0005463148.novalocal systemd[1]: Starting Time & Date Service...
Sep 30 16:57:51 np0005463148.novalocal systemd[1]: Started Time & Date Service.
Sep 30 16:57:51 np0005463148.novalocal systemd-timedated[3383]: Changed time zone to 'UTC' (UTC).
Sep 30 16:57:52 np0005463148.novalocal sudo[3379]: pam_unix(sudo:session): session closed for user root
Sep 30 16:57:52 np0005463148.novalocal sudo[3410]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhllxjnsdrrrlyokfstmwcqcethnmvxc ; /usr/bin/python3'
Sep 30 16:57:52 np0005463148.novalocal sudo[3410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:57:52 np0005463148.novalocal python3[3412]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:52 np0005463148.novalocal sudo[3410]: pam_unix(sudo:session): session closed for user root
Sep 30 16:57:52 np0005463148.novalocal python3[3488]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 16:57:53 np0005463148.novalocal python3[3559]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759251472.5559256-203-242008161358326/source _original_basename=tmpzrub527p follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:53 np0005463148.novalocal python3[3659]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 16:57:53 np0005463148.novalocal python3[3730]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759251473.420883-243-88014781993517/source _original_basename=tmpybl370vx follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:54 np0005463148.novalocal sudo[3830]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-momshtvfqmuzjcyeyouqoiungwqtdsar ; /usr/bin/python3'
Sep 30 16:57:54 np0005463148.novalocal sudo[3830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:57:54 np0005463148.novalocal python3[3832]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 16:57:54 np0005463148.novalocal sudo[3830]: pam_unix(sudo:session): session closed for user root
Sep 30 16:57:55 np0005463148.novalocal sudo[3903]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vltjrswfivnzlaoocnurtejktsuocnen ; /usr/bin/python3'
Sep 30 16:57:55 np0005463148.novalocal sudo[3903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:57:55 np0005463148.novalocal python3[3905]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759251474.5811327-307-186407928040262/source _original_basename=tmpase9ks7r follow=False checksum=e56bb1e67423c4fe1cc5cef0e29768a51facf90f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:55 np0005463148.novalocal sudo[3903]: pam_unix(sudo:session): session closed for user root
Sep 30 16:57:55 np0005463148.novalocal python3[3953]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 16:57:55 np0005463148.novalocal python3[3979]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 16:57:56 np0005463148.novalocal sudo[4057]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qocmaeudetxkqzpengylchdapsgkjjgm ; /usr/bin/python3'
Sep 30 16:57:56 np0005463148.novalocal sudo[4057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:57:56 np0005463148.novalocal python3[4059]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 16:57:56 np0005463148.novalocal sudo[4057]: pam_unix(sudo:session): session closed for user root
Sep 30 16:57:56 np0005463148.novalocal sudo[4130]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obuufvbgftyoqgyihleulbrgpflnhzxm ; /usr/bin/python3'
Sep 30 16:57:56 np0005463148.novalocal sudo[4130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:57:56 np0005463148.novalocal python3[4132]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759251476.2138743-363-206050816024297/source _original_basename=tmpyys2moca follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:57:56 np0005463148.novalocal sudo[4130]: pam_unix(sudo:session): session closed for user root
Sep 30 16:57:57 np0005463148.novalocal sudo[4181]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgpuhcvlqazrdvwfhgacmyzwtttoiteh ; /usr/bin/python3'
Sep 30 16:57:57 np0005463148.novalocal sudo[4181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:57:57 np0005463148.novalocal python3[4183]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-e63c-10db-00000000001e-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 16:57:57 np0005463148.novalocal sudo[4181]: pam_unix(sudo:session): session closed for user root
Sep 30 16:57:58 np0005463148.novalocal python3[4211]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-e63c-10db-00000000001f-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Sep 30 16:57:59 np0005463148.novalocal python3[4239]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:58:16 np0005463148.novalocal sudo[4263]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snduhaikpmffeipbfoctbeecarcahwtc ; /usr/bin/python3'
Sep 30 16:58:16 np0005463148.novalocal sudo[4263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:58:17 np0005463148.novalocal python3[4265]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:58:17 np0005463148.novalocal sudo[4263]: pam_unix(sudo:session): session closed for user root
Sep 30 16:58:22 np0005463148.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 16:58:22 np0005463148.novalocal sshd[1007]: Timeout before authentication for connection from 120.48.39.224 to 38.102.83.102, pid = 1614
Sep 30 16:59:17 np0005463148.novalocal sshd-session[1634]: Received disconnect from 38.102.83.114 port 43012:11: disconnected by user
Sep 30 16:59:17 np0005463148.novalocal sshd-session[1634]: Disconnected from user zuul 38.102.83.114 port 43012
Sep 30 16:59:17 np0005463148.novalocal sshd-session[1618]: pam_unix(sshd:session): session closed for user zuul
Sep 30 16:59:17 np0005463148.novalocal systemd-logind[789]: Session 1 logged out. Waiting for processes to exit.
Sep 30 16:59:20 np0005463148.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 30 16:59:20 np0005463148.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Sep 30 16:59:20 np0005463148.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Sep 30 16:59:20 np0005463148.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Sep 30 16:59:20 np0005463148.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Sep 30 16:59:20 np0005463148.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Sep 30 16:59:20 np0005463148.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Sep 30 16:59:20 np0005463148.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Sep 30 16:59:20 np0005463148.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Sep 30 16:59:20 np0005463148.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Sep 30 16:59:20 np0005463148.novalocal NetworkManager[859]: <info>  [1759251560.2598] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 16:59:20 np0005463148.novalocal systemd-udevd[4269]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 16:59:20 np0005463148.novalocal NetworkManager[859]: <info>  [1759251560.2738] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 16:59:20 np0005463148.novalocal NetworkManager[859]: <info>  [1759251560.2768] settings: (eth1): created default wired connection 'Wired connection 1'
Sep 30 16:59:20 np0005463148.novalocal NetworkManager[859]: <info>  [1759251560.2775] device (eth1): carrier: link connected
Sep 30 16:59:20 np0005463148.novalocal NetworkManager[859]: <info>  [1759251560.2778] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Sep 30 16:59:20 np0005463148.novalocal NetworkManager[859]: <info>  [1759251560.2785] policy: auto-activating connection 'Wired connection 1' (673ac1e6-4892-3ac0-858b-84293dcaf668)
Sep 30 16:59:20 np0005463148.novalocal NetworkManager[859]: <info>  [1759251560.2790] device (eth1): Activation: starting connection 'Wired connection 1' (673ac1e6-4892-3ac0-858b-84293dcaf668)
Sep 30 16:59:20 np0005463148.novalocal NetworkManager[859]: <info>  [1759251560.2791] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 16:59:20 np0005463148.novalocal NetworkManager[859]: <info>  [1759251560.2796] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 16:59:20 np0005463148.novalocal NetworkManager[859]: <info>  [1759251560.2801] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 16:59:20 np0005463148.novalocal NetworkManager[859]: <info>  [1759251560.2806] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 16:59:20 np0005463148.novalocal systemd[1622]: Starting Mark boot as successful...
Sep 30 16:59:20 np0005463148.novalocal systemd[1622]: Finished Mark boot as successful.
Sep 30 16:59:21 np0005463148.novalocal sshd-session[4273]: Accepted publickey for zuul from 38.102.83.114 port 35632 ssh2: RSA SHA256:DFNImqpR4L6Frzap1o3GpslEX6xER8N06/GWUjaeSng
Sep 30 16:59:21 np0005463148.novalocal systemd-logind[789]: New session 3 of user zuul.
Sep 30 16:59:21 np0005463148.novalocal systemd[1]: Started Session 3 of User zuul.
Sep 30 16:59:21 np0005463148.novalocal sshd-session[4273]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 16:59:21 np0005463148.novalocal python3[4300]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-574d-0424-000000000179-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 16:59:31 np0005463148.novalocal sudo[4379]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykmlithwshffkjjkwvfvikrafcakpqwl ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 16:59:31 np0005463148.novalocal sudo[4379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:59:31 np0005463148.novalocal python3[4381]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 16:59:31 np0005463148.novalocal sudo[4379]: pam_unix(sudo:session): session closed for user root
Sep 30 16:59:31 np0005463148.novalocal sudo[4452]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdodmthefbxawywyotyxcwuolyarwbsx ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 16:59:31 np0005463148.novalocal sudo[4452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:59:31 np0005463148.novalocal python3[4454]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759251571.1924138-154-192111592804263/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=7cab70fed0ea98f2f78cd90a65595e1f8365873a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 16:59:31 np0005463148.novalocal sudo[4452]: pam_unix(sudo:session): session closed for user root
Sep 30 16:59:32 np0005463148.novalocal sudo[4502]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cglzfozwchcxsiydniormgopwoenghtv ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 16:59:32 np0005463148.novalocal sudo[4502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 16:59:32 np0005463148.novalocal python3[4504]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: Stopped Network Manager Wait Online.
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: Stopping Network Manager Wait Online...
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: Stopping Network Manager...
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[859]: <info>  [1759251572.4283] caught SIGTERM, shutting down normally.
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[859]: <info>  [1759251572.4301] dhcp4 (eth0): canceled DHCP transaction
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[859]: <info>  [1759251572.4302] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[859]: <info>  [1759251572.4302] dhcp4 (eth0): state changed no lease
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[859]: <info>  [1759251572.4307] manager: NetworkManager state is now CONNECTING
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[859]: <info>  [1759251572.4468] dhcp4 (eth1): canceled DHCP transaction
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[859]: <info>  [1759251572.4469] dhcp4 (eth1): state changed no lease
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[859]: <info>  [1759251572.4531] exiting (success)
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: Stopped Network Manager.
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: NetworkManager.service: Consumed 59.702s CPU time, 9.9M memory peak.
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: Starting Network Manager...
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.4995] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:cf2a7137-0e0f-4f1a-866e-63b8011fce6c)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.4998] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5054] manager[0x55fdb8e4f070]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: Starting Hostname Service...
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: Started Hostname Service.
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5800] hostname: hostname: using hostnamed
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5801] hostname: static hostname changed from (none) to "np0005463148.novalocal"
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5806] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5810] manager[0x55fdb8e4f070]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5811] manager[0x55fdb8e4f070]: rfkill: WWAN hardware radio set enabled
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5846] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5846] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5847] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5847] manager: Networking is enabled by state file
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5849] settings: Loaded settings plugin: keyfile (internal)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5853] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5880] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5890] dhcp: init: Using DHCP client 'internal'
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5894] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5900] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5906] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5915] device (lo): Activation: starting connection 'lo' (9129f00f-203c-42c0-b87c-17b7d284cfa5)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5923] device (eth0): carrier: link connected
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5928] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5934] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5935] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5943] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5951] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5957] device (eth1): carrier: link connected
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5962] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5967] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (673ac1e6-4892-3ac0-858b-84293dcaf668) (indicated)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5967] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5973] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5981] device (eth1): Activation: starting connection 'Wired connection 1' (673ac1e6-4892-3ac0-858b-84293dcaf668)
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: Started Network Manager.
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5987] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.5992] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6000] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6002] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6003] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6006] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6008] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6010] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6013] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6017] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6019] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6025] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6027] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6051] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6052] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6057] device (lo): Activation: successful, device activated.
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6063] dhcp4 (eth0): state changed new lease, address=38.102.83.102
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6070] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 16:59:32 np0005463148.novalocal sudo[4502]: pam_unix(sudo:session): session closed for user root
Sep 30 16:59:32 np0005463148.novalocal systemd[1]: Starting Network Manager Wait Online...
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6225] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6248] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6251] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6254] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6258] device (eth0): Activation: successful, device activated.
Sep 30 16:59:32 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251572.6262] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 16:59:32 np0005463148.novalocal python3[4589]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-574d-0424-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 16:59:42 np0005463148.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 17:00:02 np0005463148.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 17:00:09 np0005463148.novalocal sshd-session[4595]: Invalid user admin from 185.156.73.233 port 57188
Sep 30 17:00:09 np0005463148.novalocal sshd-session[4595]: Connection closed by invalid user admin 185.156.73.233 port 57188 [preauth]
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.3614] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 17:00:17 np0005463148.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 17:00:17 np0005463148.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.3960] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.3962] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.3970] device (eth1): Activation: successful, device activated.
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.3977] manager: startup complete
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.3978] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <warn>  [1759251617.3989] device (eth1): Activation: failed for connection 'Wired connection 1'
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.3994] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Sep 30 17:00:17 np0005463148.novalocal systemd[1]: Finished Network Manager Wait Online.
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.4138] dhcp4 (eth1): canceled DHCP transaction
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.4138] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.4138] dhcp4 (eth1): state changed no lease
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.4152] policy: auto-activating connection 'ci-private-network' (c65e340a-eb16-53f4-aee4-b66cf391b2f7)
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.4155] device (eth1): Activation: starting connection 'ci-private-network' (c65e340a-eb16-53f4-aee4-b66cf391b2f7)
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.4156] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.4157] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.4162] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.4167] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.4254] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.4256] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:00:17 np0005463148.novalocal NetworkManager[4514]: <info>  [1759251617.4262] device (eth1): Activation: successful, device activated.
Sep 30 17:00:27 np0005463148.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 17:00:32 np0005463148.novalocal sshd-session[4276]: Received disconnect from 38.102.83.114 port 35632:11: disconnected by user
Sep 30 17:00:32 np0005463148.novalocal sshd-session[4276]: Disconnected from user zuul 38.102.83.114 port 35632
Sep 30 17:00:32 np0005463148.novalocal sshd-session[4273]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:00:32 np0005463148.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Sep 30 17:00:32 np0005463148.novalocal systemd[1]: session-3.scope: Consumed 1.504s CPU time.
Sep 30 17:00:32 np0005463148.novalocal systemd-logind[789]: Session 3 logged out. Waiting for processes to exit.
Sep 30 17:00:32 np0005463148.novalocal systemd-logind[789]: Removed session 3.
Sep 30 17:00:38 np0005463148.novalocal sshd-session[4621]: Accepted publickey for zuul from 38.102.83.114 port 58444 ssh2: RSA SHA256:DFNImqpR4L6Frzap1o3GpslEX6xER8N06/GWUjaeSng
Sep 30 17:00:38 np0005463148.novalocal systemd-logind[789]: New session 4 of user zuul.
Sep 30 17:00:38 np0005463148.novalocal systemd[1]: Started Session 4 of User zuul.
Sep 30 17:00:38 np0005463148.novalocal sshd-session[4621]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:00:38 np0005463148.novalocal sudo[4700]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikppohbnuufdajuueqojzhqklrnrlutv ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 17:00:38 np0005463148.novalocal sudo[4700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:00:38 np0005463148.novalocal python3[4702]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 17:00:38 np0005463148.novalocal sudo[4700]: pam_unix(sudo:session): session closed for user root
Sep 30 17:00:38 np0005463148.novalocal sudo[4773]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgnlhydyywmknnttikgtatdvrkeckwjb ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 17:00:38 np0005463148.novalocal sudo[4773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:00:38 np0005463148.novalocal python3[4775]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759251638.1815069-320-267611816261049/source _original_basename=tmp_fgyr1hl follow=False checksum=691c6e2d9962c1bc8a148fd3530a3221907fba75 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:00:38 np0005463148.novalocal sudo[4773]: pam_unix(sudo:session): session closed for user root
Sep 30 17:00:41 np0005463148.novalocal sshd-session[4624]: Connection closed by 38.102.83.114 port 58444
Sep 30 17:00:41 np0005463148.novalocal sshd-session[4621]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:00:41 np0005463148.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Sep 30 17:00:41 np0005463148.novalocal systemd-logind[789]: Session 4 logged out. Waiting for processes to exit.
Sep 30 17:00:41 np0005463148.novalocal systemd-logind[789]: Removed session 4.
Sep 30 17:01:01 np0005463148.novalocal CROND[4801]: (root) CMD (run-parts /etc/cron.hourly)
Sep 30 17:01:01 np0005463148.novalocal run-parts[4804]: (/etc/cron.hourly) starting 0anacron
Sep 30 17:01:01 np0005463148.novalocal run-parts[4810]: (/etc/cron.hourly) finished 0anacron
Sep 30 17:01:01 np0005463148.novalocal CROND[4800]: (root) CMDEND (run-parts /etc/cron.hourly)
Sep 30 17:03:00 np0005463148.novalocal systemd[1622]: Created slice User Background Tasks Slice.
Sep 30 17:03:00 np0005463148.novalocal systemd[1622]: Starting Cleanup of User's Temporary Files and Directories...
Sep 30 17:03:00 np0005463148.novalocal systemd[1622]: Finished Cleanup of User's Temporary Files and Directories.
Sep 30 17:04:02 np0005463148.novalocal sshd-session[4813]: Invalid user sol from 45.148.10.240 port 45156
Sep 30 17:04:02 np0005463148.novalocal sshd-session[4813]: Connection closed by invalid user sol 45.148.10.240 port 45156 [preauth]
Sep 30 17:04:51 np0005463148.novalocal sshd-session[4815]: Invalid user pos from 167.71.248.239 port 52240
Sep 30 17:04:51 np0005463148.novalocal sshd-session[4815]: Connection closed by invalid user pos 167.71.248.239 port 52240 [preauth]
Sep 30 17:05:52 np0005463148.novalocal sshd-session[4819]: Invalid user teste from 80.94.95.116 port 54284
Sep 30 17:05:52 np0005463148.novalocal sshd-session[4819]: Connection closed by invalid user teste 80.94.95.116 port 54284 [preauth]
Sep 30 17:06:13 np0005463148.novalocal sshd-session[4823]: Accepted publickey for zuul from 38.102.83.114 port 48798 ssh2: RSA SHA256:DFNImqpR4L6Frzap1o3GpslEX6xER8N06/GWUjaeSng
Sep 30 17:06:13 np0005463148.novalocal systemd-logind[789]: New session 5 of user zuul.
Sep 30 17:06:13 np0005463148.novalocal systemd[1]: Started Session 5 of User zuul.
Sep 30 17:06:13 np0005463148.novalocal sshd-session[4823]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:06:13 np0005463148.novalocal sudo[4850]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuzpmjqljemkrfgfdxlaqwibxtnhbmzq ; /usr/bin/python3'
Sep 30 17:06:13 np0005463148.novalocal sudo[4850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:06:14 np0005463148.novalocal python3[4852]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-fa1f-9b67-000000001cf3-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:06:14 np0005463148.novalocal sudo[4850]: pam_unix(sudo:session): session closed for user root
Sep 30 17:06:14 np0005463148.novalocal sudo[4878]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmhttrwanjyjazvawdeusmrtwgsoghzd ; /usr/bin/python3'
Sep 30 17:06:14 np0005463148.novalocal sudo[4878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:06:14 np0005463148.novalocal python3[4880]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:06:14 np0005463148.novalocal sudo[4878]: pam_unix(sudo:session): session closed for user root
Sep 30 17:06:14 np0005463148.novalocal sudo[4904]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxoxkeldnicdqtrtwputvlbqbzzoljsu ; /usr/bin/python3'
Sep 30 17:06:14 np0005463148.novalocal sudo[4904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:06:14 np0005463148.novalocal python3[4907]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:06:14 np0005463148.novalocal sudo[4904]: pam_unix(sudo:session): session closed for user root
Sep 30 17:06:14 np0005463148.novalocal sudo[4931]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csapijspcsakhgktktiokumhcxskzhvt ; /usr/bin/python3'
Sep 30 17:06:14 np0005463148.novalocal sudo[4931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:06:14 np0005463148.novalocal python3[4933]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:06:14 np0005463148.novalocal sudo[4931]: pam_unix(sudo:session): session closed for user root
Sep 30 17:06:15 np0005463148.novalocal sudo[4957]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdgopxfecfejzllkxnaqohwjvmsiryna ; /usr/bin/python3'
Sep 30 17:06:15 np0005463148.novalocal sudo[4957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:06:15 np0005463148.novalocal python3[4959]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:06:15 np0005463148.novalocal sudo[4957]: pam_unix(sudo:session): session closed for user root
Sep 30 17:06:15 np0005463148.novalocal sudo[4983]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mssargjujvwtbijzlfcqhdpuglettwnm ; /usr/bin/python3'
Sep 30 17:06:15 np0005463148.novalocal sudo[4983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:06:15 np0005463148.novalocal python3[4985]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:06:15 np0005463148.novalocal python3[4985]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Sep 30 17:06:15 np0005463148.novalocal sudo[4983]: pam_unix(sudo:session): session closed for user root
Sep 30 17:06:16 np0005463148.novalocal sudo[5009]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvklgwpkxtdmtewdcxkzdxacmubsjzzh ; /usr/bin/python3'
Sep 30 17:06:16 np0005463148.novalocal sudo[5009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:06:16 np0005463148.novalocal python3[5011]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 17:06:16 np0005463148.novalocal systemd[1]: Reloading.
Sep 30 17:06:16 np0005463148.novalocal systemd-rc-local-generator[5031]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:06:16 np0005463148.novalocal sudo[5009]: pam_unix(sudo:session): session closed for user root
Sep 30 17:06:18 np0005463148.novalocal sudo[5065]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpfpkevhrpevozprqelqhabikyojmrsh ; /usr/bin/python3'
Sep 30 17:06:18 np0005463148.novalocal sudo[5065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:06:18 np0005463148.novalocal python3[5067]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Sep 30 17:06:18 np0005463148.novalocal sudo[5065]: pam_unix(sudo:session): session closed for user root
Sep 30 17:06:18 np0005463148.novalocal sudo[5091]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udhwoanmvdkmzgrcdnzwdczigbrxppyj ; /usr/bin/python3'
Sep 30 17:06:18 np0005463148.novalocal sudo[5091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:06:18 np0005463148.novalocal python3[5093]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:06:18 np0005463148.novalocal sudo[5091]: pam_unix(sudo:session): session closed for user root
Sep 30 17:06:19 np0005463148.novalocal sudo[5119]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzvnmfkpacczdpkkhqorhbpzjafciaph ; /usr/bin/python3'
Sep 30 17:06:19 np0005463148.novalocal sudo[5119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:06:19 np0005463148.novalocal python3[5121]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:06:19 np0005463148.novalocal sudo[5119]: pam_unix(sudo:session): session closed for user root
Sep 30 17:06:19 np0005463148.novalocal sudo[5147]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zepcmedtppogrwerimvvonpiqdrugkan ; /usr/bin/python3'
Sep 30 17:06:19 np0005463148.novalocal sudo[5147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:06:19 np0005463148.novalocal python3[5149]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:06:19 np0005463148.novalocal sudo[5147]: pam_unix(sudo:session): session closed for user root
Sep 30 17:06:19 np0005463148.novalocal sudo[5175]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqhajoksghzqkcpfznlsdhgbkiudumbg ; /usr/bin/python3'
Sep 30 17:06:19 np0005463148.novalocal sudo[5175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:06:19 np0005463148.novalocal python3[5177]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:06:19 np0005463148.novalocal sudo[5175]: pam_unix(sudo:session): session closed for user root
Sep 30 17:06:20 np0005463148.novalocal python3[5204]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-fa1f-9b67-000000001cf9-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:06:20 np0005463148.novalocal python3[5234]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:06:23 np0005463148.novalocal sshd-session[4826]: Connection closed by 38.102.83.114 port 48798
Sep 30 17:06:23 np0005463148.novalocal sshd-session[4823]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:06:23 np0005463148.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Sep 30 17:06:23 np0005463148.novalocal systemd[1]: session-5.scope: Consumed 3.403s CPU time.
Sep 30 17:06:23 np0005463148.novalocal systemd-logind[789]: Session 5 logged out. Waiting for processes to exit.
Sep 30 17:06:23 np0005463148.novalocal systemd-logind[789]: Removed session 5.
Sep 30 17:06:24 np0005463148.novalocal sshd-session[5240]: Accepted publickey for zuul from 38.102.83.114 port 52830 ssh2: RSA SHA256:DFNImqpR4L6Frzap1o3GpslEX6xER8N06/GWUjaeSng
Sep 30 17:06:24 np0005463148.novalocal systemd-logind[789]: New session 6 of user zuul.
Sep 30 17:06:24 np0005463148.novalocal systemd[1]: Started Session 6 of User zuul.
Sep 30 17:06:24 np0005463148.novalocal sshd-session[5240]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:06:24 np0005463148.novalocal sudo[5267]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfoqukpkrruxlixppkpcoqwmefrbrwog ; /usr/bin/python3'
Sep 30 17:06:24 np0005463148.novalocal sudo[5267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:06:25 np0005463148.novalocal python3[5269]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Sep 30 17:06:39 np0005463148.novalocal kernel: SELinux:  Converting 366 SID table entries...
Sep 30 17:06:39 np0005463148.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 17:06:39 np0005463148.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 17:06:39 np0005463148.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 17:06:39 np0005463148.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 17:06:39 np0005463148.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 17:06:39 np0005463148.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 17:06:39 np0005463148.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 17:06:48 np0005463148.novalocal kernel: SELinux:  Converting 366 SID table entries...
Sep 30 17:06:48 np0005463148.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 17:06:48 np0005463148.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 17:06:48 np0005463148.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 17:06:48 np0005463148.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 17:06:48 np0005463148.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 17:06:48 np0005463148.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 17:06:48 np0005463148.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 17:06:57 np0005463148.novalocal kernel: SELinux:  Converting 366 SID table entries...
Sep 30 17:06:57 np0005463148.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 17:06:57 np0005463148.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 17:06:57 np0005463148.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 17:06:57 np0005463148.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 17:06:57 np0005463148.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 17:06:57 np0005463148.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 17:06:57 np0005463148.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 17:06:58 np0005463148.novalocal setsebool[5331]: The virt_use_nfs policy boolean was changed to 1 by root
Sep 30 17:06:58 np0005463148.novalocal setsebool[5331]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Sep 30 17:07:09 np0005463148.novalocal kernel: SELinux:  Converting 369 SID table entries...
Sep 30 17:07:09 np0005463148.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 17:07:09 np0005463148.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 17:07:09 np0005463148.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 17:07:09 np0005463148.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 17:07:09 np0005463148.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 17:07:09 np0005463148.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 17:07:09 np0005463148.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 17:07:23 np0005463148.novalocal sshd[1007]: Timeout before authentication for connection from 120.48.39.224 to 38.102.83.102, pid = 4817
Sep 30 17:07:27 np0005463148.novalocal dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Sep 30 17:07:27 np0005463148.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 17:07:27 np0005463148.novalocal systemd[1]: Starting man-db-cache-update.service...
Sep 30 17:07:27 np0005463148.novalocal systemd[1]: Reloading.
Sep 30 17:07:27 np0005463148.novalocal systemd-rc-local-generator[6093]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:07:27 np0005463148.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 17:07:28 np0005463148.novalocal systemd[1]: Starting PackageKit Daemon...
Sep 30 17:07:28 np0005463148.novalocal PackageKit[6767]: daemon start
Sep 30 17:07:28 np0005463148.novalocal systemd[1]: Starting Authorization Manager...
Sep 30 17:07:28 np0005463148.novalocal polkitd[6874]: Started polkitd version 0.117
Sep 30 17:07:28 np0005463148.novalocal polkitd[6874]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 17:07:28 np0005463148.novalocal polkitd[6874]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 17:07:28 np0005463148.novalocal polkitd[6874]: Finished loading, compiling and executing 3 rules
Sep 30 17:07:28 np0005463148.novalocal systemd[1]: Started Authorization Manager.
Sep 30 17:07:28 np0005463148.novalocal polkitd[6874]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Sep 30 17:07:28 np0005463148.novalocal systemd[1]: Started PackageKit Daemon.
Sep 30 17:07:28 np0005463148.novalocal sudo[5267]: pam_unix(sudo:session): session closed for user root
Sep 30 17:07:31 np0005463148.novalocal sshd-session[6047]: Connection closed by 120.48.39.224 port 44676 [preauth]
Sep 30 17:07:40 np0005463148.novalocal python3[13596]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163efc-24cc-d4b9-f2f5-00000000000b-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:07:41 np0005463148.novalocal kernel: evm: overlay not supported
Sep 30 17:07:41 np0005463148.novalocal systemd[1622]: Starting D-Bus User Message Bus...
Sep 30 17:07:41 np0005463148.novalocal dbus-broker-launch[14027]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Sep 30 17:07:41 np0005463148.novalocal dbus-broker-launch[14027]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Sep 30 17:07:41 np0005463148.novalocal systemd[1622]: Started D-Bus User Message Bus.
Sep 30 17:07:41 np0005463148.novalocal dbus-broker-lau[14027]: Ready
Sep 30 17:07:41 np0005463148.novalocal systemd[1622]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Sep 30 17:07:41 np0005463148.novalocal systemd[1622]: Created slice Slice /user.
Sep 30 17:07:41 np0005463148.novalocal systemd[1622]: podman-13956.scope: unit configures an IP firewall, but not running as root.
Sep 30 17:07:41 np0005463148.novalocal systemd[1622]: (This warning is only shown for the first unit using IP firewalling.)
Sep 30 17:07:41 np0005463148.novalocal systemd[1622]: Started podman-13956.scope.
Sep 30 17:07:41 np0005463148.novalocal systemd[1622]: Started podman-pause-ebb60712.scope.
Sep 30 17:07:42 np0005463148.novalocal sudo[14527]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbfyafqivsrdsfnxseknaaluddfeslww ; /usr/bin/python3'
Sep 30 17:07:42 np0005463148.novalocal sudo[14527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:07:42 np0005463148.novalocal python3[14535]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.129.56.221:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.129.56.221:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:07:42 np0005463148.novalocal sudo[14527]: pam_unix(sudo:session): session closed for user root
Sep 30 17:07:43 np0005463148.novalocal sshd-session[5243]: Connection closed by 38.102.83.114 port 52830
Sep 30 17:07:43 np0005463148.novalocal sshd-session[5240]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:07:43 np0005463148.novalocal systemd-logind[789]: Session 6 logged out. Waiting for processes to exit.
Sep 30 17:07:43 np0005463148.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Sep 30 17:07:43 np0005463148.novalocal systemd[1]: session-6.scope: Consumed 58.734s CPU time.
Sep 30 17:07:43 np0005463148.novalocal systemd-logind[789]: Removed session 6.
Sep 30 17:07:50 np0005463148.novalocal sshd[1007]: drop connection #3 from [120.48.39.224]:35692 on [38.102.83.102]:22 penalty: connections without attempting authentication
Sep 30 17:07:52 np0005463148.novalocal sshd[1007]: Timeout before authentication for connection from 120.48.39.224 to 38.102.83.102, pid = 4818
Sep 30 17:08:02 np0005463148.novalocal sshd-session[21294]: Connection closed by 38.102.83.36 port 48182 [preauth]
Sep 30 17:08:02 np0005463148.novalocal sshd-session[21301]: Unable to negotiate with 38.102.83.36 port 48188: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Sep 30 17:08:02 np0005463148.novalocal sshd-session[21300]: Unable to negotiate with 38.102.83.36 port 48194: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Sep 30 17:08:02 np0005463148.novalocal sshd-session[21298]: Unable to negotiate with 38.102.83.36 port 48198: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Sep 30 17:08:02 np0005463148.novalocal sshd-session[21296]: Connection closed by 38.102.83.36 port 48180 [preauth]
Sep 30 17:08:05 np0005463148.novalocal irqbalance[784]: Cannot change IRQ 27 affinity: Operation not permitted
Sep 30 17:08:05 np0005463148.novalocal irqbalance[784]: IRQ 27 affinity is now unmanaged
Sep 30 17:08:06 np0005463148.novalocal sshd-session[22803]: Accepted publickey for zuul from 38.102.83.114 port 49806 ssh2: RSA SHA256:DFNImqpR4L6Frzap1o3GpslEX6xER8N06/GWUjaeSng
Sep 30 17:08:06 np0005463148.novalocal systemd-logind[789]: New session 7 of user zuul.
Sep 30 17:08:06 np0005463148.novalocal systemd[1]: Started Session 7 of User zuul.
Sep 30 17:08:06 np0005463148.novalocal sshd-session[22803]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:08:07 np0005463148.novalocal python3[22890]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLdfvsAXcCq7Fq0F3Kitiz5SuHbqf8SH0K2JwdxHA7U8WZUl9eCqhG5JROBTXolfOebEF70oH6VJjv6QTxswAaY= zuul@np0005463146.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 17:08:07 np0005463148.novalocal sudo[23060]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czfaopugdhiqszqvvsgbbetiasqjmljf ; /usr/bin/python3'
Sep 30 17:08:07 np0005463148.novalocal sudo[23060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:08:07 np0005463148.novalocal python3[23069]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLdfvsAXcCq7Fq0F3Kitiz5SuHbqf8SH0K2JwdxHA7U8WZUl9eCqhG5JROBTXolfOebEF70oH6VJjv6QTxswAaY= zuul@np0005463146.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 17:08:07 np0005463148.novalocal sudo[23060]: pam_unix(sudo:session): session closed for user root
Sep 30 17:08:08 np0005463148.novalocal sudo[23426]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-govjlrwkbgamlwzgczmogucadgskksvu ; /usr/bin/python3'
Sep 30 17:08:08 np0005463148.novalocal sudo[23426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:08:08 np0005463148.novalocal python3[23435]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005463148.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Sep 30 17:08:08 np0005463148.novalocal useradd[23522]: new group: name=cloud-admin, GID=1002
Sep 30 17:08:08 np0005463148.novalocal useradd[23522]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Sep 30 17:08:08 np0005463148.novalocal sudo[23426]: pam_unix(sudo:session): session closed for user root
Sep 30 17:08:09 np0005463148.novalocal sudo[23961]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apiopgyslfplbmkixgpwksoemapsdngh ; /usr/bin/python3'
Sep 30 17:08:09 np0005463148.novalocal sudo[23961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:08:09 np0005463148.novalocal python3[23968]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLdfvsAXcCq7Fq0F3Kitiz5SuHbqf8SH0K2JwdxHA7U8WZUl9eCqhG5JROBTXolfOebEF70oH6VJjv6QTxswAaY= zuul@np0005463146.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 17:08:09 np0005463148.novalocal sudo[23961]: pam_unix(sudo:session): session closed for user root
Sep 30 17:08:09 np0005463148.novalocal sudo[24221]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrapfqtexpfucbfskituppyluqtlwgap ; /usr/bin/python3'
Sep 30 17:08:09 np0005463148.novalocal sudo[24221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:08:09 np0005463148.novalocal python3[24231]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 17:08:09 np0005463148.novalocal sudo[24221]: pam_unix(sudo:session): session closed for user root
Sep 30 17:08:10 np0005463148.novalocal sudo[24474]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owrxuphzrymyzpcdimgqsegomynbvfjd ; /usr/bin/python3'
Sep 30 17:08:10 np0005463148.novalocal sudo[24474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:08:10 np0005463148.novalocal python3[24481]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759252089.62463-153-4506074423318/source _original_basename=tmp330rnaoq follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:08:10 np0005463148.novalocal sudo[24474]: pam_unix(sudo:session): session closed for user root
Sep 30 17:08:11 np0005463148.novalocal sudo[24778]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjckvmpxsohhmpkqowsqwtyklxvwqhmh ; /usr/bin/python3'
Sep 30 17:08:11 np0005463148.novalocal sudo[24778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:08:11 np0005463148.novalocal python3[24790]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Sep 30 17:08:11 np0005463148.novalocal systemd[1]: Starting Hostname Service...
Sep 30 17:08:11 np0005463148.novalocal systemd[1]: Started Hostname Service.
Sep 30 17:08:11 np0005463148.novalocal systemd-hostnamed[24882]: Changed pretty hostname to 'compute-1'
Sep 30 17:08:11 compute-1 systemd-hostnamed[24882]: Hostname set to <compute-1> (static)
Sep 30 17:08:11 compute-1 NetworkManager[4514]: <info>  [1759252091.5257] hostname: static hostname changed from "np0005463148.novalocal" to "compute-1"
Sep 30 17:08:11 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 17:08:11 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 17:08:11 compute-1 sudo[24778]: pam_unix(sudo:session): session closed for user root
Sep 30 17:08:12 compute-1 sshd-session[22842]: Connection closed by 38.102.83.114 port 49806
Sep 30 17:08:12 compute-1 sshd-session[22803]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:08:12 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Sep 30 17:08:12 compute-1 systemd[1]: session-7.scope: Consumed 2.363s CPU time.
Sep 30 17:08:12 compute-1 systemd-logind[789]: Session 7 logged out. Waiting for processes to exit.
Sep 30 17:08:12 compute-1 systemd-logind[789]: Removed session 7.
Sep 30 17:08:18 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 17:08:18 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 17:08:18 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1min 612ms CPU time.
Sep 30 17:08:18 compute-1 systemd[1]: run-r166b8fb6113143b3b46c3511d97fd2f6.service: Deactivated successfully.
Sep 30 17:08:20 compute-1 sshd[1007]: drop connection #2 from [120.48.39.224]:42878 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 17:08:21 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 17:08:23 compute-1 sshd-session[5320]: Connection closed by 120.48.39.224 port 59920 [preauth]
Sep 30 17:08:24 compute-1 sshd[1007]: Timeout before authentication for connection from 120.48.39.224 to 38.102.83.102, pid = 5237
Sep 30 17:08:41 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 17:08:44 compute-1 sshd[1007]: drop connection #0 from [120.48.39.224]:55278 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 17:09:10 compute-1 sshd[1007]: drop connection #0 from [120.48.39.224]:56570 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 17:09:35 compute-1 sshd[1007]: drop connection #0 from [120.48.39.224]:35820 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 17:09:58 compute-1 sshd[1007]: drop connection #0 from [120.48.39.224]:54548 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 17:10:23 compute-1 sshd[1007]: drop connection #0 from [120.48.39.224]:57436 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 17:10:49 compute-1 sshd[1007]: drop connection #0 from [120.48.39.224]:53734 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 17:11:16 compute-1 sshd[1007]: drop connection #0 from [120.48.39.224]:37986 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 17:11:40 compute-1 sshd[1007]: drop connection #0 from [120.48.39.224]:37480 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 17:12:05 compute-1 sshd-session[27125]: Invalid user admin from 120.48.39.224 port 38450
Sep 30 17:12:05 compute-1 sshd-session[27125]: Received disconnect from 120.48.39.224 port 38450:11: Bye Bye [preauth]
Sep 30 17:12:05 compute-1 sshd-session[27125]: Disconnected from invalid user admin 120.48.39.224 port 38450 [preauth]
Sep 30 17:12:20 compute-1 sshd-session[27127]: Invalid user ubuntu from 167.71.248.239 port 46958
Sep 30 17:12:20 compute-1 sshd-session[27127]: Connection closed by invalid user ubuntu 167.71.248.239 port 46958 [preauth]
Sep 30 17:12:29 compute-1 sshd-session[27129]: Invalid user testuser from 120.48.39.224 port 33634
Sep 30 17:12:29 compute-1 sshd-session[27129]: Received disconnect from 120.48.39.224 port 33634:11: Bye Bye [preauth]
Sep 30 17:12:29 compute-1 sshd-session[27129]: Disconnected from invalid user testuser 120.48.39.224 port 33634 [preauth]
Sep 30 17:12:30 compute-1 sshd-session[27131]: Accepted publickey for zuul from 38.102.83.36 port 45286 ssh2: RSA SHA256:DFNImqpR4L6Frzap1o3GpslEX6xER8N06/GWUjaeSng
Sep 30 17:12:30 compute-1 systemd-logind[789]: New session 8 of user zuul.
Sep 30 17:12:30 compute-1 systemd[1]: Started Session 8 of User zuul.
Sep 30 17:12:30 compute-1 sshd-session[27131]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:12:30 compute-1 python3[27207]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:12:32 compute-1 sudo[27321]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdyqhxvrmlwaasjmzeodihfrtqrjnpjo ; /usr/bin/python3'
Sep 30 17:12:32 compute-1 sudo[27321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:32 compute-1 python3[27323]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 17:12:32 compute-1 sudo[27321]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:32 compute-1 sudo[27394]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzokrlfjdsqpqmhpvwyifsussrjcnble ; /usr/bin/python3'
Sep 30 17:12:32 compute-1 sudo[27394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:32 compute-1 python3[27396]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759252351.9940293-30443-86743733294359/source mode=0755 _original_basename=delorean.repo follow=False checksum=6543f0d49313391d10c7b4b619155c98ddf76b9b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:12:32 compute-1 sudo[27394]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:32 compute-1 sudo[27420]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngqnztuaypwuvzzggumdiiglghjbwlyt ; /usr/bin/python3'
Sep 30 17:12:32 compute-1 sudo[27420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:32 compute-1 python3[27422]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 17:12:32 compute-1 sudo[27420]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:33 compute-1 sudo[27493]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtuipewszfwkodkcpslrosgbfmoxjmcb ; /usr/bin/python3'
Sep 30 17:12:33 compute-1 sudo[27493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:33 compute-1 python3[27495]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759252351.9940293-30443-86743733294359/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=c22157e85d05af7ffbafa054f80958446d397a41 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:12:33 compute-1 sudo[27493]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:33 compute-1 PackageKit[6767]: daemon quit
Sep 30 17:12:33 compute-1 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 17:12:33 compute-1 sudo[27519]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uilmopaucutzbsmskcimtwvfpgwmzhoa ; /usr/bin/python3'
Sep 30 17:12:33 compute-1 sudo[27519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:33 compute-1 python3[27521]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 17:12:33 compute-1 sudo[27519]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:33 compute-1 sudo[27592]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqictozwplxsgrcbeiqsuvgcszkfhaca ; /usr/bin/python3'
Sep 30 17:12:33 compute-1 sudo[27592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:33 compute-1 python3[27594]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759252351.9940293-30443-86743733294359/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:12:33 compute-1 sudo[27592]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:33 compute-1 sudo[27618]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujhnqztbuzbketeqwmwvoqcjwckxuoig ; /usr/bin/python3'
Sep 30 17:12:33 compute-1 sudo[27618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:34 compute-1 python3[27620]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 17:12:34 compute-1 sudo[27618]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:34 compute-1 sudo[27691]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vluwqlebhfozzikbqwlxljyafvkfjqmj ; /usr/bin/python3'
Sep 30 17:12:34 compute-1 sudo[27691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:34 compute-1 python3[27693]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759252351.9940293-30443-86743733294359/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:12:34 compute-1 sudo[27691]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:34 compute-1 sudo[27717]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtlesughlmuqpptxksiupiuxfamdtzqi ; /usr/bin/python3'
Sep 30 17:12:34 compute-1 sudo[27717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:34 compute-1 python3[27719]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 17:12:34 compute-1 sudo[27717]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:34 compute-1 sudo[27790]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxzclclyvizkceaayzqtuihajcdezywp ; /usr/bin/python3'
Sep 30 17:12:34 compute-1 sudo[27790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:34 compute-1 python3[27792]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759252351.9940293-30443-86743733294359/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:12:34 compute-1 sudo[27790]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:35 compute-1 sudo[27816]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-steclgaohvwlmbmdunyxqedalzrdcxqj ; /usr/bin/python3'
Sep 30 17:12:35 compute-1 sudo[27816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:35 compute-1 python3[27818]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 17:12:35 compute-1 sudo[27816]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:35 compute-1 sudo[27889]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzpzhhneopxyuyglwtevpilpxaissmrx ; /usr/bin/python3'
Sep 30 17:12:35 compute-1 sudo[27889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:35 compute-1 python3[27891]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759252351.9940293-30443-86743733294359/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:12:35 compute-1 sudo[27889]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:35 compute-1 sudo[27915]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-achqpwiyipnsvhiexvxmsippblsbjwcs ; /usr/bin/python3'
Sep 30 17:12:35 compute-1 sudo[27915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:35 compute-1 python3[27917]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 17:12:35 compute-1 sudo[27915]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:35 compute-1 sudo[27988]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkjfypbiaedyqngtxxrxkrbykwmiwmwt ; /usr/bin/python3'
Sep 30 17:12:35 compute-1 sudo[27988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:36 compute-1 python3[27990]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759252351.9940293-30443-86743733294359/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=039facbb479fa58856d4f56208cb1f104e804408 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:12:36 compute-1 sudo[27988]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:36 compute-1 sudo[28014]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayvnwvykkpknwqlsjbmonbmfxbcejpac ; /usr/bin/python3'
Sep 30 17:12:36 compute-1 sudo[28014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:36 compute-1 python3[28016]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/gating.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 17:12:36 compute-1 sudo[28014]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:36 compute-1 sudo[28087]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vujuyqcjzhugrozhvezajfqmzotapkaw ; /usr/bin/python3'
Sep 30 17:12:36 compute-1 sudo[28087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:12:36 compute-1 python3[28089]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759252351.9940293-30443-86743733294359/source mode=0755 _original_basename=gating.repo follow=False checksum=e31dc74caa36ffb4db145632be353eaf0e546b82 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:12:36 compute-1 sudo[28087]: pam_unix(sudo:session): session closed for user root
Sep 30 17:12:55 compute-1 sshd-session[28114]: Invalid user user from 120.48.39.224 port 51362
Sep 30 17:12:55 compute-1 sshd-session[28114]: Received disconnect from 120.48.39.224 port 51362:11: Bye Bye [preauth]
Sep 30 17:12:55 compute-1 sshd-session[28114]: Disconnected from invalid user user 120.48.39.224 port 51362 [preauth]
Sep 30 17:13:59 compute-1 python3[28139]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:15:32 compute-1 sshd-session[28141]: Invalid user test from 185.156.73.233 port 63822
Sep 30 17:15:32 compute-1 sshd-session[28141]: Connection closed by invalid user test 185.156.73.233 port 63822 [preauth]
Sep 30 17:18:59 compute-1 sshd-session[27134]: Received disconnect from 38.102.83.36 port 45286:11: disconnected by user
Sep 30 17:18:59 compute-1 sshd-session[27134]: Disconnected from user zuul 38.102.83.36 port 45286
Sep 30 17:18:59 compute-1 sshd-session[27131]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:18:59 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Sep 30 17:18:59 compute-1 systemd[1]: session-8.scope: Consumed 5.277s CPU time.
Sep 30 17:18:59 compute-1 systemd-logind[789]: Session 8 logged out. Waiting for processes to exit.
Sep 30 17:18:59 compute-1 systemd-logind[789]: Removed session 8.
Sep 30 17:19:50 compute-1 sshd-session[28145]: Invalid user ps from 167.71.248.239 port 49670
Sep 30 17:19:50 compute-1 sshd-session[28145]: Connection closed by invalid user ps 167.71.248.239 port 49670 [preauth]
Sep 30 17:20:58 compute-1 sshd-session[28147]: Invalid user ubuntu from 45.148.10.240 port 45914
Sep 30 17:20:58 compute-1 sshd-session[28147]: Connection closed by invalid user ubuntu 45.148.10.240 port 45914 [preauth]
Sep 30 17:25:40 compute-1 sshd-session[28151]: Connection closed by authenticating user operator 80.94.95.116 port 36704 [preauth]
Sep 30 17:27:00 compute-1 sshd-session[28154]: Accepted publickey for zuul from 192.168.122.30 port 38104 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:27:00 compute-1 systemd-logind[789]: New session 9 of user zuul.
Sep 30 17:27:00 compute-1 systemd[1]: Started Session 9 of User zuul.
Sep 30 17:27:00 compute-1 sshd-session[28154]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:27:01 compute-1 python3.9[28307]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:27:02 compute-1 sudo[28486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmqfstivntvspjfmcgihrirudaplrmjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253221.841074-45-264743082028108/AnsiballZ_command.py'
Sep 30 17:27:02 compute-1 sudo[28486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:27:02 compute-1 python3.9[28488]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:27:09 compute-1 sudo[28486]: pam_unix(sudo:session): session closed for user root
Sep 30 17:27:10 compute-1 sshd-session[28157]: Connection closed by 192.168.122.30 port 38104
Sep 30 17:27:10 compute-1 sshd-session[28154]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:27:10 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Sep 30 17:27:10 compute-1 systemd[1]: session-9.scope: Consumed 7.656s CPU time.
Sep 30 17:27:10 compute-1 systemd-logind[789]: Session 9 logged out. Waiting for processes to exit.
Sep 30 17:27:10 compute-1 systemd-logind[789]: Removed session 9.
Sep 30 17:27:20 compute-1 sshd-session[28545]: Connection closed by authenticating user root 167.71.248.239 port 58000 [preauth]
Sep 30 17:27:26 compute-1 sshd-session[28547]: Accepted publickey for zuul from 192.168.122.30 port 37660 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:27:26 compute-1 systemd-logind[789]: New session 10 of user zuul.
Sep 30 17:27:26 compute-1 systemd[1]: Started Session 10 of User zuul.
Sep 30 17:27:26 compute-1 sshd-session[28547]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:27:27 compute-1 python3.9[28700]: ansible-ansible.legacy.ping Invoked with data=pong
Sep 30 17:27:28 compute-1 python3.9[28874]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:27:29 compute-1 sudo[29024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpkcckpxxobhkazquwgqwzxgqxxkqoul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253249.1949403-70-80320766118986/AnsiballZ_command.py'
Sep 30 17:27:29 compute-1 sudo[29024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:27:29 compute-1 python3.9[29026]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:27:29 compute-1 sudo[29024]: pam_unix(sudo:session): session closed for user root
Sep 30 17:27:30 compute-1 sudo[29177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exdrgsxvhvzcrksrhohndxhfhbeyeqfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253250.1348443-94-213998032178043/AnsiballZ_stat.py'
Sep 30 17:27:30 compute-1 sudo[29177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:27:30 compute-1 python3.9[29179]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:27:30 compute-1 sudo[29177]: pam_unix(sudo:session): session closed for user root
Sep 30 17:27:31 compute-1 sudo[29329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqjdkebokomyrgfcdzpjdkmhxdjchibi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253250.9457242-110-52601528950731/AnsiballZ_file.py'
Sep 30 17:27:31 compute-1 sudo[29329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:27:31 compute-1 python3.9[29331]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:27:31 compute-1 sudo[29329]: pam_unix(sudo:session): session closed for user root
Sep 30 17:27:32 compute-1 sudo[29481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fchkplinfwqlouqutzangtqgmsmumvbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253251.7918868-126-11886817476291/AnsiballZ_stat.py'
Sep 30 17:27:32 compute-1 sudo[29481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:27:32 compute-1 python3.9[29483]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:27:32 compute-1 sudo[29481]: pam_unix(sudo:session): session closed for user root
Sep 30 17:27:32 compute-1 sudo[29604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfqavuktqkyljzjomltsfbqxmvhnhijb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253251.7918868-126-11886817476291/AnsiballZ_copy.py'
Sep 30 17:27:32 compute-1 sudo[29604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:27:33 compute-1 python3.9[29606]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759253251.7918868-126-11886817476291/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:27:33 compute-1 sudo[29604]: pam_unix(sudo:session): session closed for user root
Sep 30 17:27:33 compute-1 sudo[29756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amqyolshjfeatuhfinvwpdzporfwujzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253253.288859-156-38333720846847/AnsiballZ_setup.py'
Sep 30 17:27:33 compute-1 sudo[29756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:27:33 compute-1 python3.9[29758]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:27:34 compute-1 sudo[29756]: pam_unix(sudo:session): session closed for user root
Sep 30 17:27:34 compute-1 sudo[29912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvfufhfmuuubcsytfpttovqumefxkqis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253254.2254016-172-146649973798192/AnsiballZ_file.py'
Sep 30 17:27:34 compute-1 sudo[29912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:27:34 compute-1 python3.9[29914]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:27:34 compute-1 sudo[29912]: pam_unix(sudo:session): session closed for user root
Sep 30 17:27:35 compute-1 python3.9[30064]: ansible-ansible.builtin.service_facts Invoked
Sep 30 17:27:40 compute-1 python3.9[30319]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:27:41 compute-1 python3.9[30469]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:27:42 compute-1 python3.9[30623]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:27:42 compute-1 sudo[30779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrqyjelxpcjymgxgyhwskinqpvzbqkwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253262.6984208-268-53841127219820/AnsiballZ_setup.py'
Sep 30 17:27:42 compute-1 sudo[30779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:27:43 compute-1 python3.9[30781]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:27:43 compute-1 sudo[30779]: pam_unix(sudo:session): session closed for user root
Sep 30 17:27:43 compute-1 sudo[30863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckeqhrqhkoitgkpahdvpocayigfryxdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253262.6984208-268-53841127219820/AnsiballZ_dnf.py'
Sep 30 17:27:43 compute-1 sudo[30863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:27:44 compute-1 python3.9[30865]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:28:05 compute-1 irqbalance[784]: Cannot change IRQ 26 affinity: Operation not permitted
Sep 30 17:28:05 compute-1 irqbalance[784]: IRQ 26 affinity is now unmanaged
Sep 30 17:28:25 compute-1 systemd[1]: Reloading.
Sep 30 17:28:25 compute-1 systemd-rc-local-generator[31056]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:28:25 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Sep 30 17:28:25 compute-1 systemd[1]: Reloading.
Sep 30 17:28:25 compute-1 systemd-rc-local-generator[31102]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:28:25 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Sep 30 17:28:25 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Sep 30 17:28:25 compute-1 systemd[1]: Reloading.
Sep 30 17:28:26 compute-1 systemd-rc-local-generator[31143]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:28:26 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Sep 30 17:28:26 compute-1 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Sep 30 17:28:26 compute-1 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Sep 30 17:28:26 compute-1 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Sep 30 17:29:00 compute-1 systemd[1]: Starting dnf makecache...
Sep 30 17:29:00 compute-1 dnf[31292]: Repository 'gating-repo' is missing name in configuration, using id.
Sep 30 17:29:00 compute-1 dnf[31292]: Metadata cache refreshed recently.
Sep 30 17:29:00 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Sep 30 17:29:00 compute-1 systemd[1]: Finished dnf makecache.
Sep 30 17:29:27 compute-1 kernel: SELinux:  Converting 2714 SID table entries...
Sep 30 17:29:27 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 17:29:27 compute-1 kernel: SELinux:  policy capability open_perms=1
Sep 30 17:29:27 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 17:29:27 compute-1 kernel: SELinux:  policy capability always_check_network=0
Sep 30 17:29:27 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 17:29:27 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 17:29:27 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 17:29:27 compute-1 sshd-session[31384]: Invalid user sol from 45.148.10.240 port 39918
Sep 30 17:29:27 compute-1 sshd-session[31384]: Connection closed by invalid user sol 45.148.10.240 port 39918 [preauth]
Sep 30 17:29:28 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Sep 30 17:29:28 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 17:29:28 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 17:29:28 compute-1 systemd[1]: Reloading.
Sep 30 17:29:28 compute-1 systemd-rc-local-generator[31481]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:29:28 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 17:29:28 compute-1 systemd[1]: Starting PackageKit Daemon...
Sep 30 17:29:28 compute-1 PackageKit[31697]: daemon start
Sep 30 17:29:28 compute-1 systemd[1]: Started PackageKit Daemon.
Sep 30 17:29:28 compute-1 sudo[30863]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:29 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 17:29:29 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 17:29:29 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.267s CPU time.
Sep 30 17:29:29 compute-1 systemd[1]: run-r88979c90d4b149c692b23e5dee5ce955.service: Deactivated successfully.
Sep 30 17:29:37 compute-1 sudo[32396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckbtmjajcltzpverowkbqpenrqogymou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253376.8525524-292-91595153521252/AnsiballZ_command.py'
Sep 30 17:29:37 compute-1 sudo[32396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:37 compute-1 python3.9[32398]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:29:38 compute-1 sudo[32396]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:39 compute-1 sudo[32677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzbcptejqpsvywqfqbicbjfqalkmdmxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253378.599015-308-76543561002416/AnsiballZ_selinux.py'
Sep 30 17:29:39 compute-1 sudo[32677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:39 compute-1 python3.9[32679]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Sep 30 17:29:39 compute-1 sudo[32677]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:40 compute-1 sudo[32829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzehxrfofjfcqbuznnipvqevkbgrofqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253379.9392653-330-191353274388848/AnsiballZ_command.py'
Sep 30 17:29:40 compute-1 sudo[32829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:40 compute-1 python3.9[32831]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Sep 30 17:29:41 compute-1 sudo[32829]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:41 compute-1 sudo[32982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwwhvscdkqhkcnuttlzulgsazkvuyjlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253381.5891037-346-249970424704354/AnsiballZ_file.py'
Sep 30 17:29:41 compute-1 sudo[32982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:42 compute-1 python3.9[32984]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:29:42 compute-1 sudo[32982]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:44 compute-1 sudo[33134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpzvprlrhuzcwubgefrofmqsogirvhoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253384.1537464-362-1233160105705/AnsiballZ_mount.py'
Sep 30 17:29:44 compute-1 sudo[33134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:44 compute-1 python3.9[33136]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Sep 30 17:29:44 compute-1 sudo[33134]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:45 compute-1 sudo[33286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tohzigluitaddhywzcnnvasqsczuslqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253385.5628488-418-197355126479589/AnsiballZ_file.py'
Sep 30 17:29:45 compute-1 sudo[33286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:47 compute-1 python3.9[33288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:29:47 compute-1 sudo[33286]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:48 compute-1 sudo[33438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkvgotyvbbgxbzpjxfjzkfoyptplcois ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253387.9170218-434-143838116842885/AnsiballZ_stat.py'
Sep 30 17:29:48 compute-1 sudo[33438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:48 compute-1 python3.9[33440]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:29:48 compute-1 sudo[33438]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:48 compute-1 sudo[33561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aybjnwmzblfcnppklqluxmgzxklqlhrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253387.9170218-434-143838116842885/AnsiballZ_copy.py'
Sep 30 17:29:48 compute-1 sudo[33561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:48 compute-1 python3.9[33563]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759253387.9170218-434-143838116842885/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fa443739ff2ff1b18352a001fa075b3190ad3c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:29:48 compute-1 sudo[33561]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:52 compute-1 sudo[33713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uybawjviktiizbrbhpayuxmtbancmutt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253391.951527-488-134721876676731/AnsiballZ_getent.py'
Sep 30 17:29:52 compute-1 sudo[33713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:52 compute-1 python3.9[33715]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Sep 30 17:29:52 compute-1 sudo[33713]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:53 compute-1 sudo[33866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vziwqlqjzgkasbmdptetdrrfkiikjidj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253393.0132732-504-171529340015172/AnsiballZ_group.py'
Sep 30 17:29:53 compute-1 sudo[33866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:53 compute-1 python3.9[33868]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 17:29:53 compute-1 groupadd[33869]: group added to /etc/group: name=qemu, GID=107
Sep 30 17:29:53 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 17:29:53 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 17:29:53 compute-1 groupadd[33869]: group added to /etc/gshadow: name=qemu
Sep 30 17:29:53 compute-1 groupadd[33869]: new group: name=qemu, GID=107
Sep 30 17:29:53 compute-1 sudo[33866]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:54 compute-1 sudo[34025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhgbqpadtlfdypcqcufndqxjcblelxue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253393.995686-520-222122257258512/AnsiballZ_user.py'
Sep 30 17:29:54 compute-1 sudo[34025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:54 compute-1 python3.9[34027]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 17:29:54 compute-1 useradd[34029]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 17:29:54 compute-1 sudo[34025]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:55 compute-1 sudo[34185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbhufdwhpsbxlkupogjqxerpxnkivziw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253395.0891993-536-170724220326597/AnsiballZ_getent.py'
Sep 30 17:29:55 compute-1 sudo[34185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:55 compute-1 python3.9[34187]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Sep 30 17:29:55 compute-1 sudo[34185]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:56 compute-1 sudo[34338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vicosqmbozqtiqotzrvvetcbslmuhhgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253395.750954-552-75125864249006/AnsiballZ_group.py'
Sep 30 17:29:56 compute-1 sudo[34338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:56 compute-1 python3.9[34340]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 17:29:56 compute-1 groupadd[34341]: group added to /etc/group: name=hugetlbfs, GID=42477
Sep 30 17:29:56 compute-1 groupadd[34341]: group added to /etc/gshadow: name=hugetlbfs
Sep 30 17:29:56 compute-1 groupadd[34341]: new group: name=hugetlbfs, GID=42477
Sep 30 17:29:56 compute-1 sudo[34338]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:56 compute-1 sudo[34496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egyrfddzhfoonkabqqyhxprgcbkgsqrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253396.5467093-570-1994371355816/AnsiballZ_file.py'
Sep 30 17:29:56 compute-1 sudo[34496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:57 compute-1 python3.9[34498]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Sep 30 17:29:57 compute-1 sudo[34496]: pam_unix(sudo:session): session closed for user root
Sep 30 17:29:57 compute-1 sudo[34648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnfqwheiqjurtqaebjtwewomovhpshrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253397.427451-592-121713574956008/AnsiballZ_dnf.py'
Sep 30 17:29:57 compute-1 sudo[34648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:29:57 compute-1 python3.9[34650]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:29:59 compute-1 sudo[34648]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:00 compute-1 sudo[34801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxxwziumlgibbvzizagzzikyyqsoqhxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253399.8567677-608-29597359289241/AnsiballZ_file.py'
Sep 30 17:30:00 compute-1 sudo[34801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:00 compute-1 python3.9[34803]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:30:00 compute-1 sudo[34801]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:00 compute-1 sudo[34953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puthzylabepfebndjdqzovfqqecjvgcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253400.5157962-624-246211579460072/AnsiballZ_stat.py'
Sep 30 17:30:00 compute-1 sudo[34953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:01 compute-1 python3.9[34955]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:30:01 compute-1 sudo[34953]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:01 compute-1 sudo[35076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxqpcgydvobiandiylhyjkrvmgvlmgxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253400.5157962-624-246211579460072/AnsiballZ_copy.py'
Sep 30 17:30:01 compute-1 sudo[35076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:01 compute-1 python3.9[35078]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759253400.5157962-624-246211579460072/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:30:01 compute-1 sudo[35076]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:02 compute-1 sudo[35228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xltgfwjkbeyoxohpudphilzjvrylmyet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253401.8195927-654-77287061442135/AnsiballZ_systemd.py'
Sep 30 17:30:02 compute-1 sudo[35228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:02 compute-1 python3.9[35230]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:30:02 compute-1 systemd[1]: Starting Load Kernel Modules...
Sep 30 17:30:02 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 30 17:30:02 compute-1 kernel: Bridge firewalling registered
Sep 30 17:30:02 compute-1 systemd-modules-load[35234]: Inserted module 'br_netfilter'
Sep 30 17:30:02 compute-1 systemd[1]: Finished Load Kernel Modules.
Sep 30 17:30:02 compute-1 sudo[35228]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:03 compute-1 sudo[35387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilssexytiseqljpmbihptqdhxnbevuyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253403.0427148-670-276350211816060/AnsiballZ_stat.py'
Sep 30 17:30:03 compute-1 sudo[35387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:03 compute-1 python3.9[35389]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:30:03 compute-1 sudo[35387]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:03 compute-1 sudo[35510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjaefacckilscaesyaoqqufftaidzqtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253403.0427148-670-276350211816060/AnsiballZ_copy.py'
Sep 30 17:30:03 compute-1 sudo[35510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:04 compute-1 python3.9[35512]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759253403.0427148-670-276350211816060/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:30:04 compute-1 sudo[35510]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:04 compute-1 sudo[35662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaszfcnwazgrqestmzyizunowbfuobhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253404.528917-706-268729822692793/AnsiballZ_dnf.py'
Sep 30 17:30:04 compute-1 sudo[35662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:05 compute-1 python3.9[35664]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:30:08 compute-1 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Sep 30 17:30:08 compute-1 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Sep 30 17:30:08 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 17:30:08 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 17:30:08 compute-1 systemd[1]: Reloading.
Sep 30 17:30:08 compute-1 systemd-rc-local-generator[35723]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:30:08 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 17:30:09 compute-1 sudo[35662]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:11 compute-1 python3.9[38139]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:30:12 compute-1 python3.9[39208]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Sep 30 17:30:12 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 17:30:12 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 17:30:12 compute-1 systemd[1]: man-db-cache-update.service: Consumed 4.792s CPU time.
Sep 30 17:30:12 compute-1 systemd[1]: run-r59f8ab8fc5844b48a4bfc42e91b3f731.service: Deactivated successfully.
Sep 30 17:30:12 compute-1 python3.9[39683]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:30:13 compute-1 sudo[39833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpzypshhxmiwitzipwvuyzhwffisogbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253413.134522-784-26300339537911/AnsiballZ_command.py'
Sep 30 17:30:13 compute-1 sudo[39833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:13 compute-1 python3.9[39835]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:30:13 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Sep 30 17:30:14 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Sep 30 17:30:14 compute-1 sudo[39833]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:14 compute-1 sudo[40206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsogngbatyecesmqbuxcqokmmyijvuwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253414.3435771-802-278100554229984/AnsiballZ_systemd.py'
Sep 30 17:30:14 compute-1 sudo[40206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:14 compute-1 python3.9[40208]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:30:14 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Sep 30 17:30:14 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Sep 30 17:30:14 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Sep 30 17:30:15 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Sep 30 17:30:15 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Sep 30 17:30:15 compute-1 sudo[40206]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:15 compute-1 python3.9[40369]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Sep 30 17:30:18 compute-1 sudo[40519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsctojvpjdifmftibhksowcmnmxfirqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253418.2783606-916-23090190734938/AnsiballZ_systemd.py'
Sep 30 17:30:18 compute-1 sudo[40519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:18 compute-1 python3.9[40521]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:30:18 compute-1 systemd[1]: Reloading.
Sep 30 17:30:19 compute-1 systemd-rc-local-generator[40551]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:30:19 compute-1 sudo[40519]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:19 compute-1 sudo[40708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogfsibpqndrftnweeknkpkgmkyjxeflk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253419.3200183-916-249630588134600/AnsiballZ_systemd.py'
Sep 30 17:30:19 compute-1 sudo[40708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:19 compute-1 python3.9[40710]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:30:20 compute-1 systemd[1]: Reloading.
Sep 30 17:30:20 compute-1 systemd-rc-local-generator[40740]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:30:20 compute-1 sudo[40708]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:20 compute-1 sudo[40897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkslzfxfoheuzfipmevjllxahymxmzff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253420.5792336-948-110604745040695/AnsiballZ_command.py'
Sep 30 17:30:20 compute-1 sudo[40897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:21 compute-1 python3.9[40899]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:30:21 compute-1 sudo[40897]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:21 compute-1 sudo[41050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dejfxcoydnwtznithpkjjjecwnqmbrqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253421.356815-964-8288203585040/AnsiballZ_command.py'
Sep 30 17:30:21 compute-1 sudo[41050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:21 compute-1 python3.9[41052]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:30:21 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Sep 30 17:30:21 compute-1 sudo[41050]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:22 compute-1 sudo[41203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlncogujdpytifnyjoyicsirumbtpvqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253422.218023-980-137444865495563/AnsiballZ_command.py'
Sep 30 17:30:22 compute-1 sudo[41203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:22 compute-1 python3.9[41205]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:30:24 compute-1 sudo[41203]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:24 compute-1 sudo[41365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diyaoncggerbkvfpqheydgjdabmxgqjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253424.365583-996-191688393069316/AnsiballZ_command.py'
Sep 30 17:30:24 compute-1 sudo[41365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:24 compute-1 python3.9[41367]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:30:24 compute-1 sudo[41365]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:25 compute-1 sudo[41518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsfewrdbmblszaktfiigxezcmtauzvvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253425.0992959-1012-22188098264781/AnsiballZ_systemd.py'
Sep 30 17:30:25 compute-1 sudo[41518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:25 compute-1 python3.9[41520]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:30:25 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 30 17:30:25 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Sep 30 17:30:25 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Sep 30 17:30:25 compute-1 systemd[1]: Starting Apply Kernel Variables...
Sep 30 17:30:25 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 30 17:30:25 compute-1 systemd[1]: Finished Apply Kernel Variables.
Sep 30 17:30:25 compute-1 sudo[41518]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:26 compute-1 sshd-session[28550]: Connection closed by 192.168.122.30 port 37660
Sep 30 17:30:26 compute-1 sshd-session[28547]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:30:26 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Sep 30 17:30:26 compute-1 systemd[1]: session-10.scope: Consumed 2min 10.162s CPU time.
Sep 30 17:30:26 compute-1 systemd-logind[789]: Session 10 logged out. Waiting for processes to exit.
Sep 30 17:30:26 compute-1 systemd-logind[789]: Removed session 10.
Sep 30 17:30:32 compute-1 sshd-session[41550]: Accepted publickey for zuul from 192.168.122.30 port 52274 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:30:32 compute-1 systemd-logind[789]: New session 11 of user zuul.
Sep 30 17:30:32 compute-1 systemd[1]: Started Session 11 of User zuul.
Sep 30 17:30:32 compute-1 sshd-session[41550]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:30:33 compute-1 python3.9[41703]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:30:34 compute-1 sudo[41857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukodpkexqxbyksacheqxdgajumpvukpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253434.315914-53-42710362505435/AnsiballZ_getent.py'
Sep 30 17:30:34 compute-1 sudo[41857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:34 compute-1 python3.9[41859]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Sep 30 17:30:34 compute-1 sudo[41857]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:35 compute-1 sudo[42010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcgtqysgfstlotyqskyndxuiuueakrsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253435.1138957-69-215885791141217/AnsiballZ_group.py'
Sep 30 17:30:35 compute-1 sudo[42010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:35 compute-1 python3.9[42012]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 17:30:35 compute-1 groupadd[42013]: group added to /etc/group: name=openvswitch, GID=42476
Sep 30 17:30:35 compute-1 groupadd[42013]: group added to /etc/gshadow: name=openvswitch
Sep 30 17:30:35 compute-1 groupadd[42013]: new group: name=openvswitch, GID=42476
Sep 30 17:30:35 compute-1 sudo[42010]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:36 compute-1 sudo[42168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ottavdbljxoayeajqqjwfdjubzyvkpcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253435.928859-85-241538715012924/AnsiballZ_user.py'
Sep 30 17:30:36 compute-1 sudo[42168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:36 compute-1 python3.9[42170]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 17:30:36 compute-1 useradd[42172]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 17:30:36 compute-1 useradd[42172]: add 'openvswitch' to group 'hugetlbfs'
Sep 30 17:30:36 compute-1 useradd[42172]: add 'openvswitch' to shadow group 'hugetlbfs'
Sep 30 17:30:36 compute-1 sudo[42168]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:37 compute-1 sudo[42328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgdvbfmykgplwjbgxshaerkbkbwkihov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253437.187551-105-22064451328392/AnsiballZ_setup.py'
Sep 30 17:30:37 compute-1 sudo[42328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:37 compute-1 python3.9[42330]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:30:38 compute-1 sudo[42328]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:38 compute-1 sudo[42412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzvwgpdlnquywtdpzvnsjxjnzeqdldve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253437.187551-105-22064451328392/AnsiballZ_dnf.py'
Sep 30 17:30:38 compute-1 sudo[42412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:38 compute-1 python3.9[42414]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 17:30:40 compute-1 sudo[42412]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:41 compute-1 sudo[42575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqzwwvruwdbuldhttiksygmfwsubpwgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253441.2500408-133-47473574602757/AnsiballZ_dnf.py'
Sep 30 17:30:41 compute-1 sudo[42575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:41 compute-1 python3.9[42577]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:30:52 compute-1 kernel: SELinux:  Converting 2724 SID table entries...
Sep 30 17:30:52 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 17:30:52 compute-1 kernel: SELinux:  policy capability open_perms=1
Sep 30 17:30:52 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 17:30:52 compute-1 kernel: SELinux:  policy capability always_check_network=0
Sep 30 17:30:52 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 17:30:52 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 17:30:52 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 17:30:52 compute-1 groupadd[42600]: group added to /etc/group: name=unbound, GID=993
Sep 30 17:30:52 compute-1 groupadd[42600]: group added to /etc/gshadow: name=unbound
Sep 30 17:30:52 compute-1 groupadd[42600]: new group: name=unbound, GID=993
Sep 30 17:30:52 compute-1 useradd[42607]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Sep 30 17:30:53 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Sep 30 17:30:53 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Sep 30 17:30:54 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 17:30:54 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 17:30:54 compute-1 systemd[1]: Reloading.
Sep 30 17:30:54 compute-1 systemd-rc-local-generator[43104]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:30:54 compute-1 systemd-sysv-generator[43107]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:30:54 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 17:30:55 compute-1 sudo[42575]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:55 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 17:30:55 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 17:30:55 compute-1 systemd[1]: run-r3369593ea73b4099aa039b88c5f05b0d.service: Deactivated successfully.
Sep 30 17:30:55 compute-1 sudo[43677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzkishwmajiiccpwfhxqpgmfgivmhuww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253455.3407388-149-88150910508515/AnsiballZ_systemd.py'
Sep 30 17:30:55 compute-1 sudo[43677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:56 compute-1 python3.9[43679]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 17:30:56 compute-1 systemd[1]: Reloading.
Sep 30 17:30:56 compute-1 systemd-rc-local-generator[43709]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:30:56 compute-1 systemd-sysv-generator[43714]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:30:56 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Sep 30 17:30:56 compute-1 chown[43721]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Sep 30 17:30:56 compute-1 ovs-ctl[43727]: /etc/openvswitch/conf.db does not exist ... (warning).
Sep 30 17:30:56 compute-1 ovs-ctl[43727]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Sep 30 17:30:56 compute-1 ovs-ctl[43727]: Starting ovsdb-server [  OK  ]
Sep 30 17:30:56 compute-1 ovs-vsctl[43776]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Sep 30 17:30:56 compute-1 ovs-vsctl[43796]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"81ab3fff-d6d4-4262-9f24-1b212876e52c\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Sep 30 17:30:56 compute-1 ovs-ctl[43727]: Configuring Open vSwitch system IDs [  OK  ]
Sep 30 17:30:57 compute-1 ovs-vsctl[43802]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Sep 30 17:30:57 compute-1 ovs-ctl[43727]: Enabling remote OVSDB managers [  OK  ]
Sep 30 17:30:57 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Sep 30 17:30:57 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Sep 30 17:30:57 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Sep 30 17:30:57 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Sep 30 17:30:57 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Sep 30 17:30:57 compute-1 ovs-ctl[43847]: Inserting openvswitch module [  OK  ]
Sep 30 17:30:57 compute-1 ovs-ctl[43816]: Starting ovs-vswitchd [  OK  ]
Sep 30 17:30:57 compute-1 ovs-vsctl[43864]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Sep 30 17:30:57 compute-1 ovs-ctl[43816]: Enabling remote OVSDB managers [  OK  ]
Sep 30 17:30:57 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Sep 30 17:30:57 compute-1 systemd[1]: Starting Open vSwitch...
Sep 30 17:30:57 compute-1 systemd[1]: Finished Open vSwitch.
Sep 30 17:30:57 compute-1 sudo[43677]: pam_unix(sudo:session): session closed for user root
Sep 30 17:30:58 compute-1 python3.9[44016]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:30:58 compute-1 sudo[44166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wblvhwydrmqcpjtnflivpdndkhtyipvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253458.502285-185-101210968422212/AnsiballZ_sefcontext.py'
Sep 30 17:30:58 compute-1 sudo[44166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:30:59 compute-1 python3.9[44168]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Sep 30 17:31:00 compute-1 kernel: SELinux:  Converting 2738 SID table entries...
Sep 30 17:31:00 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 17:31:00 compute-1 kernel: SELinux:  policy capability open_perms=1
Sep 30 17:31:00 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 17:31:00 compute-1 kernel: SELinux:  policy capability always_check_network=0
Sep 30 17:31:00 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 17:31:00 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 17:31:00 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 17:31:00 compute-1 sudo[44166]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:01 compute-1 python3.9[44323]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:31:02 compute-1 sudo[44479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znjqmeoqeixomkqwmwczdbjsvcusfand ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253461.772985-221-12145899179210/AnsiballZ_dnf.py'
Sep 30 17:31:02 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Sep 30 17:31:02 compute-1 sudo[44479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:02 compute-1 python3.9[44481]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:31:03 compute-1 sudo[44479]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:04 compute-1 sudo[44632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgizbwuwyllylzhgdrwgfurjsptvxjaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253463.8523512-237-30947273181610/AnsiballZ_command.py'
Sep 30 17:31:04 compute-1 sudo[44632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:04 compute-1 python3.9[44634]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:31:05 compute-1 sudo[44632]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:05 compute-1 sudo[44919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrfdynuboshwybjsguwwrvphyzbusvat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253465.4512482-253-101103187579617/AnsiballZ_file.py'
Sep 30 17:31:05 compute-1 sudo[44919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:06 compute-1 python3.9[44921]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 17:31:06 compute-1 sudo[44919]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:07 compute-1 python3.9[45071]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:31:07 compute-1 sudo[45223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcurtofdiyypxowzzihadwzhfmywcvyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253467.2612026-285-167658095691201/AnsiballZ_dnf.py'
Sep 30 17:31:07 compute-1 sudo[45223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:07 compute-1 python3.9[45225]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:31:09 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 17:31:09 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 17:31:09 compute-1 systemd[1]: Reloading.
Sep 30 17:31:09 compute-1 systemd-sysv-generator[45265]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:31:09 compute-1 systemd-rc-local-generator[45261]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:31:10 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 17:31:10 compute-1 sudo[45223]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:10 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 17:31:10 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 17:31:10 compute-1 systemd[1]: run-r682116bfbae648d2855719de23771ebe.service: Deactivated successfully.
Sep 30 17:31:10 compute-1 sudo[45539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wachddzrwbnouzbniibkijrycjeaycfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253470.5327644-301-273407182669480/AnsiballZ_systemd.py'
Sep 30 17:31:10 compute-1 sudo[45539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:11 compute-1 python3.9[45541]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:31:11 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Sep 30 17:31:11 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Sep 30 17:31:11 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Sep 30 17:31:11 compute-1 systemd[1]: Stopping Network Manager...
Sep 30 17:31:11 compute-1 NetworkManager[4514]: <info>  [1759253471.2104] caught SIGTERM, shutting down normally.
Sep 30 17:31:11 compute-1 NetworkManager[4514]: <info>  [1759253471.2117] dhcp4 (eth0): canceled DHCP transaction
Sep 30 17:31:11 compute-1 NetworkManager[4514]: <info>  [1759253471.2117] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 17:31:11 compute-1 NetworkManager[4514]: <info>  [1759253471.2117] dhcp4 (eth0): state changed no lease
Sep 30 17:31:11 compute-1 NetworkManager[4514]: <info>  [1759253471.2119] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 17:31:11 compute-1 NetworkManager[4514]: <info>  [1759253471.2184] exiting (success)
Sep 30 17:31:11 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 17:31:11 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Sep 30 17:31:11 compute-1 systemd[1]: Stopped Network Manager.
Sep 30 17:31:11 compute-1 systemd[1]: NetworkManager.service: Consumed 13.550s CPU time, 4.0M memory peak, read 0B from disk, written 31.5K to disk.
Sep 30 17:31:11 compute-1 systemd[1]: Starting Network Manager...
Sep 30 17:31:11 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.2662] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:cf2a7137-0e0f-4f1a-866e-63b8011fce6c)
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.2664] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.2723] manager[0x55b3986eb090]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 17:31:11 compute-1 systemd[1]: Starting Hostname Service...
Sep 30 17:31:11 compute-1 systemd[1]: Started Hostname Service.
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3837] hostname: hostname: using hostnamed
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3838] hostname: static hostname changed from (none) to "compute-1"
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3842] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3847] manager[0x55b3986eb090]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3848] manager[0x55b3986eb090]: rfkill: WWAN hardware radio set enabled
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3865] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3872] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3873] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3874] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3874] manager: Networking is enabled by state file
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3876] settings: Loaded settings plugin: keyfile (internal)
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3879] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3900] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3909] dhcp: init: Using DHCP client 'internal'
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3911] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3915] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3919] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3925] device (lo): Activation: starting connection 'lo' (9129f00f-203c-42c0-b87c-17b7d284cfa5)
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3930] device (eth0): carrier: link connected
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3933] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3937] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3938] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3943] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3949] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3954] device (eth1): carrier: link connected
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3957] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3960] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (c65e340a-eb16-53f4-aee4-b66cf391b2f7) (indicated)
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3961] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3964] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3970] device (eth1): Activation: starting connection 'ci-private-network' (c65e340a-eb16-53f4-aee4-b66cf391b2f7)
Sep 30 17:31:11 compute-1 systemd[1]: Started Network Manager.
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3976] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3983] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3985] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3987] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3988] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3990] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3993] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.3996] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4000] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4005] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4009] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4026] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4036] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4041] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4043] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4048] device (lo): Activation: successful, device activated.
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4060] dhcp4 (eth0): state changed new lease, address=38.102.83.102
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4065] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4136] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4144] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4146] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4149] manager: NetworkManager state is now CONNECTED_LOCAL
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4154] device (eth1): Activation: successful, device activated.
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4164] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4166] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4169] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4174] device (eth0): Activation: successful, device activated.
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4179] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 17:31:11 compute-1 NetworkManager[45549]: <info>  [1759253471.4182] manager: startup complete
Sep 30 17:31:11 compute-1 systemd[1]: Starting Network Manager Wait Online...
Sep 30 17:31:11 compute-1 systemd[1]: Finished Network Manager Wait Online.
Sep 30 17:31:11 compute-1 sudo[45539]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:11 compute-1 sudo[45765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvoyclzpfatijkewvxzsmsdwkosdieal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253471.6578948-317-159228284843421/AnsiballZ_dnf.py'
Sep 30 17:31:11 compute-1 sudo[45765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:12 compute-1 python3.9[45767]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:31:17 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 17:31:17 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 17:31:17 compute-1 systemd[1]: Reloading.
Sep 30 17:31:17 compute-1 systemd-rc-local-generator[45820]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:31:17 compute-1 systemd-sysv-generator[45823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:31:17 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 17:31:18 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 17:31:18 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 17:31:18 compute-1 systemd[1]: run-r0d7c70b16a674accbd681bfb72f38c2e.service: Deactivated successfully.
Sep 30 17:31:18 compute-1 sudo[45765]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:18 compute-1 sudo[46228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzukyqrojndgknsjnswkbefnfztnkrmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253478.5039005-341-63022611445332/AnsiballZ_stat.py'
Sep 30 17:31:18 compute-1 sudo[46228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:19 compute-1 python3.9[46230]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:31:19 compute-1 sudo[46228]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:19 compute-1 sudo[46380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nreearyjuwjrauoaocorsjutvxhnaylm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253479.243918-359-98687712749741/AnsiballZ_ini_file.py'
Sep 30 17:31:19 compute-1 sudo[46380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:19 compute-1 python3.9[46382]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:31:19 compute-1 sudo[46380]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:20 compute-1 sudo[46534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkzngbjyxslqumyuitxprvwfmbxqqwnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253480.2215154-379-111426880397932/AnsiballZ_ini_file.py'
Sep 30 17:31:20 compute-1 sudo[46534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:20 compute-1 python3.9[46536]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:31:20 compute-1 sudo[46534]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:21 compute-1 sudo[46686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwzpntrunrskqhvhgwbwnkgjlxhdedqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253480.9402485-379-191895377038665/AnsiballZ_ini_file.py'
Sep 30 17:31:21 compute-1 sudo[46686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:21 compute-1 python3.9[46688]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:31:21 compute-1 sudo[46686]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:21 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 17:31:21 compute-1 sudo[46838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjagrbsquhjmjdyqsilhfbpbkdanziww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253481.6255739-409-202689186104112/AnsiballZ_ini_file.py'
Sep 30 17:31:21 compute-1 sudo[46838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:22 compute-1 python3.9[46840]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:31:22 compute-1 sudo[46838]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:22 compute-1 sudo[46990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnxuabalonyxzplxlpjhaipxfvolmggi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253482.235537-409-68941425540598/AnsiballZ_ini_file.py'
Sep 30 17:31:22 compute-1 sudo[46990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:22 compute-1 python3.9[46992]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:31:22 compute-1 sudo[46990]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:23 compute-1 sudo[47142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rusbmksdzgrptsdauedxqxtyqnuafwmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253482.860196-439-79946864154573/AnsiballZ_stat.py'
Sep 30 17:31:23 compute-1 sudo[47142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:23 compute-1 python3.9[47144]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:31:23 compute-1 sudo[47142]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:23 compute-1 sudo[47265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gktvceqxjvbocyqkvjgwoxzrxibykfqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253482.860196-439-79946864154573/AnsiballZ_copy.py'
Sep 30 17:31:23 compute-1 sudo[47265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:23 compute-1 python3.9[47267]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759253482.860196-439-79946864154573/.source _original_basename=.j5znoj26 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:31:23 compute-1 sudo[47265]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:24 compute-1 sudo[47417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfydeldwwycgddbaljhylekhjhsrjfpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253484.1553593-469-78341836388532/AnsiballZ_file.py'
Sep 30 17:31:24 compute-1 sudo[47417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:24 compute-1 python3.9[47419]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:31:24 compute-1 sudo[47417]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:25 compute-1 sudo[47569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zranqrslfculpcgvyzzndwnblyjekgqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253484.851755-485-248122289861831/AnsiballZ_edpm_os_net_config_mappings.py'
Sep 30 17:31:25 compute-1 sudo[47569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:25 compute-1 python3.9[47571]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Sep 30 17:31:25 compute-1 sudo[47569]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:25 compute-1 sudo[47721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ummcdzajiymvujlzsdedxpwyzxcclvhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253485.6780198-503-256702270066460/AnsiballZ_file.py'
Sep 30 17:31:25 compute-1 sudo[47721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:26 compute-1 python3.9[47723]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:31:26 compute-1 sudo[47721]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:26 compute-1 sudo[47873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkzbuoolaktikpochtudmwcaxwisuyjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253486.5058346-523-234892128057997/AnsiballZ_stat.py'
Sep 30 17:31:26 compute-1 sudo[47873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:26 compute-1 sudo[47873]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:27 compute-1 sudo[47996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykzxjnjfteyrfrgaqrsftbolyuisyyhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253486.5058346-523-234892128057997/AnsiballZ_copy.py'
Sep 30 17:31:27 compute-1 sudo[47996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:27 compute-1 sudo[47996]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:28 compute-1 sudo[48148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slcqnevryjrqlplmfiaccryjsgzgdrpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253487.7816148-553-191289469791316/AnsiballZ_slurp.py'
Sep 30 17:31:28 compute-1 sudo[48148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:28 compute-1 python3.9[48150]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Sep 30 17:31:28 compute-1 sudo[48148]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:29 compute-1 sudo[48323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oscczvfnmbpokacpdiuiyqmneskmktxk ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253488.6222994-571-177728480124189/async_wrapper.py j183274638324 300 /home/zuul/.ansible/tmp/ansible-tmp-1759253488.6222994-571-177728480124189/AnsiballZ_edpm_os_net_config.py _'
Sep 30 17:31:29 compute-1 sudo[48323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:29 compute-1 ansible-async_wrapper.py[48325]: Invoked with j183274638324 300 /home/zuul/.ansible/tmp/ansible-tmp-1759253488.6222994-571-177728480124189/AnsiballZ_edpm_os_net_config.py _
Sep 30 17:31:29 compute-1 ansible-async_wrapper.py[48328]: Starting module and watcher
Sep 30 17:31:29 compute-1 ansible-async_wrapper.py[48328]: Start watching 48329 (300)
Sep 30 17:31:29 compute-1 ansible-async_wrapper.py[48329]: Start module (48329)
Sep 30 17:31:29 compute-1 ansible-async_wrapper.py[48325]: Return async_wrapper task started.
Sep 30 17:31:29 compute-1 sudo[48323]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:29 compute-1 python3.9[48330]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Sep 30 17:31:30 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Sep 30 17:31:30 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Sep 30 17:31:30 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Sep 30 17:31:30 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Sep 30 17:31:30 compute-1 kernel: cfg80211: failed to load regulatory.db
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.7714] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.7732] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8463] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8465] audit: op="connection-add" uuid="96f6ddb0-6be7-4070-b93f-ba419875f057" name="br-ex-br" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8485] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8487] audit: op="connection-add" uuid="06d3d52f-79be-4052-8299-d6e069f701d9" name="br-ex-port" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8504] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8505] audit: op="connection-add" uuid="e1df1fa1-ffbc-42c7-8095-8181bbf7a337" name="eth1-port" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8521] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8522] audit: op="connection-add" uuid="8f329b62-3935-44d6-bef9-b9ce793c592c" name="vlan20-port" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8536] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8537] audit: op="connection-add" uuid="3ff040d7-3947-424e-87b3-2f392327275d" name="vlan21-port" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8553] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8555] audit: op="connection-add" uuid="831f87d1-afb6-44b1-afdb-afa714b41547" name="vlan22-port" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8568] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8570] audit: op="connection-add" uuid="ae27ca8e-a054-4f10-b2a9-e56ac226cd77" name="vlan23-port" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8596] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,connection.timestamp,802-3-ethernet.mtu" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8616] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8617] audit: op="connection-add" uuid="976f408d-51e0-4daa-a859-e673e2676870" name="br-ex-if" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8669] audit: op="connection-update" uuid="c65e340a-eb16-53f4-aee4-b66cf391b2f7" name="ci-private-network" args="ipv6.routing-rules,ipv6.addr-gen-mode,ipv6.dns,ipv6.method,ipv6.addresses,ipv6.routes,ovs-external-ids.data,ipv4.routing-rules,ipv4.never-default,ipv4.dns,ipv4.method,ipv4.addresses,ipv4.routes,connection.slave-type,connection.timestamp,connection.controller,connection.master,connection.port-type,ovs-interface.type" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8692] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8695] audit: op="connection-add" uuid="ca09fbd5-0574-4e36-9178-c949f305c65b" name="vlan20-if" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8717] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8719] audit: op="connection-add" uuid="01caf250-fe41-49e8-adfa-c962d275eed5" name="vlan21-if" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8742] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8744] audit: op="connection-add" uuid="c329098b-a7bf-472c-9c1f-49d2085afe9f" name="vlan22-if" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8767] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8768] audit: op="connection-add" uuid="1f98ebb9-a5e0-49d1-bbb6-e089a38b1589" name="vlan23-if" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8782] audit: op="connection-delete" uuid="673ac1e6-4892-3ac0-858b-84293dcaf668" name="Wired connection 1" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8796] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8809] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8813] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (96f6ddb0-6be7-4070-b93f-ba419875f057)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8814] audit: op="connection-activate" uuid="96f6ddb0-6be7-4070-b93f-ba419875f057" name="br-ex-br" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8816] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8825] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8830] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (06d3d52f-79be-4052-8299-d6e069f701d9)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8832] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8839] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8845] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (e1df1fa1-ffbc-42c7-8095-8181bbf7a337)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8848] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8856] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8860] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (8f329b62-3935-44d6-bef9-b9ce793c592c)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8862] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8871] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8875] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (3ff040d7-3947-424e-87b3-2f392327275d)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8877] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8883] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8888] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (831f87d1-afb6-44b1-afdb-afa714b41547)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8889] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8900] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8907] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (ae27ca8e-a054-4f10-b2a9-e56ac226cd77)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8908] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8911] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8913] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8922] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8928] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8936] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (976f408d-51e0-4daa-a859-e673e2676870)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8937] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8941] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8944] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8945] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8947] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8965] device (eth1): disconnecting for new activation request.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8966] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8969] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8971] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8973] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8978] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8985] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8994] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (ca09fbd5-0574-4e36-9178-c949f305c65b)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.8995] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9000] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9002] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9003] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9007] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9015] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9022] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (01caf250-fe41-49e8-adfa-c962d275eed5)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9022] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9027] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9030] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9031] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9035] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9043] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9049] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (c329098b-a7bf-472c-9c1f-49d2085afe9f)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9051] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9055] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9060] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9062] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9068] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9076] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9085] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (1f98ebb9-a5e0-49d1-bbb6-e089a38b1589)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9087] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9091] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9094] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9095] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9098] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9117] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,802-3-ethernet.mtu" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9120] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9126] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9129] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9138] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9143] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9147] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9150] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 17:31:31 compute-1 kernel: ovs-system: entered promiscuous mode
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9152] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9160] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9166] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9171] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9173] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9181] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9189] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 kernel: Timeout policy base is empty
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9193] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9196] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9203] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9209] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 systemd-udevd[48335]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9214] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9216] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9223] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9227] dhcp4 (eth0): canceled DHCP transaction
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9227] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9227] dhcp4 (eth0): state changed no lease
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9228] dhcp4 (eth0): activation: beginning transaction (no timeout)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9243] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9248] audit: op="device-reapply" interface="eth1" ifindex=3 pid=48331 uid=0 result="fail" reason="Device is not activated"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9253] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Sep 30 17:31:31 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9288] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9292] dhcp4 (eth0): state changed new lease, address=38.102.83.102
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9298] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9345] device (eth1): disconnecting for new activation request.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9347] audit: op="connection-activate" uuid="c65e340a-eb16-53f4-aee4-b66cf391b2f7" name="ci-private-network" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9348] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9453] device (eth1): Activation: starting connection 'ci-private-network' (c65e340a-eb16-53f4-aee4-b66cf391b2f7)
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9461] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9473] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9476] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9480] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9481] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9482] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9483] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9484] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9485] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9485] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9486] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=48331 uid=0 result="success"
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9488] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9492] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9495] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9498] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9500] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9502] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9506] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9508] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9512] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9514] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9517] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9520] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9523] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9525] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9528] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9532] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9535] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 kernel: br-ex: entered promiscuous mode
Sep 30 17:31:31 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9585] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9587] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9593] device (eth1): Activation: successful, device activated.
Sep 30 17:31:31 compute-1 kernel: vlan22: entered promiscuous mode
Sep 30 17:31:31 compute-1 systemd-udevd[48336]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9666] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9677] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 kernel: vlan23: entered promiscuous mode
Sep 30 17:31:31 compute-1 kernel: vlan20: entered promiscuous mode
Sep 30 17:31:31 compute-1 systemd-udevd[48337]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9757] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9761] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9768] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9781] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9803] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 kernel: vlan21: entered promiscuous mode
Sep 30 17:31:31 compute-1 systemd-udevd[48441]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9818] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9839] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9844] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9852] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9858] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9872] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9873] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9876] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9881] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9904] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9920] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9934] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9946] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9948] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9954] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9964] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9966] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 17:31:31 compute-1 NetworkManager[45549]: <info>  [1759253491.9972] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 17:31:33 compute-1 sudo[48687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjevrovsmtbjuynkjnmkeubzwyiadaxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253492.6652267-571-176070094161869/AnsiballZ_async_status.py'
Sep 30 17:31:33 compute-1 sudo[48687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:33 compute-1 NetworkManager[45549]: <info>  [1759253493.1713] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=48331 uid=0 result="success"
Sep 30 17:31:33 compute-1 python3.9[48689]: ansible-ansible.legacy.async_status Invoked with jid=j183274638324.48325 mode=status _async_dir=/root/.ansible_async
Sep 30 17:31:33 compute-1 sudo[48687]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:33 compute-1 NetworkManager[45549]: <info>  [1759253493.4141] checkpoint[0x55b3986c2950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Sep 30 17:31:33 compute-1 NetworkManager[45549]: <info>  [1759253493.4146] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=48331 uid=0 result="success"
Sep 30 17:31:33 compute-1 NetworkManager[45549]: <info>  [1759253493.7308] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=48331 uid=0 result="success"
Sep 30 17:31:33 compute-1 NetworkManager[45549]: <info>  [1759253493.7323] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=48331 uid=0 result="success"
Sep 30 17:31:33 compute-1 NetworkManager[45549]: <info>  [1759253493.9341] audit: op="networking-control" arg="global-dns-configuration" pid=48331 uid=0 result="success"
Sep 30 17:31:33 compute-1 NetworkManager[45549]: <info>  [1759253493.9371] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Sep 30 17:31:33 compute-1 NetworkManager[45549]: <info>  [1759253493.9399] audit: op="networking-control" arg="global-dns-configuration" pid=48331 uid=0 result="success"
Sep 30 17:31:33 compute-1 NetworkManager[45549]: <info>  [1759253493.9423] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=48331 uid=0 result="success"
Sep 30 17:31:34 compute-1 NetworkManager[45549]: <info>  [1759253494.1123] checkpoint[0x55b3986c2a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Sep 30 17:31:34 compute-1 NetworkManager[45549]: <info>  [1759253494.1132] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=48331 uid=0 result="success"
Sep 30 17:31:34 compute-1 ansible-async_wrapper.py[48329]: Module complete (48329)
Sep 30 17:31:34 compute-1 ansible-async_wrapper.py[48328]: Done in kid B.
Sep 30 17:31:36 compute-1 sudo[48793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmwfpprjqaiuqfnuxekewomenbhjjcog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253492.6652267-571-176070094161869/AnsiballZ_async_status.py'
Sep 30 17:31:36 compute-1 sudo[48793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:36 compute-1 python3.9[48795]: ansible-ansible.legacy.async_status Invoked with jid=j183274638324.48325 mode=status _async_dir=/root/.ansible_async
Sep 30 17:31:36 compute-1 sudo[48793]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:37 compute-1 sudo[48893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqlbnjhrfiabljgtetcgdjrivqaeovle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253492.6652267-571-176070094161869/AnsiballZ_async_status.py'
Sep 30 17:31:37 compute-1 sudo[48893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:37 compute-1 python3.9[48895]: ansible-ansible.legacy.async_status Invoked with jid=j183274638324.48325 mode=cleanup _async_dir=/root/.ansible_async
Sep 30 17:31:37 compute-1 sudo[48893]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:37 compute-1 sudo[49045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qegjvyxtdkcsploaclaxbnikgucvgsok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253497.5433-625-17643935540626/AnsiballZ_stat.py'
Sep 30 17:31:37 compute-1 sudo[49045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:38 compute-1 python3.9[49047]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:31:38 compute-1 sudo[49045]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:38 compute-1 sudo[49168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esqbkmtygtbszrdcmkazwpgufbdhouud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253497.5433-625-17643935540626/AnsiballZ_copy.py'
Sep 30 17:31:38 compute-1 sudo[49168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:38 compute-1 python3.9[49170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759253497.5433-625-17643935540626/.source.returncode _original_basename=.ncgai9bx follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:31:38 compute-1 sudo[49168]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:39 compute-1 sudo[49320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogglfewrdhuukjxtipkquudhdxsqwrez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253499.1124144-657-25029284544730/AnsiballZ_stat.py'
Sep 30 17:31:39 compute-1 sudo[49320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:39 compute-1 python3.9[49322]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:31:39 compute-1 sudo[49320]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:40 compute-1 sudo[49443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmthddxeseiaokqbgsvyxefjqlovcjos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253499.1124144-657-25029284544730/AnsiballZ_copy.py'
Sep 30 17:31:40 compute-1 sudo[49443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:40 compute-1 python3.9[49445]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759253499.1124144-657-25029284544730/.source.cfg _original_basename=.e64x2p52 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:31:40 compute-1 sudo[49443]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:40 compute-1 sudo[49596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruxdcwqimnygiqmrfrdsexjcmgbakjaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253500.3846347-687-16253309532210/AnsiballZ_systemd.py'
Sep 30 17:31:40 compute-1 sudo[49596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:40 compute-1 python3.9[49598]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:31:41 compute-1 systemd[1]: Reloading Network Manager...
Sep 30 17:31:41 compute-1 NetworkManager[45549]: <info>  [1759253501.0259] audit: op="reload" arg="0" pid=49602 uid=0 result="success"
Sep 30 17:31:41 compute-1 NetworkManager[45549]: <info>  [1759253501.0269] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Sep 30 17:31:41 compute-1 systemd[1]: Reloaded Network Manager.
Sep 30 17:31:41 compute-1 sudo[49596]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:41 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 17:31:41 compute-1 sshd-session[41553]: Connection closed by 192.168.122.30 port 52274
Sep 30 17:31:41 compute-1 sshd-session[41550]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:31:41 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Sep 30 17:31:41 compute-1 systemd[1]: session-11.scope: Consumed 51.964s CPU time.
Sep 30 17:31:41 compute-1 systemd-logind[789]: Session 11 logged out. Waiting for processes to exit.
Sep 30 17:31:41 compute-1 systemd-logind[789]: Removed session 11.
Sep 30 17:31:46 compute-1 sshd-session[49634]: Accepted publickey for zuul from 192.168.122.30 port 39002 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:31:46 compute-1 systemd-logind[789]: New session 12 of user zuul.
Sep 30 17:31:46 compute-1 systemd[1]: Started Session 12 of User zuul.
Sep 30 17:31:46 compute-1 sshd-session[49634]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:31:47 compute-1 python3.9[49788]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:31:48 compute-1 python3.9[49942]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:31:49 compute-1 python3.9[50135]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:31:50 compute-1 sshd-session[49638]: Connection closed by 192.168.122.30 port 39002
Sep 30 17:31:50 compute-1 sshd-session[49634]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:31:50 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Sep 30 17:31:50 compute-1 systemd[1]: session-12.scope: Consumed 2.397s CPU time.
Sep 30 17:31:50 compute-1 systemd-logind[789]: Session 12 logged out. Waiting for processes to exit.
Sep 30 17:31:50 compute-1 systemd-logind[789]: Removed session 12.
Sep 30 17:31:51 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 17:31:55 compute-1 sshd-session[50165]: Accepted publickey for zuul from 192.168.122.30 port 59666 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:31:55 compute-1 systemd-logind[789]: New session 13 of user zuul.
Sep 30 17:31:55 compute-1 systemd[1]: Started Session 13 of User zuul.
Sep 30 17:31:55 compute-1 sshd-session[50165]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:31:56 compute-1 python3.9[50319]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:31:57 compute-1 python3.9[50473]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:31:58 compute-1 sudo[50627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbsthuaghdbsrpfdviyshmmsjuvuaobk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253518.2179317-61-220628648676156/AnsiballZ_setup.py'
Sep 30 17:31:58 compute-1 sudo[50627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:58 compute-1 python3.9[50629]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:31:59 compute-1 sudo[50627]: pam_unix(sudo:session): session closed for user root
Sep 30 17:31:59 compute-1 sudo[50712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqninedmmkppfwmxvzelxibzmnotbqcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253518.2179317-61-220628648676156/AnsiballZ_dnf.py'
Sep 30 17:31:59 compute-1 sudo[50712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:31:59 compute-1 python3.9[50714]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:32:00 compute-1 sudo[50712]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:01 compute-1 sudo[50865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxupjwamwakzqzefmssglzshzrhqpcbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253521.035883-85-249430607465731/AnsiballZ_setup.py'
Sep 30 17:32:01 compute-1 sudo[50865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:01 compute-1 python3.9[50867]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:32:01 compute-1 sudo[50865]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:02 compute-1 sudo[51061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiujshjsgjseubsorhhthvrrwfixzvja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253522.216916-107-117022180392300/AnsiballZ_file.py'
Sep 30 17:32:02 compute-1 sudo[51061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:02 compute-1 python3.9[51063]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:32:02 compute-1 sudo[51061]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:03 compute-1 sudo[51213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdguvcufmhqxutmlvwmslfcgnxajtrms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253523.005658-123-267783226717895/AnsiballZ_command.py'
Sep 30 17:32:03 compute-1 sudo[51213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:03 compute-1 python3.9[51215]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:32:03 compute-1 podman[51216]: 2025-09-30 17:32:03.811504123 +0000 UTC m=+0.072268957 system refresh
Sep 30 17:32:03 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:32:03 compute-1 sudo[51213]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:04 compute-1 sudo[51376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hifaeunwhgwnpbsuweiqfjmsmgyjsjsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253524.0489273-139-207419248476658/AnsiballZ_stat.py'
Sep 30 17:32:04 compute-1 sudo[51376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:04 compute-1 python3.9[51378]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:32:04 compute-1 sudo[51376]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:05 compute-1 sudo[51499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-engqhtllyejevdkghtmsvgpzelkmsqka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253524.0489273-139-207419248476658/AnsiballZ_copy.py'
Sep 30 17:32:05 compute-1 sudo[51499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:05 compute-1 python3.9[51501]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759253524.0489273-139-207419248476658/.source.json follow=False _original_basename=podman_network_config.j2 checksum=2201051c4d1100fec070b3e863781d053e383426 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:32:05 compute-1 sudo[51499]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:05 compute-1 sudo[51651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txdwsnvrbcxqxswfgtubacdzssidbuwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253525.6270921-169-59384391900410/AnsiballZ_stat.py'
Sep 30 17:32:05 compute-1 sudo[51651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:06 compute-1 python3.9[51653]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:32:06 compute-1 sudo[51651]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:06 compute-1 sudo[51774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkjgifiosmgjfiafoudjqgqvumicwcko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253525.6270921-169-59384391900410/AnsiballZ_copy.py'
Sep 30 17:32:06 compute-1 sudo[51774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:06 compute-1 python3.9[51776]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759253525.6270921-169-59384391900410/.source.conf follow=False _original_basename=registries.conf.j2 checksum=648aa3ef81d3efca9d74ba4d007f7d21b2d62a41 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:32:06 compute-1 sudo[51774]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:07 compute-1 sudo[51926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhushwyxcrxeeomoizkljtxzwszgdlxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253526.9842942-201-27098084290750/AnsiballZ_ini_file.py'
Sep 30 17:32:07 compute-1 sudo[51926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:07 compute-1 python3.9[51928]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:32:07 compute-1 sudo[51926]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:08 compute-1 sudo[52078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvdnfroyohffhqoxlgefqxmzivatuzem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253527.797158-201-196976716861726/AnsiballZ_ini_file.py'
Sep 30 17:32:08 compute-1 sudo[52078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:08 compute-1 python3.9[52080]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:32:08 compute-1 sudo[52078]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:08 compute-1 sudo[52230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrdskljdunpdcwybzlnuylnencyirinf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253528.415425-201-97468933738352/AnsiballZ_ini_file.py'
Sep 30 17:32:08 compute-1 sudo[52230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:08 compute-1 python3.9[52232]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:32:08 compute-1 sudo[52230]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:09 compute-1 sudo[52382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhpqqcusvorgumwoxffzabxwwgzwqauj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253529.0869477-201-107633788518972/AnsiballZ_ini_file.py'
Sep 30 17:32:09 compute-1 sudo[52382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:09 compute-1 python3.9[52384]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:32:09 compute-1 sudo[52382]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:10 compute-1 sudo[52534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsobtcznbuaeuykgxwsgigiueuicnvpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253529.767763-263-90233410531403/AnsiballZ_dnf.py'
Sep 30 17:32:10 compute-1 sudo[52534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:10 compute-1 python3.9[52536]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:32:11 compute-1 sudo[52534]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:12 compute-1 sudo[52687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voxggsyzaksxetqnhjefybusctnsrgas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253531.9613442-285-106697530281561/AnsiballZ_setup.py'
Sep 30 17:32:12 compute-1 sudo[52687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:12 compute-1 python3.9[52689]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:32:12 compute-1 sudo[52687]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:13 compute-1 sudo[52841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waqczskkmpeegqovbwpkdwngsfzjknot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253532.8093016-301-126018591180270/AnsiballZ_stat.py'
Sep 30 17:32:13 compute-1 sudo[52841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:13 compute-1 python3.9[52843]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:32:13 compute-1 sudo[52841]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:13 compute-1 sudo[52993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmarpnntswxcolxygffvmzlfnxzljitb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253533.497362-319-45254034978618/AnsiballZ_stat.py'
Sep 30 17:32:13 compute-1 sudo[52993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:13 compute-1 python3.9[52995]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:32:14 compute-1 sudo[52993]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:14 compute-1 sudo[53145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpsiccjmastaffvpsdlgoptddvrqktuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253534.304347-339-168689444725128/AnsiballZ_service_facts.py'
Sep 30 17:32:14 compute-1 sudo[53145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:14 compute-1 python3.9[53147]: ansible-service_facts Invoked
Sep 30 17:32:15 compute-1 network[53164]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 17:32:15 compute-1 network[53165]: 'network-scripts' will be removed from distribution in near future.
Sep 30 17:32:15 compute-1 network[53166]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 17:32:18 compute-1 sudo[53145]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:19 compute-1 sudo[53451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygrgpgtohmyturfvomrymcwabkrwzffl ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1759253539.5911295-365-145069492224672/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1759253539.5911295-365-145069492224672/args'
Sep 30 17:32:19 compute-1 sudo[53451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:20 compute-1 sudo[53451]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:20 compute-1 sudo[53618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpdrwfygrybjguswskrjdekwmfpptljt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253540.3336775-387-152532135027672/AnsiballZ_dnf.py'
Sep 30 17:32:20 compute-1 sudo[53618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:20 compute-1 python3.9[53620]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:32:22 compute-1 sudo[53618]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:23 compute-1 sudo[53771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czyqquipvnvrtrksqqcqduaicqliotzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253542.5396173-413-255703077251269/AnsiballZ_package_facts.py'
Sep 30 17:32:23 compute-1 sudo[53771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:23 compute-1 python3.9[53773]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Sep 30 17:32:23 compute-1 sudo[53771]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:24 compute-1 sudo[53923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idmbxpohszvcqohinnpjrgxbjzumjpxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253544.303852-434-131251516358174/AnsiballZ_stat.py'
Sep 30 17:32:24 compute-1 sudo[53923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:24 compute-1 python3.9[53925]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:32:24 compute-1 sudo[53923]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:25 compute-1 sudo[54048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqnnqqjgxfynfpdsmrwdwytcbcumttox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253544.303852-434-131251516358174/AnsiballZ_copy.py'
Sep 30 17:32:25 compute-1 sudo[54048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:25 compute-1 python3.9[54050]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759253544.303852-434-131251516358174/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:32:25 compute-1 sudo[54048]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:26 compute-1 sudo[54202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcftmzttbpgxbbhcvfuzuqgqlslkwryy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253545.798457-464-231925114214745/AnsiballZ_stat.py'
Sep 30 17:32:26 compute-1 sudo[54202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:26 compute-1 python3.9[54204]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:32:26 compute-1 sudo[54202]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:26 compute-1 sudo[54327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mipbllexlrrjtcncnehuuizjblhmasqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253545.798457-464-231925114214745/AnsiballZ_copy.py'
Sep 30 17:32:26 compute-1 sudo[54327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:27 compute-1 python3.9[54329]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759253545.798457-464-231925114214745/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:32:27 compute-1 sudo[54327]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:28 compute-1 sudo[54481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abdhzmixkuolexbflenajokrvfymrzdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253547.7101696-506-253356171276149/AnsiballZ_lineinfile.py'
Sep 30 17:32:28 compute-1 sudo[54481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:28 compute-1 python3.9[54483]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:32:28 compute-1 sudo[54481]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:29 compute-1 sudo[54635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfbvpiqywmfonrzusyzmwajniyhbeuwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253549.2844546-536-86047753067211/AnsiballZ_setup.py'
Sep 30 17:32:29 compute-1 sudo[54635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:29 compute-1 python3.9[54637]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:32:30 compute-1 sudo[54635]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:30 compute-1 sudo[54719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmckagmfwvcnkkmqmonqnmknfwkzsrub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253549.2844546-536-86047753067211/AnsiballZ_systemd.py'
Sep 30 17:32:30 compute-1 sudo[54719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:30 compute-1 python3.9[54721]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:32:31 compute-1 sudo[54719]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:31 compute-1 sudo[54873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnyyscnupcovbihmostpdvnpbaoyngrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253551.620159-567-5914961248223/AnsiballZ_setup.py'
Sep 30 17:32:31 compute-1 sudo[54873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:32 compute-1 python3.9[54875]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:32:32 compute-1 sudo[54873]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:32 compute-1 sudo[54957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huydbwwhfzwunhodujwtbtigplnzprtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253551.620159-567-5914961248223/AnsiballZ_systemd.py'
Sep 30 17:32:32 compute-1 sudo[54957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:32 compute-1 python3.9[54959]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:32:32 compute-1 chronyd[803]: chronyd exiting
Sep 30 17:32:32 compute-1 systemd[1]: Stopping NTP client/server...
Sep 30 17:32:32 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Sep 30 17:32:32 compute-1 systemd[1]: Stopped NTP client/server.
Sep 30 17:32:33 compute-1 systemd[1]: Starting NTP client/server...
Sep 30 17:32:33 compute-1 chronyd[54967]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Sep 30 17:32:33 compute-1 chronyd[54967]: Frequency -27.234 +/- 0.117 ppm read from /var/lib/chrony/drift
Sep 30 17:32:33 compute-1 chronyd[54967]: Loaded seccomp filter (level 2)
Sep 30 17:32:33 compute-1 systemd[1]: Started NTP client/server.
Sep 30 17:32:33 compute-1 sudo[54957]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:33 compute-1 sshd-session[50168]: Connection closed by 192.168.122.30 port 59666
Sep 30 17:32:33 compute-1 sshd-session[50165]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:32:33 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Sep 30 17:32:33 compute-1 systemd[1]: session-13.scope: Consumed 26.491s CPU time.
Sep 30 17:32:33 compute-1 systemd-logind[789]: Session 13 logged out. Waiting for processes to exit.
Sep 30 17:32:33 compute-1 systemd-logind[789]: Removed session 13.
Sep 30 17:32:39 compute-1 sshd-session[54993]: Accepted publickey for zuul from 192.168.122.30 port 48418 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:32:39 compute-1 systemd-logind[789]: New session 14 of user zuul.
Sep 30 17:32:39 compute-1 systemd[1]: Started Session 14 of User zuul.
Sep 30 17:32:39 compute-1 sshd-session[54993]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:32:40 compute-1 sudo[55146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drmuiavbujliyndhicizkeqjocdubely ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253559.756447-25-22197840482813/AnsiballZ_file.py'
Sep 30 17:32:40 compute-1 sudo[55146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:40 compute-1 python3.9[55148]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:32:40 compute-1 sudo[55146]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:41 compute-1 sudo[55298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztambtudylbzcjyqnzkdnqpbsfwoewws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253560.6457517-49-208090985921908/AnsiballZ_stat.py'
Sep 30 17:32:41 compute-1 sudo[55298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:41 compute-1 python3.9[55300]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:32:41 compute-1 sudo[55298]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:41 compute-1 sudo[55421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smukzipwvutwamcadowssuofocdjnwhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253560.6457517-49-208090985921908/AnsiballZ_copy.py'
Sep 30 17:32:41 compute-1 sudo[55421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:41 compute-1 python3.9[55423]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759253560.6457517-49-208090985921908/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:32:41 compute-1 sudo[55421]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:42 compute-1 sshd-session[54996]: Connection closed by 192.168.122.30 port 48418
Sep 30 17:32:42 compute-1 sshd-session[54993]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:32:42 compute-1 systemd-logind[789]: Session 14 logged out. Waiting for processes to exit.
Sep 30 17:32:42 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Sep 30 17:32:42 compute-1 systemd[1]: session-14.scope: Consumed 1.784s CPU time.
Sep 30 17:32:42 compute-1 systemd-logind[789]: Removed session 14.
Sep 30 17:32:48 compute-1 sshd-session[55448]: Accepted publickey for zuul from 192.168.122.30 port 58094 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:32:48 compute-1 systemd-logind[789]: New session 15 of user zuul.
Sep 30 17:32:48 compute-1 systemd[1]: Started Session 15 of User zuul.
Sep 30 17:32:48 compute-1 sshd-session[55448]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:32:49 compute-1 python3.9[55601]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:32:50 compute-1 sudo[55755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltijzcuwbtabkbytwxgantimluwptngd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253569.8998368-47-241909272022898/AnsiballZ_file.py'
Sep 30 17:32:50 compute-1 sudo[55755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:50 compute-1 python3.9[55757]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:32:50 compute-1 sudo[55755]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:51 compute-1 sudo[55930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqrjhzdbydojbpzldsbslyvhtabfftle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253570.690409-63-169080274715595/AnsiballZ_stat.py'
Sep 30 17:32:51 compute-1 sudo[55930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:51 compute-1 python3.9[55932]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:32:51 compute-1 sudo[55930]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:52 compute-1 sudo[56053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozmpdtmjubcmuiwtkgizoycwlmluctav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253570.690409-63-169080274715595/AnsiballZ_copy.py'
Sep 30 17:32:52 compute-1 sudo[56053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:52 compute-1 python3.9[56055]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759253570.690409-63-169080274715595/.source.json _original_basename=.eqo_dzd6 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:32:52 compute-1 sudo[56053]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:53 compute-1 sudo[56205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xttrntulqcxwysepovdujmnauijfcrls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253573.1567924-109-24466966245157/AnsiballZ_stat.py'
Sep 30 17:32:53 compute-1 sudo[56205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:53 compute-1 python3.9[56207]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:32:53 compute-1 sudo[56205]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:53 compute-1 sudo[56328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzhvjwaykelfcibwwfjhfjvwewqbtzxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253573.1567924-109-24466966245157/AnsiballZ_copy.py'
Sep 30 17:32:53 compute-1 sudo[56328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:54 compute-1 python3.9[56330]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759253573.1567924-109-24466966245157/.source _original_basename=.k92e4hwb follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:32:54 compute-1 sudo[56328]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:54 compute-1 sudo[56480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ninzfynojraypkhjpvfpbjyffqvowknp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253574.4601157-141-4997018481052/AnsiballZ_file.py'
Sep 30 17:32:54 compute-1 sudo[56480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:55 compute-1 python3.9[56482]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:32:55 compute-1 sudo[56480]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:55 compute-1 sudo[56632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbxeumfxjvxtzwxbngkdbtzfaumzoizt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253575.2826936-157-131508034942868/AnsiballZ_stat.py'
Sep 30 17:32:55 compute-1 sudo[56632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:55 compute-1 python3.9[56634]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:32:55 compute-1 sudo[56632]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:56 compute-1 sudo[56755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kflljrcpcvdyovuuptwzfpvwlfnvpjpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253575.2826936-157-131508034942868/AnsiballZ_copy.py'
Sep 30 17:32:56 compute-1 sudo[56755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:56 compute-1 python3.9[56757]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759253575.2826936-157-131508034942868/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:32:56 compute-1 sudo[56755]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:56 compute-1 sudo[56907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdbhgsgsleppkdiyoqavkmhswiptbwtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253576.5902812-157-118697166110367/AnsiballZ_stat.py'
Sep 30 17:32:56 compute-1 sudo[56907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:57 compute-1 python3.9[56909]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:32:57 compute-1 sudo[56907]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:57 compute-1 sudo[57030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhrdvwexeuaedusfmebnwcpsdqoutsnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253576.5902812-157-118697166110367/AnsiballZ_copy.py'
Sep 30 17:32:57 compute-1 sudo[57030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:57 compute-1 python3.9[57032]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759253576.5902812-157-118697166110367/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:32:57 compute-1 sudo[57030]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:58 compute-1 sudo[57182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtstzwiirbtomnjdjeujlisvfebqhqsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253578.0114973-215-137549812098944/AnsiballZ_file.py'
Sep 30 17:32:58 compute-1 sudo[57182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:58 compute-1 python3.9[57184]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:32:58 compute-1 sudo[57182]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:59 compute-1 sudo[57334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edhgusnmevotjhnykcgkwudnvqdlalcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253578.798591-231-187728509154406/AnsiballZ_stat.py'
Sep 30 17:32:59 compute-1 sudo[57334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:59 compute-1 python3.9[57336]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:32:59 compute-1 sudo[57334]: pam_unix(sudo:session): session closed for user root
Sep 30 17:32:59 compute-1 sudo[57457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdkdrfcrkorflmvuklaewnoachlaehtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253578.798591-231-187728509154406/AnsiballZ_copy.py'
Sep 30 17:32:59 compute-1 sudo[57457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:32:59 compute-1 python3.9[57459]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759253578.798591-231-187728509154406/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:32:59 compute-1 sudo[57457]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:00 compute-1 sudo[57609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbpyhuqbrziqnsersyfimqdiahibcfhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253580.0776358-261-42081022295926/AnsiballZ_stat.py'
Sep 30 17:33:00 compute-1 sudo[57609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:00 compute-1 python3.9[57611]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:33:00 compute-1 sudo[57609]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:00 compute-1 sudo[57732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khijelogofpoyzmxyblyxmmhzujhxwbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253580.0776358-261-42081022295926/AnsiballZ_copy.py'
Sep 30 17:33:00 compute-1 sudo[57732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:01 compute-1 python3.9[57734]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759253580.0776358-261-42081022295926/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:33:01 compute-1 sudo[57732]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:02 compute-1 sudo[57884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqkprxmrefpdqahlasjveaogwwflxwoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253581.4147618-291-114267230764679/AnsiballZ_systemd.py'
Sep 30 17:33:02 compute-1 sudo[57884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:02 compute-1 python3.9[57886]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:33:02 compute-1 systemd[1]: Reloading.
Sep 30 17:33:02 compute-1 systemd-sysv-generator[57916]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:33:02 compute-1 systemd-rc-local-generator[57913]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:33:02 compute-1 systemd[1]: Reloading.
Sep 30 17:33:02 compute-1 systemd-sysv-generator[57952]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:33:02 compute-1 systemd-rc-local-generator[57948]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:33:02 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Sep 30 17:33:02 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Sep 30 17:33:02 compute-1 sudo[57884]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:03 compute-1 sudo[58111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqixbojssuhqqwfhcvpbzomfawlekowx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253583.165517-307-88895521045490/AnsiballZ_stat.py'
Sep 30 17:33:03 compute-1 sudo[58111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:03 compute-1 python3.9[58113]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:33:03 compute-1 sudo[58111]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:03 compute-1 sudo[58234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yldghjgwmuqhmhhqiouybilyitathleo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253583.165517-307-88895521045490/AnsiballZ_copy.py'
Sep 30 17:33:03 compute-1 sudo[58234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:04 compute-1 python3.9[58236]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759253583.165517-307-88895521045490/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:33:04 compute-1 sudo[58234]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:04 compute-1 sudo[58386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svtwsbidimnqelflzxbovsataoccnmpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253584.381191-337-64622191534683/AnsiballZ_stat.py'
Sep 30 17:33:04 compute-1 sudo[58386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:04 compute-1 python3.9[58388]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:33:04 compute-1 sudo[58386]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:05 compute-1 sudo[58509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wydltxrqcowgbyesqjjjkwdjemqrxtji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253584.381191-337-64622191534683/AnsiballZ_copy.py'
Sep 30 17:33:05 compute-1 sudo[58509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:05 compute-1 python3.9[58511]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759253584.381191-337-64622191534683/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:33:05 compute-1 sudo[58509]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:06 compute-1 sudo[58661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbvqpwmjdcsbejvopfifieyoypqxgtrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253585.745588-367-84828765051457/AnsiballZ_systemd.py'
Sep 30 17:33:06 compute-1 sudo[58661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:06 compute-1 python3.9[58663]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:33:06 compute-1 systemd[1]: Reloading.
Sep 30 17:33:06 compute-1 systemd-rc-local-generator[58692]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:33:06 compute-1 systemd-sysv-generator[58697]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:33:06 compute-1 systemd[1]: Reloading.
Sep 30 17:33:06 compute-1 systemd-rc-local-generator[58729]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:33:06 compute-1 systemd-sysv-generator[58732]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:33:06 compute-1 systemd[1]: Starting Create netns directory...
Sep 30 17:33:06 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 17:33:06 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 17:33:06 compute-1 systemd[1]: Finished Create netns directory.
Sep 30 17:33:07 compute-1 sudo[58661]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:07 compute-1 python3.9[58889]: ansible-ansible.builtin.service_facts Invoked
Sep 30 17:33:07 compute-1 network[58906]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 17:33:07 compute-1 network[58907]: 'network-scripts' will be removed from distribution in near future.
Sep 30 17:33:07 compute-1 network[58908]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 17:33:11 compute-1 sudo[59170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyyyopxmsissnprfamwbuphhtijzwvtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253590.9644704-399-97000132364418/AnsiballZ_systemd.py'
Sep 30 17:33:11 compute-1 sudo[59170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:11 compute-1 python3.9[59172]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:33:11 compute-1 systemd[1]: Reloading.
Sep 30 17:33:11 compute-1 systemd-rc-local-generator[59202]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:33:11 compute-1 systemd-sysv-generator[59205]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:33:12 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Sep 30 17:33:12 compute-1 iptables.init[59212]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Sep 30 17:33:12 compute-1 iptables.init[59212]: iptables: Flushing firewall rules: [  OK  ]
Sep 30 17:33:12 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Sep 30 17:33:12 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Sep 30 17:33:12 compute-1 sudo[59170]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:12 compute-1 sudo[59406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-metulzubotsmzkdqhmdynuvynhljiwhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253592.5409136-399-29201327922344/AnsiballZ_systemd.py'
Sep 30 17:33:12 compute-1 sudo[59406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:13 compute-1 python3.9[59408]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:33:13 compute-1 sudo[59406]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:13 compute-1 sudo[59560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poropniumzxynfufgwvjpwoeiyyxcsuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253593.5704708-431-31935279727257/AnsiballZ_systemd.py'
Sep 30 17:33:13 compute-1 sudo[59560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:14 compute-1 python3.9[59562]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:33:14 compute-1 systemd[1]: Reloading.
Sep 30 17:33:14 compute-1 systemd-rc-local-generator[59593]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:33:14 compute-1 systemd-sysv-generator[59596]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:33:14 compute-1 systemd[1]: Starting Netfilter Tables...
Sep 30 17:33:14 compute-1 systemd[1]: Finished Netfilter Tables.
Sep 30 17:33:14 compute-1 sudo[59560]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:15 compute-1 sudo[59753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gudzlnmblzpmvefqnzdihincblpcmmog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253594.8808322-447-9153050959804/AnsiballZ_command.py'
Sep 30 17:33:15 compute-1 sudo[59753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:15 compute-1 python3.9[59755]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:33:15 compute-1 sudo[59753]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:16 compute-1 sudo[59906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntwvvxkzbqluhhwknrjtgunsidljqkrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253595.9937835-475-197586374187333/AnsiballZ_stat.py'
Sep 30 17:33:16 compute-1 sudo[59906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:16 compute-1 python3.9[59908]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:33:16 compute-1 sudo[59906]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:16 compute-1 sudo[60031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfsjlfpmhutbuldgqcynkwayigrsddfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253595.9937835-475-197586374187333/AnsiballZ_copy.py'
Sep 30 17:33:16 compute-1 sudo[60031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:17 compute-1 python3.9[60033]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759253595.9937835-475-197586374187333/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:33:17 compute-1 sudo[60031]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:17 compute-1 python3.9[60184]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:33:17 compute-1 polkitd[6874]: Registered Authentication Agent for unix-process:60186:1143953 (system bus name :1.530 [/usr/bin/pkttyagent --notify-fd 5 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Sep 30 17:33:42 compute-1 polkitd[6874]: Unregistered Authentication Agent for unix-process:60186:1143953 (system bus name :1.530, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Sep 30 17:33:42 compute-1 polkit-agent-helper-1[60198]: pam_unix(polkit-1:auth): conversation failed
Sep 30 17:33:42 compute-1 polkit-agent-helper-1[60198]: pam_unix(polkit-1:auth): auth could not identify password for [root]
Sep 30 17:33:42 compute-1 polkitd[6874]: Operator of unix-process:60186:1143953 FAILED to authenticate to gain authorization for action org.freedesktop.systemd1.manage-units for system-bus-name::1.529 [<unknown>] (owned by unix-user:zuul)
Sep 30 17:33:43 compute-1 sshd-session[55451]: Connection closed by 192.168.122.30 port 58094
Sep 30 17:33:43 compute-1 sshd-session[55448]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:33:43 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Sep 30 17:33:43 compute-1 systemd[1]: session-15.scope: Consumed 20.641s CPU time.
Sep 30 17:33:43 compute-1 systemd-logind[789]: Session 15 logged out. Waiting for processes to exit.
Sep 30 17:33:43 compute-1 systemd-logind[789]: Removed session 15.
Sep 30 17:33:56 compute-1 sshd-session[60224]: Accepted publickey for zuul from 192.168.122.30 port 49050 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:33:56 compute-1 systemd-logind[789]: New session 16 of user zuul.
Sep 30 17:33:56 compute-1 systemd[1]: Started Session 16 of User zuul.
Sep 30 17:33:56 compute-1 sshd-session[60224]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:33:57 compute-1 python3.9[60377]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:33:58 compute-1 sudo[60531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onylxuxrhplmpbwddfysjeocubjcvdhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253637.8527186-47-260548515726966/AnsiballZ_file.py'
Sep 30 17:33:58 compute-1 sudo[60531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:58 compute-1 python3.9[60533]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:33:58 compute-1 sudo[60531]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:59 compute-1 sudo[60706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okfsobvjzimnjngppihaaawaprvezeex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253638.684376-63-66989665136446/AnsiballZ_stat.py'
Sep 30 17:33:59 compute-1 sudo[60706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:59 compute-1 python3.9[60708]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:33:59 compute-1 sudo[60706]: pam_unix(sudo:session): session closed for user root
Sep 30 17:33:59 compute-1 sudo[60784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ervuxvhrspvzpvyubrorqaicixuyuxtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253638.684376-63-66989665136446/AnsiballZ_file.py'
Sep 30 17:33:59 compute-1 sudo[60784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:33:59 compute-1 python3.9[60786]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.tgiigxes recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:33:59 compute-1 sudo[60784]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:00 compute-1 sudo[60936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grbwoyfuirwummoxibnoanvjalcagzsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253640.2745774-103-196188898037455/AnsiballZ_stat.py'
Sep 30 17:34:00 compute-1 sudo[60936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:00 compute-1 python3.9[60938]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:00 compute-1 sudo[60936]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:01 compute-1 sudo[61014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cukdwguhzjipfvckdqgkdchvzukvrpbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253640.2745774-103-196188898037455/AnsiballZ_file.py'
Sep 30 17:34:01 compute-1 sudo[61014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:01 compute-1 python3.9[61016]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.8v_vz29v recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:01 compute-1 sudo[61014]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:01 compute-1 sudo[61166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eargwlcjvlrkvcxbjjucjliubcngodxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253641.5342932-129-233448783703439/AnsiballZ_file.py'
Sep 30 17:34:01 compute-1 sudo[61166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:02 compute-1 python3.9[61168]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:34:02 compute-1 sudo[61166]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:02 compute-1 sudo[61318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqhyyvslxbweftawqviylehjlgukubot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253642.2645504-145-32755314781479/AnsiballZ_stat.py'
Sep 30 17:34:02 compute-1 sudo[61318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:02 compute-1 python3.9[61320]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:02 compute-1 sudo[61318]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:03 compute-1 sudo[61396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndaxvuldzwpqjqvuzmejopakhfyzaoux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253642.2645504-145-32755314781479/AnsiballZ_file.py'
Sep 30 17:34:03 compute-1 sudo[61396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:03 compute-1 python3.9[61398]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:34:03 compute-1 sudo[61396]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:03 compute-1 sudo[61548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onhlrvyljkqinkoagyujsjxnnwubfkmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253643.4349005-145-76931565220392/AnsiballZ_stat.py'
Sep 30 17:34:03 compute-1 sudo[61548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:03 compute-1 python3.9[61550]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:04 compute-1 sudo[61548]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:04 compute-1 sudo[61626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-titzgosxfksvvpwnfbsdjlhffflwfncl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253643.4349005-145-76931565220392/AnsiballZ_file.py'
Sep 30 17:34:04 compute-1 sudo[61626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:04 compute-1 python3.9[61628]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:34:04 compute-1 sudo[61626]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:04 compute-1 sudo[61778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iirxeyjzsgldddrhfbaybegwwlpdgynj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253644.626583-191-107397679075666/AnsiballZ_file.py'
Sep 30 17:34:04 compute-1 sudo[61778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:05 compute-1 python3.9[61780]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:05 compute-1 sudo[61778]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:05 compute-1 sudo[61930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cthdefljntuqmwkjuiemhosgkpszzoye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253645.3555555-207-247696597594146/AnsiballZ_stat.py'
Sep 30 17:34:05 compute-1 sudo[61930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:05 compute-1 python3.9[61932]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:05 compute-1 sudo[61930]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:06 compute-1 sudo[62008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csuyylekdbjmbasyoqihqsgmzzjyeter ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253645.3555555-207-247696597594146/AnsiballZ_file.py'
Sep 30 17:34:06 compute-1 sudo[62008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:06 compute-1 python3.9[62010]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:06 compute-1 sudo[62008]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:07 compute-1 sudo[62160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwcuvpyjqdusccbskmuzrxmeovzepfqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253646.7096338-231-28626385955544/AnsiballZ_stat.py'
Sep 30 17:34:07 compute-1 sudo[62160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:07 compute-1 python3.9[62162]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:07 compute-1 sudo[62160]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:07 compute-1 sudo[62238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynrwttfznuqgjpffxbjxrselforjypdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253646.7096338-231-28626385955544/AnsiballZ_file.py'
Sep 30 17:34:07 compute-1 sudo[62238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:07 compute-1 python3.9[62240]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:07 compute-1 sudo[62238]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:08 compute-1 sudo[62390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfobrqkwddzgrhqqdjbccijadbzklzwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253648.0987833-255-191076708970209/AnsiballZ_systemd.py'
Sep 30 17:34:08 compute-1 sudo[62390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:09 compute-1 python3.9[62392]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:34:09 compute-1 systemd[1]: Reloading.
Sep 30 17:34:09 compute-1 systemd-rc-local-generator[62421]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:34:09 compute-1 systemd-sysv-generator[62425]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:34:09 compute-1 sudo[62390]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:09 compute-1 sudo[62580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caoemslzqwlugecalxzrzedxcwtwaios ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253649.5529332-271-105397954789949/AnsiballZ_stat.py'
Sep 30 17:34:09 compute-1 sudo[62580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:10 compute-1 python3.9[62582]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:10 compute-1 sudo[62580]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:10 compute-1 sudo[62658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpqonnbgkgtjqkurqhyrmltbhltfkvpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253649.5529332-271-105397954789949/AnsiballZ_file.py'
Sep 30 17:34:10 compute-1 sudo[62658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:10 compute-1 python3.9[62660]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:10 compute-1 sudo[62658]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:11 compute-1 sudo[62810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddvsfzzyugxwrywyytydprzsuojakcef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253650.8364506-295-6451762628066/AnsiballZ_stat.py'
Sep 30 17:34:11 compute-1 sudo[62810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:11 compute-1 python3.9[62812]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:11 compute-1 sudo[62810]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:11 compute-1 sudo[62888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wohncfyedixkknxxfvjfbnullyrclghp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253650.8364506-295-6451762628066/AnsiballZ_file.py'
Sep 30 17:34:11 compute-1 sudo[62888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:11 compute-1 python3.9[62890]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:11 compute-1 sudo[62888]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:12 compute-1 sudo[63040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnekikktpgctoywmqvoskwiwsizhxkkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253652.1341474-319-148926228534270/AnsiballZ_systemd.py'
Sep 30 17:34:12 compute-1 sudo[63040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:12 compute-1 python3.9[63042]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:34:12 compute-1 systemd[1]: Reloading.
Sep 30 17:34:12 compute-1 systemd-rc-local-generator[63073]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:34:12 compute-1 systemd-sysv-generator[63076]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:34:13 compute-1 systemd[1]: Starting Create netns directory...
Sep 30 17:34:13 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 17:34:13 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 17:34:13 compute-1 systemd[1]: Finished Create netns directory.
Sep 30 17:34:13 compute-1 sudo[63040]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:13 compute-1 python3.9[63237]: ansible-ansible.builtin.service_facts Invoked
Sep 30 17:34:14 compute-1 network[63254]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 17:34:14 compute-1 network[63255]: 'network-scripts' will be removed from distribution in near future.
Sep 30 17:34:14 compute-1 network[63256]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 17:34:19 compute-1 sudo[63517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syqsbjmewrpeuhhlfbyudtopsrwtffhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253659.160448-371-1015570317850/AnsiballZ_stat.py'
Sep 30 17:34:19 compute-1 sudo[63517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:19 compute-1 python3.9[63519]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:19 compute-1 sudo[63517]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:20 compute-1 sudo[63595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbhhdapwwokjsqkvejtcpiwvmpjkrhrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253659.160448-371-1015570317850/AnsiballZ_file.py'
Sep 30 17:34:20 compute-1 sudo[63595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:20 compute-1 python3.9[63597]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:20 compute-1 sudo[63595]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:21 compute-1 sudo[63747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjkprdgoxwgcnmhecpgtpnvyngbrjrny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253660.6601434-398-102169743116628/AnsiballZ_file.py'
Sep 30 17:34:21 compute-1 sudo[63747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:21 compute-1 python3.9[63749]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:21 compute-1 sudo[63747]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:21 compute-1 sudo[63899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kidyaquzkclkabhmlqrzlbucpesesnzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253661.4901044-413-136781568498009/AnsiballZ_stat.py'
Sep 30 17:34:21 compute-1 sudo[63899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:22 compute-1 python3.9[63901]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:22 compute-1 sudo[63899]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:22 compute-1 sudo[64022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpsjwhcqfbwsfqrbqzquwofqylchastu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253661.4901044-413-136781568498009/AnsiballZ_copy.py'
Sep 30 17:34:22 compute-1 sudo[64022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:22 compute-1 python3.9[64024]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759253661.4901044-413-136781568498009/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:22 compute-1 sudo[64022]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:23 compute-1 sudo[64174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkdwzacohdhmwltzawuurhihlkudhbkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253663.0684373-449-220144666882477/AnsiballZ_timezone.py'
Sep 30 17:34:23 compute-1 sudo[64174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:23 compute-1 python3.9[64176]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Sep 30 17:34:23 compute-1 systemd[1]: Starting Time & Date Service...
Sep 30 17:34:23 compute-1 systemd[1]: Started Time & Date Service.
Sep 30 17:34:23 compute-1 sudo[64174]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:24 compute-1 sudo[64330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlsqmbhvpnkqjfwudrzdfictcybnqctj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253664.2151673-467-139827157615346/AnsiballZ_file.py'
Sep 30 17:34:24 compute-1 sudo[64330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:24 compute-1 python3.9[64332]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:24 compute-1 sudo[64330]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:25 compute-1 sudo[64482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjqeuuxxvtmfpxwgtfmcbbepeklrceny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253664.940967-483-259748546716398/AnsiballZ_stat.py'
Sep 30 17:34:25 compute-1 sudo[64482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:25 compute-1 python3.9[64484]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:25 compute-1 sudo[64482]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:25 compute-1 sudo[64605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zovzgdcxxpylexrqrtdsmfefmomzupmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253664.940967-483-259748546716398/AnsiballZ_copy.py'
Sep 30 17:34:25 compute-1 sudo[64605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:26 compute-1 python3.9[64607]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759253664.940967-483-259748546716398/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:26 compute-1 sudo[64605]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:26 compute-1 sudo[64757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwsechusrvfkeqxuerbuclszxkmfbfsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253666.314608-513-50831891581403/AnsiballZ_stat.py'
Sep 30 17:34:26 compute-1 sudo[64757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:26 compute-1 python3.9[64759]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:26 compute-1 sudo[64757]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:27 compute-1 sudo[64880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soidfthbupfpkzuyuaaaqzeighmeffjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253666.314608-513-50831891581403/AnsiballZ_copy.py'
Sep 30 17:34:27 compute-1 sudo[64880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:27 compute-1 python3.9[64882]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759253666.314608-513-50831891581403/.source.yaml _original_basename=.eoxcx_w1 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:27 compute-1 sudo[64880]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:28 compute-1 sudo[65032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxqvumtpbkwchcyomepovafxqjcyvivf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253667.8464673-543-97827856440717/AnsiballZ_stat.py'
Sep 30 17:34:28 compute-1 sudo[65032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:28 compute-1 python3.9[65034]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:28 compute-1 sudo[65032]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:28 compute-1 sudo[65155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvfgrpjcmlcmmcjlhjjpxhtdbdjpjdvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253667.8464673-543-97827856440717/AnsiballZ_copy.py'
Sep 30 17:34:28 compute-1 sudo[65155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:29 compute-1 python3.9[65157]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759253667.8464673-543-97827856440717/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:29 compute-1 sudo[65155]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:29 compute-1 sudo[65307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpigxnmbacspzfensfjzqehjjgcncpnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253669.2692885-573-171939409484183/AnsiballZ_command.py'
Sep 30 17:34:29 compute-1 sudo[65307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:29 compute-1 python3.9[65309]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:34:29 compute-1 sudo[65307]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:30 compute-1 sudo[65460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pauciavthbjqagnkshyomckvqlvwjcxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253670.1442554-589-17244228764987/AnsiballZ_command.py'
Sep 30 17:34:30 compute-1 sudo[65460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:30 compute-1 python3.9[65462]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:34:30 compute-1 sudo[65460]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:31 compute-1 sudo[65613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yftfrjmxtqszomvylkuidgkymommrwml ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759253670.8787394-605-185479894453277/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 17:34:31 compute-1 sudo[65613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:31 compute-1 python3[65615]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 17:34:31 compute-1 sudo[65613]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:32 compute-1 sudo[65765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwhjbtyxopjobjccuwjckweqwhcsbmsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253671.8285093-621-40854264710868/AnsiballZ_stat.py'
Sep 30 17:34:32 compute-1 sudo[65765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:32 compute-1 python3.9[65767]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:32 compute-1 sudo[65765]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:32 compute-1 sudo[65888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puorlyynwnduahbiqhilpyqaqflzvykc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253671.8285093-621-40854264710868/AnsiballZ_copy.py'
Sep 30 17:34:32 compute-1 sudo[65888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:33 compute-1 python3.9[65890]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759253671.8285093-621-40854264710868/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:33 compute-1 sudo[65888]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:33 compute-1 sudo[66040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiwskcsbqcuymoayvaumudknwwtdhzzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253673.3120158-653-146051050277931/AnsiballZ_stat.py'
Sep 30 17:34:33 compute-1 sudo[66040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:33 compute-1 python3.9[66042]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:33 compute-1 sudo[66040]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:34 compute-1 sudo[66163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghhqlclkvoawrrthkhyfwptwdqknjoup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253673.3120158-653-146051050277931/AnsiballZ_copy.py'
Sep 30 17:34:34 compute-1 sudo[66163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:34 compute-1 python3.9[66165]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759253673.3120158-653-146051050277931/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:34 compute-1 sudo[66163]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:34 compute-1 sudo[66315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayyazmhwafaalpyltlfcwptqwqvtmldw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253674.554406-681-130099402974921/AnsiballZ_stat.py'
Sep 30 17:34:34 compute-1 sudo[66315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:35 compute-1 python3.9[66317]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:35 compute-1 sudo[66315]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:35 compute-1 sudo[66438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvmudghocailfdnlucqdacezqapffipg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253674.554406-681-130099402974921/AnsiballZ_copy.py'
Sep 30 17:34:35 compute-1 sudo[66438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:35 compute-1 python3.9[66440]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759253674.554406-681-130099402974921/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:35 compute-1 sudo[66438]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:36 compute-1 sudo[66590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmpiqppccubtkdlyejbszvoojigpzpxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253676.021261-711-265095369636427/AnsiballZ_stat.py'
Sep 30 17:34:36 compute-1 sudo[66590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:36 compute-1 python3.9[66592]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:36 compute-1 sudo[66590]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:37 compute-1 sudo[66713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aysxgidmvcudcrmljznemorzpjuczbyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253676.021261-711-265095369636427/AnsiballZ_copy.py'
Sep 30 17:34:37 compute-1 sudo[66713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:37 compute-1 python3.9[66715]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759253676.021261-711-265095369636427/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:37 compute-1 sudo[66713]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:37 compute-1 sudo[66865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-favsapkyxsncadyvnsnvuqdykpixnqbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253677.4446795-741-165240264325913/AnsiballZ_stat.py'
Sep 30 17:34:37 compute-1 sudo[66865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:38 compute-1 python3.9[66867]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:34:38 compute-1 sudo[66865]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:38 compute-1 sudo[66988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eixmaclvvlsvilzdrrjmgcclhqlfcwys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253677.4446795-741-165240264325913/AnsiballZ_copy.py'
Sep 30 17:34:38 compute-1 sudo[66988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:38 compute-1 python3.9[66990]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759253677.4446795-741-165240264325913/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:38 compute-1 sudo[66988]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:39 compute-1 sudo[67140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjbjhczhewefoxurzxoqiwijgotlswip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253678.8218448-771-109260985450458/AnsiballZ_file.py'
Sep 30 17:34:39 compute-1 sudo[67140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:39 compute-1 python3.9[67142]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:39 compute-1 sudo[67140]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:39 compute-1 sudo[67292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmsulxmriikzmbxsjrwoikrhyxulwwvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253679.4828248-787-147419895634160/AnsiballZ_command.py'
Sep 30 17:34:39 compute-1 sudo[67292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:39 compute-1 python3.9[67294]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:34:40 compute-1 sudo[67292]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:40 compute-1 sudo[67451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnudlqoeicdxhlceyiogjionhajmhiyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253680.281865-804-156037104591674/AnsiballZ_blockinfile.py'
Sep 30 17:34:40 compute-1 sudo[67451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:40 compute-1 python3.9[67453]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:40 compute-1 sudo[67451]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:41 compute-1 sudo[67604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eefyiqgcmibavrfmwhagacqbvgvywrls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253681.248762-821-239236040044494/AnsiballZ_file.py'
Sep 30 17:34:41 compute-1 sudo[67604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:41 compute-1 python3.9[67606]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:41 compute-1 sudo[67604]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:42 compute-1 sudo[67756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hturtbtoemwmbzpihtbqpxzgdhswjrfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253681.9447334-821-141654357575356/AnsiballZ_file.py'
Sep 30 17:34:42 compute-1 sudo[67756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:42 compute-1 python3.9[67758]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:42 compute-1 sudo[67756]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:42 compute-1 chronyd[54967]: Selected source 162.159.200.1 (pool.ntp.org)
Sep 30 17:34:43 compute-1 sudo[67908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjtrsurucldqnqxkdxylwpapzajjptku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253682.682595-851-89164567716634/AnsiballZ_mount.py'
Sep 30 17:34:43 compute-1 sudo[67908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:43 compute-1 python3.9[67910]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Sep 30 17:34:43 compute-1 sudo[67908]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:43 compute-1 sudo[68061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mynsgurauctnxikjffrgsugkchmufzpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253683.5788982-851-36919548250853/AnsiballZ_mount.py'
Sep 30 17:34:43 compute-1 sudo[68061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:44 compute-1 python3.9[68063]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Sep 30 17:34:44 compute-1 sudo[68061]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:44 compute-1 sshd-session[60227]: Connection closed by 192.168.122.30 port 49050
Sep 30 17:34:44 compute-1 sshd-session[60224]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:34:44 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Sep 30 17:34:44 compute-1 systemd[1]: session-16.scope: Consumed 37.298s CPU time.
Sep 30 17:34:44 compute-1 systemd-logind[789]: Session 16 logged out. Waiting for processes to exit.
Sep 30 17:34:44 compute-1 systemd-logind[789]: Removed session 16.
Sep 30 17:34:50 compute-1 sshd-session[68089]: Accepted publickey for zuul from 192.168.122.30 port 42848 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:34:50 compute-1 systemd-logind[789]: New session 17 of user zuul.
Sep 30 17:34:50 compute-1 systemd[1]: Started Session 17 of User zuul.
Sep 30 17:34:50 compute-1 sshd-session[68089]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:34:50 compute-1 sudo[68242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rusoicvmiwzmempxnbrihontwxwuyxus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253690.1709583-18-205518712582437/AnsiballZ_tempfile.py'
Sep 30 17:34:50 compute-1 sudo[68242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:50 compute-1 python3.9[68244]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Sep 30 17:34:50 compute-1 sudo[68242]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:51 compute-1 sudo[68394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmdwgaeazqkmmnucozeuumnliqrzvyby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253690.9985468-42-108591801962769/AnsiballZ_stat.py'
Sep 30 17:34:51 compute-1 sudo[68394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:51 compute-1 python3.9[68396]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:34:51 compute-1 sudo[68394]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:52 compute-1 sudo[68546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcsrouafznmyiwksqyqkedhlhhbkicjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253692.003351-62-99407049831417/AnsiballZ_setup.py'
Sep 30 17:34:52 compute-1 sudo[68546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:52 compute-1 python3.9[68548]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:34:52 compute-1 sudo[68546]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:53 compute-1 sudo[68698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjryeymlmlbriioyhlvzomkggzccuxqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253693.0668519-79-131723589911651/AnsiballZ_blockinfile.py'
Sep 30 17:34:53 compute-1 sudo[68698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:53 compute-1 python3.9[68700]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCc3938ID3mnGjsgZen6kQCNM5mkVWqANJocuRy3sXOMN63dtyjhBgL73dvNVc4/MHyzxPDQuzK8tshXHOcqwYvNyllWa9UhEuAdhcNXcRKSELVxmBLRWZx/tsxp7Ws5/jqm87BYWYBOH23DCI96hjzPNZvDj8g24u1gnFFIDlGQELa7bj3YLXw2mWWadQeLxX35z9zMP39YZLf/2F8zAFy27zfi5U4Ni1I6YXvTL+DNwg7Ulluud4fY+sf3ds4pU5htK63pEPYw1f4eI/82wYgnmmEjUqBXGUraTbHG7EoY0kg8bnebUO02l1uSbV+YM/5LNKomXhUy/kUhb9l1uqNuqXvimRH4xVgJ9Mn0cJ2WGhlnkU1gqx0p1FNE01EWx7Spbz4uwVESHAmr67aymcw0Da5R+P9sI5lMqVNJHUeQiAq9bA3X9EbU9oIBIzoZCm7x5N8UpcvzrK0tNMaVLymDnsI8Rkc1MJpuTboQqnsrWs1q2SxaKY2vfqidEBk+Xs=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILy+MaglT2Cqq/Z1fTckQQdU2y2eh3D3Okv7pfMd4ZvV
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKG527neZJvIIF0UdmoBKFMIwvlh64Ua1Pir0KM8tM5Fy8tZbjiOY/Dz3agm+i5OWkd7fXEaYOfPR/rFSi9+L8s=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/5i5Z2213BrqRzlkXUPKi2V6ft/sZowq7ddp+Dq/QqjnkediXByZsJmLLkuA+smrhUZwo5vyubq2HeUmZ3U1EenWRQdC11cJP4ll9+UV0iP2vlc4cMh+DV62ujsM8T15I5/7JnPBcvrGrTJTmpQSQoCm2yD2q/v4Kx2V27sLj8ZlX64zDSBOYy+KhjhBuUM4gEbyrRzO2PqNsMeDrDGr3QFiyGNe8qS2KHmuEa4QFJnumNPJrxYBdTjcsKMZOeuVw2a33JPia0kDgKtaNDV7Izq8h9DlidYk1/aPo6MhfwzYDkRUaKSVhqM1oEDQWc40AK7EX4S00KLr5Nix8bd2nqEZsbD5lk/6wKNR1xdutyZt0GcnOEVJB7+VWN6Y3COvwe9Q1GSKCAhMthkn0Vd9ZvrwiFVKpMUyWD1b74vjHcDu8UOcJlVoqol0jJYEqDCy6mRh0l4Q2PfmyFpVMJ1ib1hV4dPIfzJIkuON6jMedqsKPGZnio8U1E/EMWBlaVn8=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICeWw7E9xsgcxKn1cBOcDfvvFIX4M5Blc+gMQNI96O43
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG/CY8xOSIOy2V9qTWOkLlPGEg36qW1s4MO9P37ZVKfdA8ded8m++iIKGFCGxQiTUk0W+13bPq0LIPsJgw+4osM=
                                             create=True mode=0644 path=/tmp/ansible.zx7asoyg state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:53 compute-1 sshd-session[68701]: Connection closed by authenticating user root 167.71.248.239 port 47706 [preauth]
Sep 30 17:34:53 compute-1 sudo[68698]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:53 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 17:34:54 compute-1 sudo[68855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfbpltkystbocbgenctljfapnnghpkyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253693.8545084-95-54371657216657/AnsiballZ_command.py'
Sep 30 17:34:54 compute-1 sudo[68855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:54 compute-1 python3.9[68857]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.zx7asoyg' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:34:54 compute-1 sudo[68855]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:55 compute-1 sudo[69009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhmtyassdqzekdhuqlfjnubbjluvbvzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253694.620898-111-27352599063844/AnsiballZ_file.py'
Sep 30 17:34:55 compute-1 sudo[69009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:34:55 compute-1 python3.9[69011]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.zx7asoyg state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:34:55 compute-1 sudo[69009]: pam_unix(sudo:session): session closed for user root
Sep 30 17:34:55 compute-1 sshd-session[68092]: Connection closed by 192.168.122.30 port 42848
Sep 30 17:34:55 compute-1 sshd-session[68089]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:34:55 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Sep 30 17:34:55 compute-1 systemd[1]: session-17.scope: Consumed 3.449s CPU time.
Sep 30 17:34:55 compute-1 systemd-logind[789]: Session 17 logged out. Waiting for processes to exit.
Sep 30 17:34:55 compute-1 systemd-logind[789]: Removed session 17.
Sep 30 17:35:01 compute-1 sshd-session[69036]: Accepted publickey for zuul from 192.168.122.30 port 58072 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:35:01 compute-1 systemd-logind[789]: New session 18 of user zuul.
Sep 30 17:35:01 compute-1 systemd[1]: Started Session 18 of User zuul.
Sep 30 17:35:01 compute-1 sshd-session[69036]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:35:02 compute-1 python3.9[69189]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:35:03 compute-1 sudo[69343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fksbwxnhfcmsgmzzpdghpzjwbmcgqibt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253703.043771-45-23708815106340/AnsiballZ_systemd.py'
Sep 30 17:35:03 compute-1 sudo[69343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:03 compute-1 python3.9[69345]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Sep 30 17:35:03 compute-1 sudo[69343]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:04 compute-1 sudo[69497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agheyxwukovcpdztulwcjrhwtrpplqtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253704.1630886-61-239020116596334/AnsiballZ_systemd.py'
Sep 30 17:35:04 compute-1 sudo[69497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:04 compute-1 python3.9[69499]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:35:04 compute-1 sudo[69497]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:05 compute-1 sudo[69650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugadegtlvyglseqbfcdsoywspvtcmazj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253705.0077271-79-135484875166320/AnsiballZ_command.py'
Sep 30 17:35:05 compute-1 sudo[69650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:05 compute-1 python3.9[69652]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:35:05 compute-1 sudo[69650]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:06 compute-1 sudo[69803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crhhszutmfeiqdcumohmameovzmsvlkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253705.883464-95-131063355083518/AnsiballZ_stat.py'
Sep 30 17:35:06 compute-1 sudo[69803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:06 compute-1 python3.9[69805]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:35:06 compute-1 sudo[69803]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:06 compute-1 sudo[69957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gztvywbinqzdupyihjgjljlveipmxrvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253706.6687706-111-13756795649094/AnsiballZ_command.py'
Sep 30 17:35:06 compute-1 sudo[69957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:07 compute-1 python3.9[69959]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:35:07 compute-1 sudo[69957]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:07 compute-1 sudo[70112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-firhusxmesxrkqdqovxmyvekkxqdwrns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253707.3739088-127-158964662110807/AnsiballZ_file.py'
Sep 30 17:35:07 compute-1 sudo[70112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:08 compute-1 python3.9[70114]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:35:08 compute-1 sudo[70112]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:08 compute-1 sshd-session[69039]: Connection closed by 192.168.122.30 port 58072
Sep 30 17:35:08 compute-1 sshd-session[69036]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:35:08 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Sep 30 17:35:08 compute-1 systemd[1]: session-18.scope: Consumed 4.901s CPU time.
Sep 30 17:35:08 compute-1 systemd-logind[789]: Session 18 logged out. Waiting for processes to exit.
Sep 30 17:35:08 compute-1 systemd-logind[789]: Removed session 18.
Sep 30 17:35:13 compute-1 sshd-session[70140]: Accepted publickey for zuul from 192.168.122.30 port 50254 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:35:13 compute-1 systemd-logind[789]: New session 19 of user zuul.
Sep 30 17:35:13 compute-1 systemd[1]: Started Session 19 of User zuul.
Sep 30 17:35:13 compute-1 sshd-session[70140]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:35:14 compute-1 python3.9[70293]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:35:15 compute-1 sudo[70447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqjopltonouwcaocccvnrmtcprhwlgyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253715.4825165-49-211065959313238/AnsiballZ_setup.py'
Sep 30 17:35:15 compute-1 sudo[70447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:16 compute-1 python3.9[70449]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:35:16 compute-1 sudo[70447]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:16 compute-1 sudo[70531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amlrzpcffhicybdhyraoeknkfvgreyrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759253715.4825165-49-211065959313238/AnsiballZ_dnf.py'
Sep 30 17:35:16 compute-1 sudo[70531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:17 compute-1 python3.9[70533]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 17:35:18 compute-1 sudo[70531]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:19 compute-1 python3.9[70684]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:35:20 compute-1 python3.9[70835]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 17:35:21 compute-1 python3.9[70985]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:35:21 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 17:35:22 compute-1 python3.9[71136]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:35:22 compute-1 sshd-session[70143]: Connection closed by 192.168.122.30 port 50254
Sep 30 17:35:22 compute-1 sshd-session[70140]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:35:22 compute-1 systemd-logind[789]: Session 19 logged out. Waiting for processes to exit.
Sep 30 17:35:22 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Sep 30 17:35:22 compute-1 systemd[1]: session-19.scope: Consumed 6.462s CPU time.
Sep 30 17:35:22 compute-1 systemd-logind[789]: Removed session 19.
Sep 30 17:35:30 compute-1 sshd-session[71161]: Accepted publickey for zuul from 38.102.83.36 port 59220 ssh2: RSA SHA256:DFNImqpR4L6Frzap1o3GpslEX6xER8N06/GWUjaeSng
Sep 30 17:35:30 compute-1 systemd-logind[789]: New session 20 of user zuul.
Sep 30 17:35:30 compute-1 systemd[1]: Started Session 20 of User zuul.
Sep 30 17:35:30 compute-1 sshd-session[71161]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:35:30 compute-1 sudo[71237]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lajjoznsfjkrqzyytipgozziafzpngea ; /usr/bin/python3'
Sep 30 17:35:30 compute-1 sudo[71237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:30 compute-1 useradd[71241]: new group: name=ceph-admin, GID=42478
Sep 30 17:35:30 compute-1 useradd[71241]: new user: name=ceph-admin, UID=42477, GID=42478, home=/home/ceph-admin, shell=/bin/bash, from=none
Sep 30 17:35:30 compute-1 sudo[71237]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:31 compute-1 sudo[71323]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltxfanlhihrnpvdrjuiiyfigqywumumw ; /usr/bin/python3'
Sep 30 17:35:31 compute-1 sudo[71323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:31 compute-1 sudo[71323]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:31 compute-1 sudo[71396]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qahrxmbrlgqcohxtkstjqskcgzbkpghp ; /usr/bin/python3'
Sep 30 17:35:31 compute-1 sudo[71396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:31 compute-1 sudo[71396]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:32 compute-1 sudo[71446]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlmecqtkzsebanuocoeirwoxifftryrv ; /usr/bin/python3'
Sep 30 17:35:32 compute-1 sudo[71446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:32 compute-1 sudo[71446]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:32 compute-1 sudo[71472]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhsayooixwsjrwduphhtmdjofodjlilv ; /usr/bin/python3'
Sep 30 17:35:32 compute-1 sudo[71472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:32 compute-1 sudo[71472]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:32 compute-1 sudo[71498]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikqxgjrqzkvcktdaodltboqmxihcdtbw ; /usr/bin/python3'
Sep 30 17:35:32 compute-1 sudo[71498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:32 compute-1 sudo[71498]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:33 compute-1 sudo[71524]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzbltpnkakoqzmaphtarcpqyququlvzu ; /usr/bin/python3'
Sep 30 17:35:33 compute-1 sudo[71524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:33 compute-1 sudo[71524]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:33 compute-1 sudo[71602]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aetrtafeghcbvdsjrqkuuofaintkzryg ; /usr/bin/python3'
Sep 30 17:35:33 compute-1 sudo[71602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:33 compute-1 sudo[71602]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:34 compute-1 sudo[71675]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkuyusrxleeudwzqooffwgjkoynegnpu ; /usr/bin/python3'
Sep 30 17:35:34 compute-1 sudo[71675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:34 compute-1 sudo[71675]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:34 compute-1 sudo[71777]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jozkeqyadqtuifnsovcrabjhcidrzniz ; /usr/bin/python3'
Sep 30 17:35:34 compute-1 sudo[71777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:34 compute-1 sudo[71777]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:34 compute-1 sudo[71850]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywnxjqdhsrlmfpfcpqzqbdqubgfkazwo ; /usr/bin/python3'
Sep 30 17:35:34 compute-1 sudo[71850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:35 compute-1 sudo[71850]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:35 compute-1 sudo[71900]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwvioziodnxgqeiiuzhpxwjvklwvfhvg ; /usr/bin/python3'
Sep 30 17:35:35 compute-1 sudo[71900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:35 compute-1 python3[71902]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:35:36 compute-1 sudo[71900]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:37 compute-1 sudo[71995]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuvnmzuqvxhvpyvkuqfgwwgfebltxpha ; /usr/bin/python3'
Sep 30 17:35:37 compute-1 sudo[71995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:37 compute-1 python3[71997]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Sep 30 17:35:38 compute-1 sudo[71995]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:39 compute-1 sudo[72022]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbpeslslgjnmllxgadujbwpcjusexwsq ; /usr/bin/python3'
Sep 30 17:35:39 compute-1 sudo[72022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:39 compute-1 python3[72024]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:35:39 compute-1 sudo[72022]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:39 compute-1 sudo[72048]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqaxbgmdouogjdvdashjfbkymujdvqow ; /usr/bin/python3'
Sep 30 17:35:39 compute-1 sudo[72048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:39 compute-1 python3[72050]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G
                                          losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:35:39 compute-1 kernel: loop: module loaded
Sep 30 17:35:39 compute-1 kernel: loop3: detected capacity change from 0 to 41943040
Sep 30 17:35:39 compute-1 sudo[72048]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:39 compute-1 sudo[72083]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbddaanadjywhkjblejnnzbxxqhehrii ; /usr/bin/python3'
Sep 30 17:35:39 compute-1 sudo[72083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:40 compute-1 python3[72085]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                          vgcreate ceph_vg0 /dev/loop3
                                          lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:35:40 compute-1 lvm[72088]: PV /dev/loop3 not used.
Sep 30 17:35:40 compute-1 lvm[72090]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Sep 30 17:35:40 compute-1 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Sep 30 17:35:40 compute-1 lvm[72096]:   1 logical volume(s) in volume group "ceph_vg0" now active
Sep 30 17:35:40 compute-1 lvm[72100]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Sep 30 17:35:40 compute-1 lvm[72100]: VG ceph_vg0 finished
Sep 30 17:35:40 compute-1 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Sep 30 17:35:40 compute-1 sudo[72083]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:40 compute-1 sudo[72176]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohyitxadjkyssugdgayahhvgpwgiyrxu ; /usr/bin/python3'
Sep 30 17:35:40 compute-1 sudo[72176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:40 compute-1 python3[72178]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 17:35:40 compute-1 sudo[72176]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:41 compute-1 sudo[72249]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isqblogpsgnnbwvhlkvlzmzfrrekrdqj ; /usr/bin/python3'
Sep 30 17:35:41 compute-1 sudo[72249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:41 compute-1 python3[72251]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759253740.544695-33297-252439642391303/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:35:41 compute-1 sudo[72249]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:41 compute-1 sudo[72299]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmsxflmarqoqjzkilsuhpovdujwpkbdw ; /usr/bin/python3'
Sep 30 17:35:41 compute-1 sudo[72299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:35:42 compute-1 python3[72301]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:35:42 compute-1 systemd[1]: Reloading.
Sep 30 17:35:42 compute-1 systemd-rc-local-generator[72332]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:35:42 compute-1 systemd-sysv-generator[72335]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:35:42 compute-1 systemd[1]: Starting Ceph OSD losetup...
Sep 30 17:35:42 compute-1 bash[72342]: /dev/loop3: [64513]:4327953 (/var/lib/ceph-osd-0.img)
Sep 30 17:35:42 compute-1 systemd[1]: Finished Ceph OSD losetup.
Sep 30 17:35:42 compute-1 sudo[72299]: pam_unix(sudo:session): session closed for user root
Sep 30 17:35:42 compute-1 lvm[72343]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Sep 30 17:35:42 compute-1 lvm[72343]: VG ceph_vg0 finished
Sep 30 17:35:44 compute-1 python3[72367]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:36:24 compute-1 PackageKit[31697]: daemon quit
Sep 30 17:36:24 compute-1 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 17:36:56 compute-1 sshd-session[72414]: Connection closed by authenticating user root 194.0.234.19 port 63154 [preauth]
Sep 30 17:37:09 compute-1 sshd-session[72412]: Connection closed by 162.142.125.47 port 43510 [preauth]
Sep 30 17:37:28 compute-1 sshd-session[72416]: Accepted publickey for ceph-admin from 192.168.122.100 port 43640 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:37:28 compute-1 systemd[1]: Created slice User Slice of UID 42477.
Sep 30 17:37:28 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42477...
Sep 30 17:37:28 compute-1 systemd-logind[789]: New session 21 of user ceph-admin.
Sep 30 17:37:28 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42477.
Sep 30 17:37:28 compute-1 systemd[1]: Starting User Manager for UID 42477...
Sep 30 17:37:28 compute-1 systemd[72420]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:37:28 compute-1 sshd-session[72429]: Accepted publickey for ceph-admin from 192.168.122.100 port 43650 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:37:28 compute-1 systemd[72420]: Queued start job for default target Main User Target.
Sep 30 17:37:28 compute-1 systemd-logind[789]: New session 23 of user ceph-admin.
Sep 30 17:37:28 compute-1 systemd[72420]: Created slice User Application Slice.
Sep 30 17:37:28 compute-1 systemd[72420]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 17:37:28 compute-1 systemd[72420]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 17:37:28 compute-1 systemd[72420]: Reached target Paths.
Sep 30 17:37:28 compute-1 systemd[72420]: Reached target Timers.
Sep 30 17:37:28 compute-1 systemd[72420]: Starting D-Bus User Message Bus Socket...
Sep 30 17:37:28 compute-1 systemd[72420]: Starting Create User's Volatile Files and Directories...
Sep 30 17:37:28 compute-1 systemd[72420]: Listening on D-Bus User Message Bus Socket.
Sep 30 17:37:28 compute-1 systemd[72420]: Reached target Sockets.
Sep 30 17:37:28 compute-1 systemd[72420]: Finished Create User's Volatile Files and Directories.
Sep 30 17:37:28 compute-1 systemd[72420]: Reached target Basic System.
Sep 30 17:37:28 compute-1 systemd[72420]: Reached target Main User Target.
Sep 30 17:37:28 compute-1 systemd[72420]: Startup finished in 157ms.
Sep 30 17:37:28 compute-1 systemd[1]: Started User Manager for UID 42477.
Sep 30 17:37:28 compute-1 systemd[1]: Started Session 21 of User ceph-admin.
Sep 30 17:37:28 compute-1 systemd[1]: Started Session 23 of User ceph-admin.
Sep 30 17:37:28 compute-1 sshd-session[72416]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:37:28 compute-1 sshd-session[72429]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:37:28 compute-1 sudo[72441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:37:28 compute-1 sudo[72441]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:28 compute-1 sudo[72441]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:29 compute-1 sshd-session[72466]: Accepted publickey for ceph-admin from 192.168.122.100 port 43664 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:37:29 compute-1 systemd-logind[789]: New session 24 of user ceph-admin.
Sep 30 17:37:29 compute-1 systemd[1]: Started Session 24 of User ceph-admin.
Sep 30 17:37:29 compute-1 sshd-session[72466]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:37:29 compute-1 sudo[72470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-1
Sep 30 17:37:29 compute-1 sudo[72470]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:29 compute-1 sudo[72470]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:29 compute-1 sshd-session[72495]: Accepted publickey for ceph-admin from 192.168.122.100 port 43678 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:37:29 compute-1 systemd-logind[789]: New session 25 of user ceph-admin.
Sep 30 17:37:29 compute-1 systemd[1]: Started Session 25 of User ceph-admin.
Sep 30 17:37:29 compute-1 sshd-session[72495]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:37:29 compute-1 sudo[72499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Sep 30 17:37:29 compute-1 sudo[72499]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:29 compute-1 sudo[72499]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:29 compute-1 sshd-session[72524]: Accepted publickey for ceph-admin from 192.168.122.100 port 43686 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:37:29 compute-1 systemd-logind[789]: New session 26 of user ceph-admin.
Sep 30 17:37:29 compute-1 systemd[1]: Started Session 26 of User ceph-admin.
Sep 30 17:37:29 compute-1 sshd-session[72524]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:37:29 compute-1 sudo[72528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:37:29 compute-1 sudo[72528]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:29 compute-1 sudo[72528]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:30 compute-1 sshd-session[72553]: Accepted publickey for ceph-admin from 192.168.122.100 port 43702 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:37:30 compute-1 systemd-logind[789]: New session 27 of user ceph-admin.
Sep 30 17:37:30 compute-1 systemd[1]: Started Session 27 of User ceph-admin.
Sep 30 17:37:30 compute-1 sshd-session[72553]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:37:30 compute-1 sudo[72557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:37:30 compute-1 sudo[72557]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:30 compute-1 sudo[72557]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:30 compute-1 sshd-session[72582]: Accepted publickey for ceph-admin from 192.168.122.100 port 43714 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:37:30 compute-1 systemd-logind[789]: New session 28 of user ceph-admin.
Sep 30 17:37:30 compute-1 systemd[1]: Started Session 28 of User ceph-admin.
Sep 30 17:37:30 compute-1 sshd-session[72582]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:37:30 compute-1 sudo[72586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Sep 30 17:37:30 compute-1 sudo[72586]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:30 compute-1 sudo[72586]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:30 compute-1 sshd-session[72611]: Accepted publickey for ceph-admin from 192.168.122.100 port 43716 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:37:30 compute-1 systemd-logind[789]: New session 29 of user ceph-admin.
Sep 30 17:37:30 compute-1 systemd[1]: Started Session 29 of User ceph-admin.
Sep 30 17:37:30 compute-1 sshd-session[72611]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:37:30 compute-1 sudo[72615]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:37:30 compute-1 sudo[72615]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:31 compute-1 sudo[72615]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:31 compute-1 sshd-session[72640]: Accepted publickey for ceph-admin from 192.168.122.100 port 43722 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:37:31 compute-1 systemd-logind[789]: New session 30 of user ceph-admin.
Sep 30 17:37:31 compute-1 systemd[1]: Started Session 30 of User ceph-admin.
Sep 30 17:37:31 compute-1 sshd-session[72640]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:37:31 compute-1 sudo[72644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Sep 30 17:37:31 compute-1 sudo[72644]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:31 compute-1 sudo[72644]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:31 compute-1 sshd-session[72669]: Accepted publickey for ceph-admin from 192.168.122.100 port 43728 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:37:31 compute-1 systemd-logind[789]: New session 31 of user ceph-admin.
Sep 30 17:37:31 compute-1 systemd[1]: Started Session 31 of User ceph-admin.
Sep 30 17:37:31 compute-1 sshd-session[72669]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:37:32 compute-1 sshd-session[72696]: Accepted publickey for ceph-admin from 192.168.122.100 port 43740 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:37:32 compute-1 systemd-logind[789]: New session 32 of user ceph-admin.
Sep 30 17:37:32 compute-1 systemd[1]: Started Session 32 of User ceph-admin.
Sep 30 17:37:32 compute-1 sshd-session[72696]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:37:32 compute-1 sudo[72700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Sep 30 17:37:32 compute-1 sudo[72700]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:32 compute-1 sudo[72700]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:33 compute-1 sshd-session[72725]: Accepted publickey for ceph-admin from 192.168.122.100 port 43748 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:37:33 compute-1 systemd-logind[789]: New session 33 of user ceph-admin.
Sep 30 17:37:33 compute-1 systemd[1]: Started Session 33 of User ceph-admin.
Sep 30 17:37:33 compute-1 sshd-session[72725]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:37:33 compute-1 sudo[72729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-1
Sep 30 17:37:33 compute-1 sudo[72729]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:33 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:33 compute-1 sudo[72729]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:33 compute-1 sudo[72774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:37:33 compute-1 sudo[72774]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:33 compute-1 sudo[72774]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:33 compute-1 sudo[72799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Sep 30 17:37:33 compute-1 sudo[72799]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:33 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:34 compute-1 sudo[72799]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:34 compute-1 sudo[72843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:37:34 compute-1 sudo[72843]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:34 compute-1 sudo[72843]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:34 compute-1 sudo[72868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 17:37:34 compute-1 sudo[72868]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:34 compute-1 sudo[72868]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:34 compute-1 sudo[72929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:37:34 compute-1 sudo[72929]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:34 compute-1 sudo[72929]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:34 compute-1 sudo[72954]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:37:34 compute-1 sudo[72954]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:34 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 72992 (sysctl)
Sep 30 17:37:34 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Sep 30 17:37:34 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Sep 30 17:37:35 compute-1 sudo[72954]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:35 compute-1 sudo[73014]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:37:35 compute-1 sudo[73014]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:35 compute-1 sudo[73014]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:35 compute-1 sudo[73039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Sep 30 17:37:35 compute-1 sudo[73039]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:35 compute-1 sudo[73039]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:35 compute-1 sudo[73082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:37:35 compute-1 sudo[73082]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:35 compute-1 sudo[73082]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:35 compute-1 sudo[73107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b -- inventory --format=json-pretty --filter-for-batch
Sep 30 17:37:35 compute-1 sudo[73107]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:36 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:38 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat1358844494-lower\x2dmapped.mount: Deactivated successfully.
Sep 30 17:37:52 compute-1 podman[73168]: 2025-09-30 17:37:52.784828056 +0000 UTC m=+16.470360045 container create bce593d265e079d57cb9c3b3fb1d5bc01074b04639455665d0c1649dcfc84436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gifted_poincare, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:37:52 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Sep 30 17:37:52 compute-1 systemd[1]: Started libpod-conmon-bce593d265e079d57cb9c3b3fb1d5bc01074b04639455665d0c1649dcfc84436.scope.
Sep 30 17:37:52 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:37:52 compute-1 podman[73168]: 2025-09-30 17:37:52.760229377 +0000 UTC m=+16.445761346 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:37:52 compute-1 podman[73168]: 2025-09-30 17:37:52.876584819 +0000 UTC m=+16.562116848 container init bce593d265e079d57cb9c3b3fb1d5bc01074b04639455665d0c1649dcfc84436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gifted_poincare, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:37:52 compute-1 podman[73168]: 2025-09-30 17:37:52.883442505 +0000 UTC m=+16.568974464 container start bce593d265e079d57cb9c3b3fb1d5bc01074b04639455665d0c1649dcfc84436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gifted_poincare, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Sep 30 17:37:52 compute-1 podman[73168]: 2025-09-30 17:37:52.887052024 +0000 UTC m=+16.572584023 container attach bce593d265e079d57cb9c3b3fb1d5bc01074b04639455665d0c1649dcfc84436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gifted_poincare, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:37:52 compute-1 gifted_poincare[73227]: 167 167
Sep 30 17:37:52 compute-1 systemd[1]: libpod-bce593d265e079d57cb9c3b3fb1d5bc01074b04639455665d0c1649dcfc84436.scope: Deactivated successfully.
Sep 30 17:37:52 compute-1 podman[73168]: 2025-09-30 17:37:52.88985365 +0000 UTC m=+16.575385609 container died bce593d265e079d57cb9c3b3fb1d5bc01074b04639455665d0c1649dcfc84436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gifted_poincare, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Sep 30 17:37:52 compute-1 systemd[1]: var-lib-containers-storage-overlay-01e1bf15bdb888a1ef6b50893612a3f88135961dc97aa71d23daf178fc381807-merged.mount: Deactivated successfully.
Sep 30 17:37:52 compute-1 podman[73168]: 2025-09-30 17:37:52.939684444 +0000 UTC m=+16.625216433 container remove bce593d265e079d57cb9c3b3fb1d5bc01074b04639455665d0c1649dcfc84436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gifted_poincare, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 17:37:52 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:52 compute-1 systemd[1]: libpod-conmon-bce593d265e079d57cb9c3b3fb1d5bc01074b04639455665d0c1649dcfc84436.scope: Deactivated successfully.
Sep 30 17:37:53 compute-1 podman[73250]: 2025-09-30 17:37:53.132610326 +0000 UTC m=+0.052407875 container create 5450e101054e110ac63abd4167b7c019fff3955f0703b3aa2c54742c7084a89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_ishizaka, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid)
Sep 30 17:37:53 compute-1 systemd[1]: Started libpod-conmon-5450e101054e110ac63abd4167b7c019fff3955f0703b3aa2c54742c7084a89f.scope.
Sep 30 17:37:53 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:37:53 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7353647321fe97420fcbd75b83e0d3ac4f3adda9033de78b180c2eb52d27d3ae/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Sep 30 17:37:53 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7353647321fe97420fcbd75b83e0d3ac4f3adda9033de78b180c2eb52d27d3ae/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:37:53 compute-1 podman[73250]: 2025-09-30 17:37:53.199211586 +0000 UTC m=+0.119009115 container init 5450e101054e110ac63abd4167b7c019fff3955f0703b3aa2c54742c7084a89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_ishizaka, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Sep 30 17:37:53 compute-1 podman[73250]: 2025-09-30 17:37:53.105856039 +0000 UTC m=+0.025653648 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:37:53 compute-1 podman[73250]: 2025-09-30 17:37:53.206397831 +0000 UTC m=+0.126195340 container start 5450e101054e110ac63abd4167b7c019fff3955f0703b3aa2c54742c7084a89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_ishizaka, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 17:37:53 compute-1 podman[73250]: 2025-09-30 17:37:53.209921657 +0000 UTC m=+0.129719166 container attach 5450e101054e110ac63abd4167b7c019fff3955f0703b3aa2c54742c7084a89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_ishizaka, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]: [
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:     {
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:         "available": false,
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:         "being_replaced": false,
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:         "ceph_device_lvm": false,
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:         "device_id": "QEMU_DVD-ROM_QM00001",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:         "lsm_data": {},
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:         "lvs": [],
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:         "path": "/dev/sr0",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:         "rejected_reasons": [
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "Has a FileSystem",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "Insufficient space (<5GB)"
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:         ],
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:         "sys_api": {
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "actuators": null,
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "device_nodes": [
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:                 "sr0"
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             ],
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "devname": "sr0",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "human_readable_size": "482.00 KB",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "id_bus": "ata",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "model": "QEMU DVD-ROM",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "nr_requests": "2",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "parent": "/dev/sr0",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "partitions": {},
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "path": "/dev/sr0",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "removable": "1",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "rev": "2.5+",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "ro": "0",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "rotational": "0",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "sas_address": "",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "sas_device_handle": "",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "scheduler_mode": "mq-deadline",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "sectors": 0,
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "sectorsize": "2048",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "size": 493568.0,
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "support_discard": "2048",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "type": "disk",
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:             "vendor": "QEMU"
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:         }
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]:     }
Sep 30 17:37:53 compute-1 eager_ishizaka[73266]: ]
Sep 30 17:37:53 compute-1 systemd[1]: libpod-5450e101054e110ac63abd4167b7c019fff3955f0703b3aa2c54742c7084a89f.scope: Deactivated successfully.
Sep 30 17:37:53 compute-1 podman[73250]: 2025-09-30 17:37:53.881783003 +0000 UTC m=+0.801580522 container died 5450e101054e110ac63abd4167b7c019fff3955f0703b3aa2c54742c7084a89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_ishizaka, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:37:53 compute-1 systemd[1]: var-lib-containers-storage-overlay-7353647321fe97420fcbd75b83e0d3ac4f3adda9033de78b180c2eb52d27d3ae-merged.mount: Deactivated successfully.
Sep 30 17:37:53 compute-1 podman[73250]: 2025-09-30 17:37:53.924590676 +0000 UTC m=+0.844388185 container remove 5450e101054e110ac63abd4167b7c019fff3955f0703b3aa2c54742c7084a89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_ishizaka, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Sep 30 17:37:53 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:53 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:53 compute-1 systemd[1]: libpod-conmon-5450e101054e110ac63abd4167b7c019fff3955f0703b3aa2c54742c7084a89f.scope: Deactivated successfully.
Sep 30 17:37:53 compute-1 sudo[73107]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:54 compute-1 sudo[74152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Sep 30 17:37:54 compute-1 sudo[74152]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:54 compute-1 sudo[74152]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:54 compute-1 sudo[74177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph
Sep 30 17:37:54 compute-1 sudo[74177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:54 compute-1 sudo[74177]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:54 compute-1 sudo[74202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:37:54 compute-1 sudo[74202]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:54 compute-1 sudo[74202]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:54 compute-1 sudo[74227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:37:54 compute-1 sudo[74227]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:54 compute-1 sudo[74227]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:54 compute-1 sudo[74252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:37:54 compute-1 sudo[74252]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:54 compute-1 sudo[74252]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:54 compute-1 sudo[74300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:37:54 compute-1 sudo[74300]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:54 compute-1 sudo[74300]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:54 compute-1 sudo[74325]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:37:54 compute-1 sudo[74325]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:54 compute-1 sudo[74325]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:54 compute-1 sudo[74350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Sep 30 17:37:54 compute-1 sudo[74350]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:54 compute-1 sudo[74350]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:54 compute-1 sudo[74375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:37:54 compute-1 sudo[74375]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:54 compute-1 sudo[74375]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:54 compute-1 sudo[74400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:37:54 compute-1 sudo[74400]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:54 compute-1 sudo[74400]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:54 compute-1 sudo[74425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:37:54 compute-1 sudo[74425]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:54 compute-1 sudo[74425]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:54 compute-1 sudo[74450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:37:54 compute-1 sudo[74450]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:54 compute-1 sudo[74450]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:54 compute-1 sudo[74475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:37:54 compute-1 sudo[74475]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:54 compute-1 sudo[74475]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:37:55 compute-1 sudo[74523]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74523]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:37:55 compute-1 sudo[74548]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74548]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:37:55 compute-1 sudo[74573]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74573]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Sep 30 17:37:55 compute-1 sudo[74598]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74598]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph
Sep 30 17:37:55 compute-1 sudo[74623]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74623]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:37:55 compute-1 sudo[74648]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74648]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:37:55 compute-1 sudo[74673]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74673]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:37:55 compute-1 sudo[74698]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74698]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:37:55 compute-1 sudo[74746]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74746]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:37:55 compute-1 sudo[74771]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74771]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Sep 30 17:37:55 compute-1 sudo[74796]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74796]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:37:55 compute-1 sudo[74821]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74821]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:37:55 compute-1 sudo[74846]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74846]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:37:55 compute-1 sudo[74871]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74871]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:37:55 compute-1 sudo[74896]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74896]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:55 compute-1 sudo[74921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:37:55 compute-1 sudo[74921]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:55 compute-1 sudo[74921]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:56 compute-1 sudo[74969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:37:56 compute-1 sudo[74969]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:56 compute-1 sudo[74969]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:56 compute-1 sudo[74994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:37:56 compute-1 sudo[74994]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:56 compute-1 sudo[74994]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:56 compute-1 sudo[75019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring
Sep 30 17:37:56 compute-1 sudo[75019]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:56 compute-1 sudo[75019]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:56 compute-1 sudo[75044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:37:56 compute-1 sudo[75044]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:56 compute-1 sudo[75044]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:56 compute-1 sudo[75069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:37:56 compute-1 sudo[75069]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:56 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:56 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:56 compute-1 podman[75134]: 2025-09-30 17:37:56.746985626 +0000 UTC m=+0.029181424 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:37:56 compute-1 podman[75134]: 2025-09-30 17:37:56.894376131 +0000 UTC m=+0.176571899 container create 80195ab75726cda7bdb97ce0b3ef055c3376d6aed8fe52f12b605a24909779e7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_davinci, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Sep 30 17:37:57 compute-1 systemd[1]: Started libpod-conmon-80195ab75726cda7bdb97ce0b3ef055c3376d6aed8fe52f12b605a24909779e7.scope.
Sep 30 17:37:57 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:37:57 compute-1 podman[75134]: 2025-09-30 17:37:57.168785547 +0000 UTC m=+0.450981355 container init 80195ab75726cda7bdb97ce0b3ef055c3376d6aed8fe52f12b605a24909779e7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_davinci, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:37:57 compute-1 podman[75134]: 2025-09-30 17:37:57.182100289 +0000 UTC m=+0.464296097 container start 80195ab75726cda7bdb97ce0b3ef055c3376d6aed8fe52f12b605a24909779e7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_davinci, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Sep 30 17:37:57 compute-1 sshd-session[75146]: Invalid user solana from 45.148.10.240 port 39900
Sep 30 17:37:57 compute-1 systemd[1]: libpod-80195ab75726cda7bdb97ce0b3ef055c3376d6aed8fe52f12b605a24909779e7.scope: Deactivated successfully.
Sep 30 17:37:57 compute-1 hardcore_davinci[75150]: 167 167
Sep 30 17:37:57 compute-1 conmon[75150]: conmon 80195ab75726cda7bdb9 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-80195ab75726cda7bdb97ce0b3ef055c3376d6aed8fe52f12b605a24909779e7.scope/container/memory.events
Sep 30 17:37:57 compute-1 sshd-session[75146]: Connection closed by invalid user solana 45.148.10.240 port 39900 [preauth]
Sep 30 17:37:57 compute-1 podman[75134]: 2025-09-30 17:37:57.313145548 +0000 UTC m=+0.595341346 container attach 80195ab75726cda7bdb97ce0b3ef055c3376d6aed8fe52f12b605a24909779e7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_davinci, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:37:57 compute-1 podman[75134]: 2025-09-30 17:37:57.314004892 +0000 UTC m=+0.596200660 container died 80195ab75726cda7bdb97ce0b3ef055c3376d6aed8fe52f12b605a24909779e7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_davinci, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Sep 30 17:37:57 compute-1 systemd[1]: var-lib-containers-storage-overlay-9762adda3c15ec75815a8c022b24209138b7fb86bc6954fa2cd8306d571aa825-merged.mount: Deactivated successfully.
Sep 30 17:37:57 compute-1 podman[75134]: 2025-09-30 17:37:57.646942468 +0000 UTC m=+0.929138236 container remove 80195ab75726cda7bdb97ce0b3ef055c3376d6aed8fe52f12b605a24909779e7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_davinci, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Sep 30 17:37:57 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:57 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:57 compute-1 systemd[1]: libpod-conmon-80195ab75726cda7bdb97ce0b3ef055c3376d6aed8fe52f12b605a24909779e7.scope: Deactivated successfully.
Sep 30 17:37:57 compute-1 podman[75172]: 2025-09-30 17:37:57.718514597 +0000 UTC m=+0.044004349 container create 581d338c67f6fd89d23dd26fd998ea41658ddf7781240efafce677f0d70e981c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_hoover, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:37:57 compute-1 systemd[1]: Started libpod-conmon-581d338c67f6fd89d23dd26fd998ea41658ddf7781240efafce677f0d70e981c.scope.
Sep 30 17:37:57 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:37:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e924d9a8b800b2abd8c4d94bba4f04c2efba899258d1f79b0df76709154e020/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Sep 30 17:37:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e924d9a8b800b2abd8c4d94bba4f04c2efba899258d1f79b0df76709154e020/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:37:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e924d9a8b800b2abd8c4d94bba4f04c2efba899258d1f79b0df76709154e020/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:37:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e924d9a8b800b2abd8c4d94bba4f04c2efba899258d1f79b0df76709154e020/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Sep 30 17:37:57 compute-1 podman[75172]: 2025-09-30 17:37:57.791113174 +0000 UTC m=+0.116579925 container init 581d338c67f6fd89d23dd26fd998ea41658ddf7781240efafce677f0d70e981c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_hoover, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Sep 30 17:37:57 compute-1 podman[75172]: 2025-09-30 17:37:57.696394145 +0000 UTC m=+0.021860946 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:37:57 compute-1 podman[75172]: 2025-09-30 17:37:57.804735895 +0000 UTC m=+0.130202646 container start 581d338c67f6fd89d23dd26fd998ea41658ddf7781240efafce677f0d70e981c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_hoover, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:37:57 compute-1 podman[75172]: 2025-09-30 17:37:57.80857126 +0000 UTC m=+0.134038011 container attach 581d338c67f6fd89d23dd26fd998ea41658ddf7781240efafce677f0d70e981c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_hoover, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Sep 30 17:37:57 compute-1 systemd[1]: libpod-581d338c67f6fd89d23dd26fd998ea41658ddf7781240efafce677f0d70e981c.scope: Deactivated successfully.
Sep 30 17:37:57 compute-1 podman[75172]: 2025-09-30 17:37:57.903246938 +0000 UTC m=+0.228713699 container died 581d338c67f6fd89d23dd26fd998ea41658ddf7781240efafce677f0d70e981c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_hoover, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Sep 30 17:37:57 compute-1 podman[75172]: 2025-09-30 17:37:57.945954891 +0000 UTC m=+0.271421652 container remove 581d338c67f6fd89d23dd26fd998ea41658ddf7781240efafce677f0d70e981c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_hoover, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Sep 30 17:37:57 compute-1 systemd[1]: libpod-conmon-581d338c67f6fd89d23dd26fd998ea41658ddf7781240efafce677f0d70e981c.scope: Deactivated successfully.
Sep 30 17:37:58 compute-1 systemd[1]: Reloading.
Sep 30 17:37:58 compute-1 systemd-rc-local-generator[75251]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:37:58 compute-1 systemd-sysv-generator[75256]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:37:58 compute-1 systemd[1]: Reloading.
Sep 30 17:37:58 compute-1 systemd-sysv-generator[75290]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:37:58 compute-1 systemd-rc-local-generator[75287]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:37:58 compute-1 systemd[1]: Reached target All Ceph clusters and services.
Sep 30 17:37:58 compute-1 systemd[1]: Reloading.
Sep 30 17:37:58 compute-1 systemd-rc-local-generator[75323]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:37:58 compute-1 systemd-sysv-generator[75329]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:37:58 compute-1 systemd[1]: Reached target Ceph cluster 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:37:58 compute-1 systemd[1]: Reloading.
Sep 30 17:37:58 compute-1 systemd-sysv-generator[75368]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:37:58 compute-1 systemd-rc-local-generator[75365]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:37:59 compute-1 systemd[1]: Reloading.
Sep 30 17:37:59 compute-1 systemd-sysv-generator[75408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:37:59 compute-1 systemd-rc-local-generator[75403]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:37:59 compute-1 systemd[1]: Created slice Slice /system/ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:37:59 compute-1 systemd[1]: Reached target System Time Set.
Sep 30 17:37:59 compute-1 systemd[1]: Reached target System Time Synchronized.
Sep 30 17:37:59 compute-1 systemd[1]: Starting Ceph mon.compute-1 for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:37:59 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 17:37:59 compute-1 podman[75464]: 2025-09-30 17:37:59.524693964 +0000 UTC m=+0.051788672 container create 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:37:59 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc1c5fc7fb31ee15fc7833baf823f319f33b4d345773f227209dac899ecedcb5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:37:59 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc1c5fc7fb31ee15fc7833baf823f319f33b4d345773f227209dac899ecedcb5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:37:59 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc1c5fc7fb31ee15fc7833baf823f319f33b4d345773f227209dac899ecedcb5/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Sep 30 17:37:59 compute-1 podman[75464]: 2025-09-30 17:37:59.583842805 +0000 UTC m=+0.110937503 container init 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:37:59 compute-1 podman[75464]: 2025-09-30 17:37:59.494697357 +0000 UTC m=+0.021792105 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:37:59 compute-1 podman[75464]: 2025-09-30 17:37:59.591532284 +0000 UTC m=+0.118626962 container start 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:37:59 compute-1 bash[75464]: 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73
Sep 30 17:37:59 compute-1 systemd[1]: Started Ceph mon.compute-1 for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:37:59 compute-1 ceph-mon[75484]: set uid:gid to 167:167 (ceph:ceph)
Sep 30 17:37:59 compute-1 ceph-mon[75484]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Sep 30 17:37:59 compute-1 ceph-mon[75484]: pidfile_write: ignore empty --pid-file
Sep 30 17:37:59 compute-1 ceph-mon[75484]: load: jerasure load: lrc 
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: RocksDB version: 7.9.2
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Git sha 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Compile date 2025-07-17 03:12:14
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: DB SUMMARY
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: DB Session ID:  EOVNASF3CCDBD5EL5S5F
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: CURRENT file:  CURRENT
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: IDENTITY file:  IDENTITY
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 511 ; 
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                         Options.error_if_exists: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                       Options.create_if_missing: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                         Options.paranoid_checks: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.flush_verify_memtable_count: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                                     Options.env: 0x55f2a98cac20
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                                      Options.fs: PosixFileSystem
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                                Options.info_log: 0x55f2aa1e7a20
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                Options.max_file_opening_threads: 16
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                              Options.statistics: (nil)
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                               Options.use_fsync: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                       Options.max_log_file_size: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                   Options.log_file_time_to_roll: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                       Options.keep_log_file_num: 1000
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                    Options.recycle_log_file_num: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                         Options.allow_fallocate: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                        Options.allow_mmap_reads: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                       Options.allow_mmap_writes: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                        Options.use_direct_reads: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:          Options.create_missing_column_families: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                              Options.db_log_dir: 
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                                 Options.wal_dir: 
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                Options.table_cache_numshardbits: 6
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                         Options.WAL_ttl_seconds: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                       Options.WAL_size_limit_MB: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.manifest_preallocation_size: 4194304
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                     Options.is_fd_close_on_exec: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                   Options.advise_random_on_open: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                    Options.db_write_buffer_size: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                    Options.write_buffer_manager: 0x55f2aa1eb900
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         Options.access_hint_on_compaction_start: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                      Options.use_adaptive_mutex: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                            Options.rate_limiter: (nil)
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                       Options.wal_recovery_mode: 2
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                  Options.enable_thread_tracking: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                  Options.enable_pipelined_write: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                  Options.unordered_write: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.write_thread_max_yield_usec: 100
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                               Options.row_cache: None
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                              Options.wal_filter: None
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.avoid_flush_during_recovery: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.allow_ingest_behind: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.two_write_queues: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.manual_wal_flush: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.wal_compression: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.atomic_flush: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                 Options.persist_stats_to_disk: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                 Options.write_dbid_to_manifest: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                 Options.log_readahead_size: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                 Options.best_efforts_recovery: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.allow_data_in_errors: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.db_host_id: __hostname__
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.enforce_single_del_contracts: true
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.max_background_jobs: 2
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.max_background_compactions: -1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.max_subcompactions: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.delayed_write_rate : 16777216
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.max_total_wal_size: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                   Options.stats_dump_period_sec: 600
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                 Options.stats_persist_period_sec: 600
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                          Options.max_open_files: -1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                          Options.bytes_per_sync: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                      Options.wal_bytes_per_sync: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                   Options.strict_bytes_per_sync: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:       Options.compaction_readahead_size: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                  Options.max_background_flushes: -1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Compression algorithms supported:
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         kZSTD supported: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         kXpressCompression supported: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         kBZip2Compression supported: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         kZSTDNotFinalCompression supported: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         kLZ4Compression supported: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         kZlibCompression supported: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         kLZ4HCCompression supported: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         kSnappyCompression supported: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Fast CRC32 supported: Supported on x86
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: DMutex implementation: pthread_mutex_t
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:           Options.merge_operator: 
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:        Options.compaction_filter: None
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f2aa1e65c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55f2aa20b350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:        Options.write_buffer_size: 33554432
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:  Options.max_write_buffer_number: 2
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:          Options.compression: NoCompression
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.num_levels: 7
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:37:59 compute-1 sudo[75069]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2bf561c2-71cd-475c-b1c0-9f13ad2b054d
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759253879641196, "job": 1, "event": "recovery_started", "wal_files": [4]}
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759253879643376, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759253879643502, "job": 1, "event": "recovery_finished"}
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55f2aa20ce00
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: DB pointer 0x55f2aa316000
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 17:37:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55f2aa20b350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Sep 30 17:37:59 compute-1 ceph-mon[75484]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Sep 30 17:37:59 compute-1 ceph-mon[75484]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1@-1(???) e0 preinit fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1@-1(synchronizing).mds e1 new map
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1@-1(synchronizing).mds e1 print_map
                                           e1
                                           btime 2025-09-30T17:36:27.095553+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1@-1(synchronizing).osd e3 crush map has features 3314932999778484224, adjusting msgr requires
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1@-1(synchronizing).osd e3 crush map has features 288514050185494528, adjusting msgr requires
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1@-1(synchronizing).osd e3 crush map has features 288514050185494528, adjusting msgr requires
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1@-1(synchronizing).osd e3 crush map has features 288514050185494528, adjusting msgr requires
Sep 30 17:37:59 compute-1 ceph-mon[75484]: pgmap v18: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:37:59 compute-1 ceph-mon[75484]: pgmap v19: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:37:59 compute-1 ceph-mon[75484]: pgmap v20: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:37:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:37:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:37:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:37:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:37:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Sep 30 17:37:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:37:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:37:59 compute-1 ceph-mon[75484]: Updating compute-1:/etc/ceph/ceph.conf
Sep 30 17:37:59 compute-1 ceph-mon[75484]: Updating compute-1:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:37:59 compute-1 ceph-mon[75484]: pgmap v21: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:37:59 compute-1 ceph-mon[75484]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Sep 30 17:37:59 compute-1 ceph-mon[75484]: Updating compute-1:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring
Sep 30 17:37:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:37:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:37:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:37:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Sep 30 17:37:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Sep 30 17:37:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:37:59 compute-1 ceph-mon[75484]: Deploying daemon mon.compute-1 on compute-1
Sep 30 17:37:59 compute-1 ceph-mon[75484]: pgmap v22: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:37:59 compute-1 ceph-mon[75484]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..4) refresh upgraded, format 0 -> 3
Sep 30 17:37:59 compute-1 sudo[75523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:37:59 compute-1 sudo[75523]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:37:59 compute-1 sudo[75523]: pam_unix(sudo:session): session closed for user root
Sep 30 17:37:59 compute-1 sudo[75548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:37:59 compute-1 sudo[75548]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:00 compute-1 podman[75615]: 2025-09-30 17:38:00.246958873 +0000 UTC m=+0.041495661 container create 8ebce74a01f5cff3c58b5c8a02062a01ef37c08d2851115ad2aa763b0af8c4bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_mahavira, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 17:38:00 compute-1 systemd[1]: Started libpod-conmon-8ebce74a01f5cff3c58b5c8a02062a01ef37c08d2851115ad2aa763b0af8c4bf.scope.
Sep 30 17:38:00 compute-1 podman[75615]: 2025-09-30 17:38:00.226842535 +0000 UTC m=+0.021379323 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:00 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:00 compute-1 podman[75615]: 2025-09-30 17:38:00.352696303 +0000 UTC m=+0.147233111 container init 8ebce74a01f5cff3c58b5c8a02062a01ef37c08d2851115ad2aa763b0af8c4bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_mahavira, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Sep 30 17:38:00 compute-1 podman[75615]: 2025-09-30 17:38:00.360529936 +0000 UTC m=+0.155066734 container start 8ebce74a01f5cff3c58b5c8a02062a01ef37c08d2851115ad2aa763b0af8c4bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_mahavira, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:38:00 compute-1 podman[75615]: 2025-09-30 17:38:00.364114314 +0000 UTC m=+0.158651112 container attach 8ebce74a01f5cff3c58b5c8a02062a01ef37c08d2851115ad2aa763b0af8c4bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_mahavira, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:38:00 compute-1 zealous_mahavira[75631]: 167 167
Sep 30 17:38:00 compute-1 systemd[1]: libpod-8ebce74a01f5cff3c58b5c8a02062a01ef37c08d2851115ad2aa763b0af8c4bf.scope: Deactivated successfully.
Sep 30 17:38:00 compute-1 conmon[75631]: conmon 8ebce74a01f5cff3c58b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8ebce74a01f5cff3c58b5c8a02062a01ef37c08d2851115ad2aa763b0af8c4bf.scope/container/memory.events
Sep 30 17:38:00 compute-1 podman[75615]: 2025-09-30 17:38:00.368144533 +0000 UTC m=+0.162681331 container died 8ebce74a01f5cff3c58b5c8a02062a01ef37c08d2851115ad2aa763b0af8c4bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_mahavira, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Sep 30 17:38:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-aa215718e955520a403bc0ed3bdd34314a3a04bb98c27777dbecbf3965e781d6-merged.mount: Deactivated successfully.
Sep 30 17:38:00 compute-1 podman[75615]: 2025-09-30 17:38:00.418972928 +0000 UTC m=+0.213509746 container remove 8ebce74a01f5cff3c58b5c8a02062a01ef37c08d2851115ad2aa763b0af8c4bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_mahavira, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Sep 30 17:38:00 compute-1 systemd[1]: libpod-conmon-8ebce74a01f5cff3c58b5c8a02062a01ef37c08d2851115ad2aa763b0af8c4bf.scope: Deactivated successfully.
Sep 30 17:38:00 compute-1 systemd[1]: Reloading.
Sep 30 17:38:00 compute-1 systemd-rc-local-generator[75675]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:38:00 compute-1 systemd-sysv-generator[75678]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:38:00 compute-1 systemd[1]: Reloading.
Sep 30 17:38:00 compute-1 systemd-rc-local-generator[75713]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:38:00 compute-1 systemd-sysv-generator[75717]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:38:00 compute-1 systemd[1]: Starting Ceph mgr.compute-1.glbusf for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:38:01 compute-1 podman[75773]: 2025-09-30 17:38:01.170387741 +0000 UTC m=+0.034031798 container create 0d9fb24a4b030712eaef41d627044b1658942b7411e43b450530d6d0029c188b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Sep 30 17:38:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ac50d8f6ddef3f66d93931997277bd25e4154a8bc43e9d82945099794499d20/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ac50d8f6ddef3f66d93931997277bd25e4154a8bc43e9d82945099794499d20/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ac50d8f6ddef3f66d93931997277bd25e4154a8bc43e9d82945099794499d20/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ac50d8f6ddef3f66d93931997277bd25e4154a8bc43e9d82945099794499d20/merged/var/lib/ceph/mgr/ceph-compute-1.glbusf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:01 compute-1 podman[75773]: 2025-09-30 17:38:01.243603735 +0000 UTC m=+0.107247882 container init 0d9fb24a4b030712eaef41d627044b1658942b7411e43b450530d6d0029c188b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:38:01 compute-1 podman[75773]: 2025-09-30 17:38:01.154975371 +0000 UTC m=+0.018619448 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:01 compute-1 podman[75773]: 2025-09-30 17:38:01.255749055 +0000 UTC m=+0.119393162 container start 0d9fb24a4b030712eaef41d627044b1658942b7411e43b450530d6d0029c188b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:38:01 compute-1 bash[75773]: 0d9fb24a4b030712eaef41d627044b1658942b7411e43b450530d6d0029c188b
Sep 30 17:38:01 compute-1 systemd[1]: Started Ceph mgr.compute-1.glbusf for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:38:01 compute-1 sudo[75548]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:01 compute-1 ceph-mon[75484]: mon.compute-1@-1(probing) e2  my rank is now 1 (was -1)
Sep 30 17:38:01 compute-1 ceph-mon[75484]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Sep 30 17:38:01 compute-1 ceph-mon[75484]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Sep 30 17:38:01 compute-1 ceph-mon[75484]: mon.compute-1@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Sep 30 17:38:04 compute-1 ceph-mon[75484]: mon.compute-1@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Sep 30 17:38:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Sep 30 17:38:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Sep 30 17:38:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Sep 30 17:38:04 compute-1 ceph-mon[75484]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Mon Sep 15 21:46:13 UTC 2025,kernel_version=5.14.0-617.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864116,os=Linux}
Sep 30 17:38:04 compute-1 ceph-mgr[75792]: set uid:gid to 167:167 (ceph:ceph)
Sep 30 17:38:04 compute-1 ceph-mgr[75792]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Sep 30 17:38:04 compute-1 ceph-mgr[75792]: pidfile_write: ignore empty --pid-file
Sep 30 17:38:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_auth_request failed to assign global_id
Sep 30 17:38:04 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'alerts'
Sep 30 17:38:04 compute-1 ceph-mon[75484]: Deploying daemon mgr.compute-1.glbusf on compute-1
Sep 30 17:38:04 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Sep 30 17:38:04 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Sep 30 17:38:04 compute-1 ceph-mon[75484]: mon.compute-0 calling monitor election
Sep 30 17:38:04 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Sep 30 17:38:04 compute-1 ceph-mon[75484]: pgmap v24: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:04 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Sep 30 17:38:04 compute-1 ceph-mon[75484]: mon.compute-1 calling monitor election
Sep 30 17:38:04 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Sep 30 17:38:04 compute-1 ceph-mon[75484]: pgmap v25: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:04 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Sep 30 17:38:04 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Sep 30 17:38:04 compute-1 ceph-mon[75484]: mon.compute-0 is new leader, mons compute-0,compute-1 in quorum (ranks 0,1)
Sep 30 17:38:04 compute-1 ceph-mon[75484]: monmap epoch 2
Sep 30 17:38:04 compute-1 ceph-mon[75484]: fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:38:04 compute-1 ceph-mon[75484]: last_changed 2025-09-30T17:37:59.709954+0000
Sep 30 17:38:04 compute-1 ceph-mon[75484]: created 2025-09-30T17:36:25.121133+0000
Sep 30 17:38:04 compute-1 ceph-mon[75484]: min_mon_release 19 (squid)
Sep 30 17:38:04 compute-1 ceph-mon[75484]: election_strategy: 1
Sep 30 17:38:04 compute-1 ceph-mon[75484]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Sep 30 17:38:04 compute-1 ceph-mon[75484]: 1: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Sep 30 17:38:04 compute-1 ceph-mon[75484]: fsmap 
Sep 30 17:38:04 compute-1 ceph-mon[75484]: osdmap e3: 0 total, 0 up, 0 in
Sep 30 17:38:04 compute-1 ceph-mon[75484]: mgrmap e7: compute-0.efvthf(active, since 64s)
Sep 30 17:38:04 compute-1 ceph-mon[75484]: Health detail: HEALTH_WARN OSD count 0 < osd_pool_default_size 1
Sep 30 17:38:04 compute-1 ceph-mon[75484]: [WRN] TOO_FEW_OSDS: OSD count 0 < osd_pool_default_size 1
Sep 30 17:38:04 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:05 compute-1 ceph-mgr[75792]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Sep 30 17:38:05 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'balancer'
Sep 30 17:38:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:05.024+0000 7fa303278140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Sep 30 17:38:05 compute-1 sudo[75813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:38:05 compute-1 sudo[75813]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:05 compute-1 sudo[75813]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:05 compute-1 ceph-mgr[75792]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Sep 30 17:38:05 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'cephadm'
Sep 30 17:38:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:05.109+0000 7fa303278140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Sep 30 17:38:05 compute-1 sudo[75838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:38:05 compute-1 sudo[75838]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:05 compute-1 podman[75907]: 2025-09-30 17:38:05.524959986 +0000 UTC m=+0.044642887 container create 139adc637a38c690917da4ec6e99c4c087b4e3271bf39d57a8ca04aee90474d4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_chatelet, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Sep 30 17:38:05 compute-1 systemd[1]: Started libpod-conmon-139adc637a38c690917da4ec6e99c4c087b4e3271bf39d57a8ca04aee90474d4.scope.
Sep 30 17:38:05 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:05 compute-1 podman[75907]: 2025-09-30 17:38:05.505171237 +0000 UTC m=+0.024854168 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:05 compute-1 podman[75907]: 2025-09-30 17:38:05.60845638 +0000 UTC m=+0.128139291 container init 139adc637a38c690917da4ec6e99c4c087b4e3271bf39d57a8ca04aee90474d4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_chatelet, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Sep 30 17:38:05 compute-1 podman[75907]: 2025-09-30 17:38:05.61689773 +0000 UTC m=+0.136580621 container start 139adc637a38c690917da4ec6e99c4c087b4e3271bf39d57a8ca04aee90474d4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_chatelet, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 17:38:05 compute-1 podman[75907]: 2025-09-30 17:38:05.620301893 +0000 UTC m=+0.139984784 container attach 139adc637a38c690917da4ec6e99c4c087b4e3271bf39d57a8ca04aee90474d4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_chatelet, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Sep 30 17:38:05 compute-1 jolly_chatelet[75930]: 167 167
Sep 30 17:38:05 compute-1 systemd[1]: libpod-139adc637a38c690917da4ec6e99c4c087b4e3271bf39d57a8ca04aee90474d4.scope: Deactivated successfully.
Sep 30 17:38:05 compute-1 podman[75907]: 2025-09-30 17:38:05.622813501 +0000 UTC m=+0.142496432 container died 139adc637a38c690917da4ec6e99c4c087b4e3271bf39d57a8ca04aee90474d4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_chatelet, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Sep 30 17:38:05 compute-1 systemd[1]: var-lib-containers-storage-overlay-db509adcf0421a6b2b38bbf5ec4b656b2aaa17606925df124a4627e16bf403b8-merged.mount: Deactivated successfully.
Sep 30 17:38:05 compute-1 podman[75907]: 2025-09-30 17:38:05.669680417 +0000 UTC m=+0.189363328 container remove 139adc637a38c690917da4ec6e99c4c087b4e3271bf39d57a8ca04aee90474d4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_chatelet, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Sep 30 17:38:05 compute-1 systemd[1]: libpod-conmon-139adc637a38c690917da4ec6e99c4c087b4e3271bf39d57a8ca04aee90474d4.scope: Deactivated successfully.
Sep 30 17:38:05 compute-1 systemd[1]: Reloading.
Sep 30 17:38:05 compute-1 systemd-sysv-generator[75974]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:38:05 compute-1 systemd-rc-local-generator[75971]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:38:05 compute-1 sshd-session[75977]: banner exchange: Connection from 185.180.140.107 port 45667: invalid format
Sep 30 17:38:05 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'crash'
Sep 30 17:38:05 compute-1 ceph-mgr[75792]: mgr[py] Module crash has missing NOTIFY_TYPES member
Sep 30 17:38:05 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'dashboard'
Sep 30 17:38:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:05.947+0000 7fa303278140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Sep 30 17:38:05 compute-1 ceph-mon[75484]: pgmap v26: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Sep 30 17:38:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Sep 30 17:38:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:05 compute-1 ceph-mon[75484]: Deploying daemon crash.compute-1 on compute-1
Sep 30 17:38:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Sep 30 17:38:06 compute-1 systemd[1]: Reloading.
Sep 30 17:38:06 compute-1 systemd-rc-local-generator[76008]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:38:06 compute-1 systemd-sysv-generator[76016]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:38:06 compute-1 systemd[1]: Starting Ceph crash.compute-1 for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:38:06 compute-1 podman[76070]: 2025-09-30 17:38:06.552940121 +0000 UTC m=+0.043358732 container create 488cd31837bd6bbf7af520b840db59287665269a6742d19a2c4d3a53e9a0a070 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Sep 30 17:38:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d0cc205ee8117a01d68d669af8fa8f8213b7b700267a5d79a0e09ef7693cbb2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d0cc205ee8117a01d68d669af8fa8f8213b7b700267a5d79a0e09ef7693cbb2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d0cc205ee8117a01d68d669af8fa8f8213b7b700267a5d79a0e09ef7693cbb2/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d0cc205ee8117a01d68d669af8fa8f8213b7b700267a5d79a0e09ef7693cbb2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:06 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'devicehealth'
Sep 30 17:38:06 compute-1 podman[76070]: 2025-09-30 17:38:06.626039622 +0000 UTC m=+0.116458273 container init 488cd31837bd6bbf7af520b840db59287665269a6742d19a2c4d3a53e9a0a070 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Sep 30 17:38:06 compute-1 podman[76070]: 2025-09-30 17:38:06.533783089 +0000 UTC m=+0.024201720 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:06 compute-1 podman[76070]: 2025-09-30 17:38:06.632389575 +0000 UTC m=+0.122808206 container start 488cd31837bd6bbf7af520b840db59287665269a6742d19a2c4d3a53e9a0a070 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Sep 30 17:38:06 compute-1 bash[76070]: 488cd31837bd6bbf7af520b840db59287665269a6742d19a2c4d3a53e9a0a070
Sep 30 17:38:06 compute-1 systemd[1]: Started Ceph crash.compute-1 for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:38:06 compute-1 ceph-mgr[75792]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:06.686+0000 7fa303278140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Sep 30 17:38:06 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'diskprediction_local'
Sep 30 17:38:06 compute-1 sudo[75838]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: INFO:ceph-crash:pinging cluster to exercise our key
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: 2025-09-30T17:38:06.796+0000 7f0187e76640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: 2025-09-30T17:38:06.796+0000 7f0187e76640 -1 AuthRegistry(0x7f0180069a20) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: 2025-09-30T17:38:06.797+0000 7f0187e76640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: 2025-09-30T17:38:06.797+0000 7f0187e76640 -1 AuthRegistry(0x7f0187e74ff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: 2025-09-30T17:38:06.797+0000 7f01853ea640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: 2025-09-30T17:38:06.799+0000 7f0185beb640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: 2025-09-30T17:38:06.799+0000 7f0187e76640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: [errno 13] RADOS permission denied (error connecting to the cluster)
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]:   from numpy import show_config as show_numpy_config
Sep 30 17:38:06 compute-1 ceph-mgr[75792]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:06.877+0000 7fa303278140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Sep 30 17:38:06 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'influx'
Sep 30 17:38:06 compute-1 ceph-mgr[75792]: mgr[py] Module influx has missing NOTIFY_TYPES member
Sep 30 17:38:06 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'insights'
Sep 30 17:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:06.954+0000 7fa303278140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Sep 30 17:38:07 compute-1 sudo[76102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:38:07 compute-1 sudo[76102]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:07 compute-1 sudo[76102]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2881542026' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Sep 30 17:38:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:38:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:38:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:38:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:07 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'iostat'
Sep 30 17:38:07 compute-1 sudo[76127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 --yes --no-systemd
Sep 30 17:38:07 compute-1 sudo[76127]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:07 compute-1 ceph-mgr[75792]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Sep 30 17:38:07 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'k8sevents'
Sep 30 17:38:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:07.128+0000 7fa303278140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Sep 30 17:38:07 compute-1 podman[76191]: 2025-09-30 17:38:07.557376825 +0000 UTC m=+0.056729696 container create c354efe7169594ab574ff4319ce60cb217be9f1b3cba74164ddeb3d12338a09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=exciting_elgamal, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:38:07 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'localpool'
Sep 30 17:38:07 compute-1 systemd[1]: Started libpod-conmon-c354efe7169594ab574ff4319ce60cb217be9f1b3cba74164ddeb3d12338a09d.scope.
Sep 30 17:38:07 compute-1 podman[76191]: 2025-09-30 17:38:07.528432367 +0000 UTC m=+0.027785308 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:07 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'mds_autoscaler'
Sep 30 17:38:07 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:07 compute-1 podman[76191]: 2025-09-30 17:38:07.676098788 +0000 UTC m=+0.175451639 container init c354efe7169594ab574ff4319ce60cb217be9f1b3cba74164ddeb3d12338a09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=exciting_elgamal, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Sep 30 17:38:07 compute-1 podman[76191]: 2025-09-30 17:38:07.690376837 +0000 UTC m=+0.189729698 container start c354efe7169594ab574ff4319ce60cb217be9f1b3cba74164ddeb3d12338a09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=exciting_elgamal, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 17:38:07 compute-1 podman[76191]: 2025-09-30 17:38:07.694851989 +0000 UTC m=+0.194204840 container attach c354efe7169594ab574ff4319ce60cb217be9f1b3cba74164ddeb3d12338a09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=exciting_elgamal, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 17:38:07 compute-1 exciting_elgamal[76208]: 167 167
Sep 30 17:38:07 compute-1 systemd[1]: libpod-c354efe7169594ab574ff4319ce60cb217be9f1b3cba74164ddeb3d12338a09d.scope: Deactivated successfully.
Sep 30 17:38:07 compute-1 podman[76191]: 2025-09-30 17:38:07.700001089 +0000 UTC m=+0.199353930 container died c354efe7169594ab574ff4319ce60cb217be9f1b3cba74164ddeb3d12338a09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=exciting_elgamal, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1)
Sep 30 17:38:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-6dcb19393b294e1c1206ae010ffd2bf4340d00dd266a352614b2647d8c1de712-merged.mount: Deactivated successfully.
Sep 30 17:38:07 compute-1 podman[76191]: 2025-09-30 17:38:07.751413729 +0000 UTC m=+0.250766550 container remove c354efe7169594ab574ff4319ce60cb217be9f1b3cba74164ddeb3d12338a09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=exciting_elgamal, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_REF=squid)
Sep 30 17:38:07 compute-1 systemd[1]: libpod-conmon-c354efe7169594ab574ff4319ce60cb217be9f1b3cba74164ddeb3d12338a09d.scope: Deactivated successfully.
Sep 30 17:38:07 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'mirroring'
Sep 30 17:38:07 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'nfs'
Sep 30 17:38:08 compute-1 podman[76233]: 2025-09-30 17:38:08.005373545 +0000 UTC m=+0.074868770 container create d7dde0b4f2434522c3d24f4500fcb1b552dc12ee6a4db98fc9f3d7956d27b615 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_johnson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Sep 30 17:38:08 compute-1 systemd[1]: Started libpod-conmon-d7dde0b4f2434522c3d24f4500fcb1b552dc12ee6a4db98fc9f3d7956d27b615.scope.
Sep 30 17:38:08 compute-1 podman[76233]: 2025-09-30 17:38:07.974509325 +0000 UTC m=+0.044004590 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:08 compute-1 ceph-mon[75484]: pgmap v27: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:08 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4a48ea8584f3cd930e140d889c14e7569a2f833f04b7fd0eb504877a394360e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4a48ea8584f3cd930e140d889c14e7569a2f833f04b7fd0eb504877a394360e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4a48ea8584f3cd930e140d889c14e7569a2f833f04b7fd0eb504877a394360e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4a48ea8584f3cd930e140d889c14e7569a2f833f04b7fd0eb504877a394360e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4a48ea8584f3cd930e140d889c14e7569a2f833f04b7fd0eb504877a394360e/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:08 compute-1 podman[76233]: 2025-09-30 17:38:08.142405667 +0000 UTC m=+0.211900902 container init d7dde0b4f2434522c3d24f4500fcb1b552dc12ee6a4db98fc9f3d7956d27b615 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_johnson, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Sep 30 17:38:08 compute-1 podman[76233]: 2025-09-30 17:38:08.156645215 +0000 UTC m=+0.226140430 container start d7dde0b4f2434522c3d24f4500fcb1b552dc12ee6a4db98fc9f3d7956d27b615 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_johnson, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:38:08 compute-1 podman[76233]: 2025-09-30 17:38:08.162906105 +0000 UTC m=+0.232401360 container attach d7dde0b4f2434522c3d24f4500fcb1b552dc12ee6a4db98fc9f3d7956d27b615 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_johnson, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Sep 30 17:38:08 compute-1 ceph-mgr[75792]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Sep 30 17:38:08 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'orchestrator'
Sep 30 17:38:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:08.248+0000 7fa303278140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Sep 30 17:38:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:08.490+0000 7fa303278140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Sep 30 17:38:08 compute-1 ceph-mgr[75792]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Sep 30 17:38:08 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'osd_perf_query'
Sep 30 17:38:08 compute-1 ceph-mgr[75792]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Sep 30 17:38:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:08.570+0000 7fa303278140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Sep 30 17:38:08 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'osd_support'
Sep 30 17:38:08 compute-1 ceph-mgr[75792]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Sep 30 17:38:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:08.645+0000 7fa303278140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Sep 30 17:38:08 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'pg_autoscaler'
Sep 30 17:38:08 compute-1 ceph-mgr[75792]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Sep 30 17:38:08 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'progress'
Sep 30 17:38:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:08.738+0000 7fa303278140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Sep 30 17:38:08 compute-1 awesome_johnson[76249]: --> passed data devices: 0 physical, 1 LVM
Sep 30 17:38:08 compute-1 awesome_johnson[76249]: Running command: /usr/bin/ceph-authtool --gen-print-key
Sep 30 17:38:08 compute-1 awesome_johnson[76249]: Running command: /usr/bin/ceph-authtool --gen-print-key
Sep 30 17:38:08 compute-1 ceph-mgr[75792]: mgr[py] Module progress has missing NOTIFY_TYPES member
Sep 30 17:38:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:08.831+0000 7fa303278140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Sep 30 17:38:08 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'prometheus'
Sep 30 17:38:08 compute-1 awesome_johnson[76249]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 46225d03-2655-4a6e-a371-e1f19c340cf7
Sep 30 17:38:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_auth_request failed to assign global_id
Sep 30 17:38:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_auth_request failed to assign global_id
Sep 30 17:38:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e4 e4: 1 total, 0 up, 1 in
Sep 30 17:38:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2393315832' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "a6bb176f-a2ce-4022-8226-399d42b79f3f"}]: dispatch
Sep 30 17:38:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2393315832' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "a6bb176f-a2ce-4022-8226-399d42b79f3f"}]': finished
Sep 30 17:38:09 compute-1 ceph-mon[75484]: osdmap e4: 1 total, 0 up, 1 in
Sep 30 17:38:09 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:09 compute-1 ceph-mgr[75792]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Sep 30 17:38:09 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rbd_support'
Sep 30 17:38:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:09.229+0000 7fa303278140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Sep 30 17:38:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e5 e5: 2 total, 0 up, 2 in
Sep 30 17:38:09 compute-1 ceph-mgr[75792]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Sep 30 17:38:09 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'restful'
Sep 30 17:38:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:09.338+0000 7fa303278140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Sep 30 17:38:09 compute-1 awesome_johnson[76249]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-1
Sep 30 17:38:09 compute-1 awesome_johnson[76249]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Sep 30 17:38:09 compute-1 awesome_johnson[76249]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Sep 30 17:38:09 compute-1 awesome_johnson[76249]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:09 compute-1 awesome_johnson[76249]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-1/activate.monmap
Sep 30 17:38:09 compute-1 lvm[76312]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Sep 30 17:38:09 compute-1 lvm[76312]: VG ceph_vg0 finished
Sep 30 17:38:09 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rgw'
Sep 30 17:38:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_auth_request failed to assign global_id
Sep 30 17:38:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_auth_request failed to assign global_id
Sep 30 17:38:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e5 _set_new_cache_sizes cache_size:1019954175 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:38:09 compute-1 ceph-mgr[75792]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Sep 30 17:38:09 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rook'
Sep 30 17:38:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:09.752+0000 7fa303278140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Sep 30 17:38:09 compute-1 awesome_johnson[76249]:  stderr: got monmap epoch 2
Sep 30 17:38:09 compute-1 awesome_johnson[76249]: --> Creating keyring file for osd.1
Sep 30 17:38:10 compute-1 awesome_johnson[76249]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/keyring
Sep 30 17:38:10 compute-1 awesome_johnson[76249]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/
Sep 30 17:38:10 compute-1 awesome_johnson[76249]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 1 --monmap /var/lib/ceph/osd/ceph-1/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-1/ --osd-uuid 46225d03-2655-4a6e-a371-e1f19c340cf7 --setuser ceph --setgroup ceph
Sep 30 17:38:10 compute-1 ceph-mon[75484]: pgmap v28: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:10 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/891903981' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "46225d03-2655-4a6e-a371-e1f19c340cf7"}]: dispatch
Sep 30 17:38:10 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/891903981' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "46225d03-2655-4a6e-a371-e1f19c340cf7"}]': finished
Sep 30 17:38:10 compute-1 ceph-mon[75484]: osdmap e5: 2 total, 0 up, 2 in
Sep 30 17:38:10 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:10 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Sep 30 17:38:10 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2319601444' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Sep 30 17:38:10 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4199176611' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Sep 30 17:38:10 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:10 compute-1 ceph-mgr[75792]: mgr[py] Module rook has missing NOTIFY_TYPES member
Sep 30 17:38:10 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'selftest'
Sep 30 17:38:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:10.279+0000 7fa303278140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Sep 30 17:38:10 compute-1 ceph-mgr[75792]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Sep 30 17:38:10 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'snap_schedule'
Sep 30 17:38:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:10.346+0000 7fa303278140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Sep 30 17:38:10 compute-1 ceph-mgr[75792]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Sep 30 17:38:10 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'stats'
Sep 30 17:38:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:10.429+0000 7fa303278140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Sep 30 17:38:10 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'status'
Sep 30 17:38:10 compute-1 ceph-mgr[75792]: mgr[py] Module status has missing NOTIFY_TYPES member
Sep 30 17:38:10 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'telegraf'
Sep 30 17:38:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:10.574+0000 7fa303278140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Sep 30 17:38:10 compute-1 ceph-mgr[75792]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Sep 30 17:38:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:10.642+0000 7fa303278140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Sep 30 17:38:10 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'telemetry'
Sep 30 17:38:10 compute-1 ceph-mgr[75792]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Sep 30 17:38:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:10.798+0000 7fa303278140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Sep 30 17:38:10 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'test_orchestrator'
Sep 30 17:38:11 compute-1 ceph-mgr[75792]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Sep 30 17:38:11 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'volumes'
Sep 30 17:38:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:11.029+0000 7fa303278140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Sep 30 17:38:11 compute-1 ceph-mon[75484]: Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Sep 30 17:38:11 compute-1 ceph-mon[75484]: Cluster is now healthy
Sep 30 17:38:11 compute-1 ceph-mgr[75792]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Sep 30 17:38:11 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'zabbix'
Sep 30 17:38:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:11.297+0000 7fa303278140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Sep 30 17:38:11 compute-1 ceph-mgr[75792]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Sep 30 17:38:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:38:11.367+0000 7fa303278140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Sep 30 17:38:11 compute-1 ceph-mgr[75792]: ms_deliver_dispatch: unhandled message 0x561f6badcd00 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Sep 30 17:38:12 compute-1 ceph-mon[75484]: pgmap v31: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:12 compute-1 ceph-mon[75484]: Standby manager daemon compute-1.glbusf started
Sep 30 17:38:12 compute-1 awesome_johnson[76249]:  stderr: 2025-09-30T17:38:10.131+0000 7f6b4947f740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) No valid bdev label found
Sep 30 17:38:12 compute-1 awesome_johnson[76249]:  stderr: 2025-09-30T17:38:10.396+0000 7f6b4947f740 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid
Sep 30 17:38:12 compute-1 awesome_johnson[76249]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Sep 30 17:38:12 compute-1 awesome_johnson[76249]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Sep 30 17:38:13 compute-1 awesome_johnson[76249]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Sep 30 17:38:13 compute-1 awesome_johnson[76249]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:13 compute-1 awesome_johnson[76249]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:13 compute-1 awesome_johnson[76249]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Sep 30 17:38:13 compute-1 awesome_johnson[76249]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Sep 30 17:38:13 compute-1 awesome_johnson[76249]: --> ceph-volume lvm activate successful for osd ID: 1
Sep 30 17:38:13 compute-1 awesome_johnson[76249]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Sep 30 17:38:13 compute-1 systemd[1]: libpod-d7dde0b4f2434522c3d24f4500fcb1b552dc12ee6a4db98fc9f3d7956d27b615.scope: Deactivated successfully.
Sep 30 17:38:13 compute-1 systemd[1]: libpod-d7dde0b4f2434522c3d24f4500fcb1b552dc12ee6a4db98fc9f3d7956d27b615.scope: Consumed 2.521s CPU time.
Sep 30 17:38:13 compute-1 ceph-mon[75484]: mgrmap e8: compute-0.efvthf(active, since 71s), standbys: compute-1.glbusf
Sep 30 17:38:13 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mgr metadata", "who": "compute-1.glbusf", "id": "compute-1.glbusf"}]: dispatch
Sep 30 17:38:13 compute-1 podman[77225]: 2025-09-30 17:38:13.436880227 +0000 UTC m=+0.034661535 container died d7dde0b4f2434522c3d24f4500fcb1b552dc12ee6a4db98fc9f3d7956d27b615 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_johnson, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Sep 30 17:38:13 compute-1 systemd[1]: var-lib-containers-storage-overlay-b4a48ea8584f3cd930e140d889c14e7569a2f833f04b7fd0eb504877a394360e-merged.mount: Deactivated successfully.
Sep 30 17:38:13 compute-1 podman[77225]: 2025-09-30 17:38:13.495675648 +0000 UTC m=+0.093456946 container remove d7dde0b4f2434522c3d24f4500fcb1b552dc12ee6a4db98fc9f3d7956d27b615 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_johnson, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:38:13 compute-1 systemd[1]: libpod-conmon-d7dde0b4f2434522c3d24f4500fcb1b552dc12ee6a4db98fc9f3d7956d27b615.scope: Deactivated successfully.
Sep 30 17:38:13 compute-1 sudo[76127]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:13 compute-1 sudo[77241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:38:13 compute-1 sudo[77241]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:13 compute-1 sudo[77241]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:13 compute-1 sudo[77266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b -- lvm list --format json
Sep 30 17:38:13 compute-1 sudo[77266]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:14 compute-1 podman[77331]: 2025-09-30 17:38:14.163284179 +0000 UTC m=+0.040735890 container create 36b8e4c450aaf190d72f10b6c07cf23d63ea7a7dd012d929b28b5de377f514a4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_feistel, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:38:14 compute-1 systemd[1]: Started libpod-conmon-36b8e4c450aaf190d72f10b6c07cf23d63ea7a7dd012d929b28b5de377f514a4.scope.
Sep 30 17:38:14 compute-1 podman[77331]: 2025-09-30 17:38:14.146232135 +0000 UTC m=+0.023683836 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:14 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:14 compute-1 podman[77331]: 2025-09-30 17:38:14.263358605 +0000 UTC m=+0.140810356 container init 36b8e4c450aaf190d72f10b6c07cf23d63ea7a7dd012d929b28b5de377f514a4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_feistel, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:38:14 compute-1 podman[77331]: 2025-09-30 17:38:14.274897229 +0000 UTC m=+0.152348950 container start 36b8e4c450aaf190d72f10b6c07cf23d63ea7a7dd012d929b28b5de377f514a4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_feistel, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True)
Sep 30 17:38:14 compute-1 podman[77331]: 2025-09-30 17:38:14.279431342 +0000 UTC m=+0.156883103 container attach 36b8e4c450aaf190d72f10b6c07cf23d63ea7a7dd012d929b28b5de377f514a4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_feistel, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True)
Sep 30 17:38:14 compute-1 magical_feistel[77348]: 167 167
Sep 30 17:38:14 compute-1 systemd[1]: libpod-36b8e4c450aaf190d72f10b6c07cf23d63ea7a7dd012d929b28b5de377f514a4.scope: Deactivated successfully.
Sep 30 17:38:14 compute-1 podman[77353]: 2025-09-30 17:38:14.339349734 +0000 UTC m=+0.038072348 container died 36b8e4c450aaf190d72f10b6c07cf23d63ea7a7dd012d929b28b5de377f514a4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_feistel, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:38:14 compute-1 systemd[1]: var-lib-containers-storage-overlay-4db6cf1742a1e76d217e8e75052c79f2f5ae6733273c87dbf41f76412390098f-merged.mount: Deactivated successfully.
Sep 30 17:38:14 compute-1 podman[77353]: 2025-09-30 17:38:14.382352335 +0000 UTC m=+0.081074909 container remove 36b8e4c450aaf190d72f10b6c07cf23d63ea7a7dd012d929b28b5de377f514a4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_feistel, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:38:14 compute-1 systemd[1]: libpod-conmon-36b8e4c450aaf190d72f10b6c07cf23d63ea7a7dd012d929b28b5de377f514a4.scope: Deactivated successfully.
Sep 30 17:38:14 compute-1 ceph-mon[75484]: pgmap v32: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:14 compute-1 podman[77376]: 2025-09-30 17:38:14.636405534 +0000 UTC m=+0.073106552 container create 4a18c63117b9b98c85d6b22c34e9c0669bc5715916c4c842adcb66af44fd3fce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_greider, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True)
Sep 30 17:38:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e5 _set_new_cache_sizes cache_size:1020053526 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:38:14 compute-1 systemd[1]: Started libpod-conmon-4a18c63117b9b98c85d6b22c34e9c0669bc5715916c4c842adcb66af44fd3fce.scope.
Sep 30 17:38:14 compute-1 podman[77376]: 2025-09-30 17:38:14.603089047 +0000 UTC m=+0.039790115 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:14 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:14 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/862ff39e2f959d3a6fbc6746284f7aaebe58c16d7df4ab57becabb177b0e371c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:14 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/862ff39e2f959d3a6fbc6746284f7aaebe58c16d7df4ab57becabb177b0e371c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:14 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/862ff39e2f959d3a6fbc6746284f7aaebe58c16d7df4ab57becabb177b0e371c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:14 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/862ff39e2f959d3a6fbc6746284f7aaebe58c16d7df4ab57becabb177b0e371c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:14 compute-1 podman[77376]: 2025-09-30 17:38:14.750341757 +0000 UTC m=+0.187042825 container init 4a18c63117b9b98c85d6b22c34e9c0669bc5715916c4c842adcb66af44fd3fce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_greider, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:38:14 compute-1 podman[77376]: 2025-09-30 17:38:14.764090351 +0000 UTC m=+0.200791369 container start 4a18c63117b9b98c85d6b22c34e9c0669bc5715916c4c842adcb66af44fd3fce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_greider, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:38:14 compute-1 podman[77376]: 2025-09-30 17:38:14.768430679 +0000 UTC m=+0.205131757 container attach 4a18c63117b9b98c85d6b22c34e9c0669bc5715916c4c842adcb66af44fd3fce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_greider, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:38:15 compute-1 nifty_greider[77393]: {
Sep 30 17:38:15 compute-1 nifty_greider[77393]:     "1": [
Sep 30 17:38:15 compute-1 nifty_greider[77393]:         {
Sep 30 17:38:15 compute-1 nifty_greider[77393]:             "devices": [
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "/dev/loop3"
Sep 30 17:38:15 compute-1 nifty_greider[77393]:             ],
Sep 30 17:38:15 compute-1 nifty_greider[77393]:             "lv_name": "ceph_lv0",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:             "lv_size": "21470642176",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=GzZztY-u9Lh-yq7E-7ylj-fvjc-xAie-SxKXVr,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=63d32c6a-fa18-54ed-8711-9a3915cc367b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=46225d03-2655-4a6e-a371-e1f19c340cf7,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:             "lv_uuid": "GzZztY-u9Lh-yq7E-7ylj-fvjc-xAie-SxKXVr",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:             "name": "ceph_lv0",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:             "path": "/dev/ceph_vg0/ceph_lv0",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:             "tags": {
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "ceph.block_uuid": "GzZztY-u9Lh-yq7E-7ylj-fvjc-xAie-SxKXVr",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "ceph.cephx_lockbox_secret": "",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "ceph.cluster_fsid": "63d32c6a-fa18-54ed-8711-9a3915cc367b",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "ceph.cluster_name": "ceph",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "ceph.crush_device_class": "",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "ceph.encrypted": "0",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "ceph.osd_fsid": "46225d03-2655-4a6e-a371-e1f19c340cf7",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "ceph.osd_id": "1",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "ceph.osdspec_affinity": "default_drive_group",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "ceph.type": "block",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "ceph.vdo": "0",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:                 "ceph.with_tpm": "0"
Sep 30 17:38:15 compute-1 nifty_greider[77393]:             },
Sep 30 17:38:15 compute-1 nifty_greider[77393]:             "type": "block",
Sep 30 17:38:15 compute-1 nifty_greider[77393]:             "vg_name": "ceph_vg0"
Sep 30 17:38:15 compute-1 nifty_greider[77393]:         }
Sep 30 17:38:15 compute-1 nifty_greider[77393]:     ]
Sep 30 17:38:15 compute-1 nifty_greider[77393]: }
Sep 30 17:38:15 compute-1 systemd[1]: libpod-4a18c63117b9b98c85d6b22c34e9c0669bc5715916c4c842adcb66af44fd3fce.scope: Deactivated successfully.
Sep 30 17:38:15 compute-1 podman[77402]: 2025-09-30 17:38:15.164262389 +0000 UTC m=+0.041145082 container died 4a18c63117b9b98c85d6b22c34e9c0669bc5715916c4c842adcb66af44fd3fce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_greider, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 17:38:15 compute-1 systemd[1]: var-lib-containers-storage-overlay-862ff39e2f959d3a6fbc6746284f7aaebe58c16d7df4ab57becabb177b0e371c-merged.mount: Deactivated successfully.
Sep 30 17:38:15 compute-1 podman[77402]: 2025-09-30 17:38:15.204102594 +0000 UTC m=+0.080985287 container remove 4a18c63117b9b98c85d6b22c34e9c0669bc5715916c4c842adcb66af44fd3fce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_greider, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 17:38:15 compute-1 systemd[1]: libpod-conmon-4a18c63117b9b98c85d6b22c34e9c0669bc5715916c4c842adcb66af44fd3fce.scope: Deactivated successfully.
Sep 30 17:38:15 compute-1 sudo[77266]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:15 compute-1 sudo[77418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:38:15 compute-1 sudo[77418]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:15 compute-1 sudo[77418]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:15 compute-1 sudo[77443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:38:15 compute-1 sudo[77443]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:15 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Sep 30 17:38:15 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:15 compute-1 podman[77510]: 2025-09-30 17:38:15.920593316 +0000 UTC m=+0.056992173 container create 2bbe71baa77570822cc7d00d8ead20c0ff2b7a65a9a96b93d2a0a2e104e1e3a7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_bhabha, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:38:15 compute-1 systemd[1]: Started libpod-conmon-2bbe71baa77570822cc7d00d8ead20c0ff2b7a65a9a96b93d2a0a2e104e1e3a7.scope.
Sep 30 17:38:15 compute-1 podman[77510]: 2025-09-30 17:38:15.891522634 +0000 UTC m=+0.027921561 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:15 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:16 compute-1 podman[77510]: 2025-09-30 17:38:16.01957269 +0000 UTC m=+0.155971557 container init 2bbe71baa77570822cc7d00d8ead20c0ff2b7a65a9a96b93d2a0a2e104e1e3a7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_bhabha, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Sep 30 17:38:16 compute-1 podman[77510]: 2025-09-30 17:38:16.031168716 +0000 UTC m=+0.167567583 container start 2bbe71baa77570822cc7d00d8ead20c0ff2b7a65a9a96b93d2a0a2e104e1e3a7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_bhabha, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 17:38:16 compute-1 podman[77510]: 2025-09-30 17:38:16.035431312 +0000 UTC m=+0.171830179 container attach 2bbe71baa77570822cc7d00d8ead20c0ff2b7a65a9a96b93d2a0a2e104e1e3a7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_bhabha, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:38:16 compute-1 nostalgic_bhabha[77526]: 167 167
Sep 30 17:38:16 compute-1 systemd[1]: libpod-2bbe71baa77570822cc7d00d8ead20c0ff2b7a65a9a96b93d2a0a2e104e1e3a7.scope: Deactivated successfully.
Sep 30 17:38:16 compute-1 conmon[77526]: conmon 2bbe71baa77570822cc7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2bbe71baa77570822cc7d00d8ead20c0ff2b7a65a9a96b93d2a0a2e104e1e3a7.scope/container/memory.events
Sep 30 17:38:16 compute-1 podman[77510]: 2025-09-30 17:38:16.039660187 +0000 UTC m=+0.176059054 container died 2bbe71baa77570822cc7d00d8ead20c0ff2b7a65a9a96b93d2a0a2e104e1e3a7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_bhabha, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Sep 30 17:38:16 compute-1 systemd[1]: var-lib-containers-storage-overlay-539bf7f616e6d3f3841f8b566a25d43a121f93f8adee8fa93c0993a8e8f95120-merged.mount: Deactivated successfully.
Sep 30 17:38:16 compute-1 podman[77510]: 2025-09-30 17:38:16.089297959 +0000 UTC m=+0.225696816 container remove 2bbe71baa77570822cc7d00d8ead20c0ff2b7a65a9a96b93d2a0a2e104e1e3a7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_bhabha, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Sep 30 17:38:16 compute-1 systemd[1]: libpod-conmon-2bbe71baa77570822cc7d00d8ead20c0ff2b7a65a9a96b93d2a0a2e104e1e3a7.scope: Deactivated successfully.
Sep 30 17:38:16 compute-1 podman[77557]: 2025-09-30 17:38:16.410716672 +0000 UTC m=+0.063014617 container create 17e6a836a6d164d065c51369f6f5d5818430146a47b08d73352f7de6bea6f39f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate-test, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:38:16 compute-1 systemd[1]: Started libpod-conmon-17e6a836a6d164d065c51369f6f5d5818430146a47b08d73352f7de6bea6f39f.scope.
Sep 30 17:38:16 compute-1 podman[77557]: 2025-09-30 17:38:16.378261479 +0000 UTC m=+0.030559464 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:16 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52a04f2987b88db2189748a3fc9543e082c9ffc72877555c26d601cd74dcb020/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52a04f2987b88db2189748a3fc9543e082c9ffc72877555c26d601cd74dcb020/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52a04f2987b88db2189748a3fc9543e082c9ffc72877555c26d601cd74dcb020/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52a04f2987b88db2189748a3fc9543e082c9ffc72877555c26d601cd74dcb020/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52a04f2987b88db2189748a3fc9543e082c9ffc72877555c26d601cd74dcb020/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:16 compute-1 podman[77557]: 2025-09-30 17:38:16.528937342 +0000 UTC m=+0.181235297 container init 17e6a836a6d164d065c51369f6f5d5818430146a47b08d73352f7de6bea6f39f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate-test, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Sep 30 17:38:16 compute-1 podman[77557]: 2025-09-30 17:38:16.539582742 +0000 UTC m=+0.191880677 container start 17e6a836a6d164d065c51369f6f5d5818430146a47b08d73352f7de6bea6f39f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:38:16 compute-1 podman[77557]: 2025-09-30 17:38:16.544063454 +0000 UTC m=+0.196361389 container attach 17e6a836a6d164d065c51369f6f5d5818430146a47b08d73352f7de6bea6f39f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate-test, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:38:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate-test[77573]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Sep 30 17:38:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate-test[77573]:                             [--no-systemd] [--no-tmpfs]
Sep 30 17:38:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate-test[77573]: ceph-volume activate: error: unrecognized arguments: --bad-option
Sep 30 17:38:16 compute-1 ceph-mon[75484]: pgmap v33: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:16 compute-1 ceph-mon[75484]: Deploying daemon osd.1 on compute-1
Sep 30 17:38:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Sep 30 17:38:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:16 compute-1 systemd[1]: libpod-17e6a836a6d164d065c51369f6f5d5818430146a47b08d73352f7de6bea6f39f.scope: Deactivated successfully.
Sep 30 17:38:16 compute-1 podman[77557]: 2025-09-30 17:38:16.783076083 +0000 UTC m=+0.435374048 container died 17e6a836a6d164d065c51369f6f5d5818430146a47b08d73352f7de6bea6f39f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:38:16 compute-1 systemd[1]: var-lib-containers-storage-overlay-52a04f2987b88db2189748a3fc9543e082c9ffc72877555c26d601cd74dcb020-merged.mount: Deactivated successfully.
Sep 30 17:38:16 compute-1 podman[77557]: 2025-09-30 17:38:16.844788853 +0000 UTC m=+0.497086788 container remove 17e6a836a6d164d065c51369f6f5d5818430146a47b08d73352f7de6bea6f39f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 17:38:16 compute-1 systemd[1]: libpod-conmon-17e6a836a6d164d065c51369f6f5d5818430146a47b08d73352f7de6bea6f39f.scope: Deactivated successfully.
Sep 30 17:38:17 compute-1 systemd[1]: Reloading.
Sep 30 17:38:17 compute-1 systemd-rc-local-generator[77635]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:38:17 compute-1 systemd-sysv-generator[77638]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:38:17 compute-1 systemd[1]: Reloading.
Sep 30 17:38:17 compute-1 systemd-rc-local-generator[77677]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:38:17 compute-1 systemd-sysv-generator[77680]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:38:17 compute-1 systemd[1]: Starting Ceph osd.1 for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:38:17 compute-1 ceph-mon[75484]: Deploying daemon osd.0 on compute-0
Sep 30 17:38:17 compute-1 ceph-mon[75484]: pgmap v34: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:18 compute-1 podman[77732]: 2025-09-30 17:38:18.118810129 +0000 UTC m=+0.063763598 container create ce1e7716482fa5c753c609ef3bacceadde4d06bd28a4db17e2bd487a1dc935fb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Sep 30 17:38:18 compute-1 podman[77732]: 2025-09-30 17:38:18.089554502 +0000 UTC m=+0.034508041 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:18 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604a5db5bd4cfc678f8299fa7af250793a8545f8eb7c7fd3c0195668b27cb549/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604a5db5bd4cfc678f8299fa7af250793a8545f8eb7c7fd3c0195668b27cb549/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604a5db5bd4cfc678f8299fa7af250793a8545f8eb7c7fd3c0195668b27cb549/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604a5db5bd4cfc678f8299fa7af250793a8545f8eb7c7fd3c0195668b27cb549/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604a5db5bd4cfc678f8299fa7af250793a8545f8eb7c7fd3c0195668b27cb549/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:18 compute-1 podman[77732]: 2025-09-30 17:38:18.212320415 +0000 UTC m=+0.157273964 container init ce1e7716482fa5c753c609ef3bacceadde4d06bd28a4db17e2bd487a1dc935fb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1)
Sep 30 17:38:18 compute-1 podman[77732]: 2025-09-30 17:38:18.224167168 +0000 UTC m=+0.169120637 container start ce1e7716482fa5c753c609ef3bacceadde4d06bd28a4db17e2bd487a1dc935fb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid)
Sep 30 17:38:18 compute-1 podman[77732]: 2025-09-30 17:38:18.228157937 +0000 UTC m=+0.173111466 container attach ce1e7716482fa5c753c609ef3bacceadde4d06bd28a4db17e2bd487a1dc935fb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 17:38:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate[77748]: Running command: /usr/bin/ceph-authtool --gen-print-key
Sep 30 17:38:18 compute-1 bash[77732]: Running command: /usr/bin/ceph-authtool --gen-print-key
Sep 30 17:38:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate[77748]: Running command: /usr/bin/ceph-authtool --gen-print-key
Sep 30 17:38:18 compute-1 bash[77732]: Running command: /usr/bin/ceph-authtool --gen-print-key
Sep 30 17:38:18 compute-1 lvm[77829]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Sep 30 17:38:18 compute-1 lvm[77829]: VG ceph_vg0 finished
Sep 30 17:38:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate[77748]: --> Failed to activate via raw: did not find any matching OSD to activate
Sep 30 17:38:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate[77748]: Running command: /usr/bin/ceph-authtool --gen-print-key
Sep 30 17:38:19 compute-1 bash[77732]: --> Failed to activate via raw: did not find any matching OSD to activate
Sep 30 17:38:19 compute-1 bash[77732]: Running command: /usr/bin/ceph-authtool --gen-print-key
Sep 30 17:38:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate[77748]: Running command: /usr/bin/ceph-authtool --gen-print-key
Sep 30 17:38:19 compute-1 bash[77732]: Running command: /usr/bin/ceph-authtool --gen-print-key
Sep 30 17:38:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate[77748]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Sep 30 17:38:19 compute-1 bash[77732]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Sep 30 17:38:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate[77748]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Sep 30 17:38:19 compute-1 bash[77732]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Sep 30 17:38:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate[77748]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:19 compute-1 bash[77732]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate[77748]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:19 compute-1 bash[77732]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate[77748]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Sep 30 17:38:19 compute-1 bash[77732]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Sep 30 17:38:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate[77748]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Sep 30 17:38:19 compute-1 bash[77732]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Sep 30 17:38:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate[77748]: --> ceph-volume lvm activate successful for osd ID: 1
Sep 30 17:38:19 compute-1 bash[77732]: --> ceph-volume lvm activate successful for osd ID: 1
Sep 30 17:38:19 compute-1 systemd[1]: libpod-ce1e7716482fa5c753c609ef3bacceadde4d06bd28a4db17e2bd487a1dc935fb.scope: Deactivated successfully.
Sep 30 17:38:19 compute-1 podman[77732]: 2025-09-30 17:38:19.57593701 +0000 UTC m=+1.520890509 container died ce1e7716482fa5c753c609ef3bacceadde4d06bd28a4db17e2bd487a1dc935fb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:38:19 compute-1 systemd[1]: libpod-ce1e7716482fa5c753c609ef3bacceadde4d06bd28a4db17e2bd487a1dc935fb.scope: Consumed 1.521s CPU time.
Sep 30 17:38:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-604a5db5bd4cfc678f8299fa7af250793a8545f8eb7c7fd3c0195668b27cb549-merged.mount: Deactivated successfully.
Sep 30 17:38:19 compute-1 podman[77732]: 2025-09-30 17:38:19.622446756 +0000 UTC m=+1.567400215 container remove ce1e7716482fa5c753c609ef3bacceadde4d06bd28a4db17e2bd487a1dc935fb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Sep 30 17:38:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e5 _set_new_cache_sizes cache_size:1020054717 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:38:19 compute-1 podman[77986]: 2025-09-30 17:38:19.837717699 +0000 UTC m=+0.041354428 container create f26debbca63b2c766fecd170a59452a0e95e36e854632dd8bad0e8b948fe54eb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Sep 30 17:38:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec06e961803b48d279ec1685c98cfe492a92792d8e236dd6b20ada50bc2ee833/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec06e961803b48d279ec1685c98cfe492a92792d8e236dd6b20ada50bc2ee833/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec06e961803b48d279ec1685c98cfe492a92792d8e236dd6b20ada50bc2ee833/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec06e961803b48d279ec1685c98cfe492a92792d8e236dd6b20ada50bc2ee833/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec06e961803b48d279ec1685c98cfe492a92792d8e236dd6b20ada50bc2ee833/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:19 compute-1 podman[77986]: 2025-09-30 17:38:19.893239761 +0000 UTC m=+0.096876510 container init f26debbca63b2c766fecd170a59452a0e95e36e854632dd8bad0e8b948fe54eb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 17:38:19 compute-1 podman[77986]: 2025-09-30 17:38:19.899208483 +0000 UTC m=+0.102845222 container start f26debbca63b2c766fecd170a59452a0e95e36e854632dd8bad0e8b948fe54eb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:38:19 compute-1 bash[77986]: f26debbca63b2c766fecd170a59452a0e95e36e854632dd8bad0e8b948fe54eb
Sep 30 17:38:19 compute-1 podman[77986]: 2025-09-30 17:38:19.821339113 +0000 UTC m=+0.024975882 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:19 compute-1 systemd[1]: Started Ceph osd.1 for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:38:19 compute-1 ceph-osd[78006]: set uid:gid to 167:167 (ceph:ceph)
Sep 30 17:38:19 compute-1 ceph-osd[78006]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Sep 30 17:38:19 compute-1 ceph-osd[78006]: pidfile_write: ignore empty --pid-file
Sep 30 17:38:19 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:19 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:19 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:19 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) close
Sep 30 17:38:19 compute-1 sudo[77443]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:20 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:20 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:20 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:20 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) close
Sep 30 17:38:20 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:20 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:20 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:20 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) close
Sep 30 17:38:20 compute-1 ceph-mon[75484]: pgmap v35: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:20 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:20 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:20 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:20 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) close
Sep 30 17:38:20 compute-1 sudo[78024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:38:20 compute-1 sudo[78024]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:20 compute-1 sudo[78024]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:20 compute-1 sudo[78049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b -- raw list --format json
Sep 30 17:38:20 compute-1 sudo[78049]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) close
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f31503800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f31503800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f31503800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f31503800 /var/lib/ceph/osd/ceph-1/block) close
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f31503c00 /var/lib/ceph/osd/ceph-1/block) close
Sep 30 17:38:21 compute-1 podman[78118]: 2025-09-30 17:38:21.394222206 +0000 UTC m=+0.062985636 container create 699c9a3467bde5db1a64628752a4c703fb63949896b761eb602b36931b079131 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_burnell, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Sep 30 17:38:21 compute-1 systemd[1]: Started libpod-conmon-699c9a3467bde5db1a64628752a4c703fb63949896b761eb602b36931b079131.scope.
Sep 30 17:38:21 compute-1 podman[78118]: 2025-09-30 17:38:21.368863296 +0000 UTC m=+0.037626696 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:21 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:21 compute-1 podman[78118]: 2025-09-30 17:38:21.504054697 +0000 UTC m=+0.172818187 container init 699c9a3467bde5db1a64628752a4c703fb63949896b761eb602b36931b079131 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_burnell, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:38:21 compute-1 podman[78118]: 2025-09-30 17:38:21.514544563 +0000 UTC m=+0.183307993 container start 699c9a3467bde5db1a64628752a4c703fb63949896b761eb602b36931b079131 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_burnell, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Sep 30 17:38:21 compute-1 podman[78118]: 2025-09-30 17:38:21.51846648 +0000 UTC m=+0.187229970 container attach 699c9a3467bde5db1a64628752a4c703fb63949896b761eb602b36931b079131 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_burnell, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Sep 30 17:38:21 compute-1 focused_burnell[78137]: 167 167
Sep 30 17:38:21 compute-1 systemd[1]: libpod-699c9a3467bde5db1a64628752a4c703fb63949896b761eb602b36931b079131.scope: Deactivated successfully.
Sep 30 17:38:21 compute-1 podman[78118]: 2025-09-30 17:38:21.523197289 +0000 UTC m=+0.191960719 container died 699c9a3467bde5db1a64628752a4c703fb63949896b761eb602b36931b079131 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_burnell, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 17:38:21 compute-1 systemd[1]: var-lib-containers-storage-overlay-e70a8424a827bdfb8e40e3c11b416ea2beff77fb92dac47c8206d1534b294f3d-merged.mount: Deactivated successfully.
Sep 30 17:38:21 compute-1 ceph-osd[78006]: starting osd.1 osd_data /var/lib/ceph/osd/ceph-1 /var/lib/ceph/osd/ceph-1/journal
Sep 30 17:38:21 compute-1 podman[78118]: 2025-09-30 17:38:21.564121463 +0000 UTC m=+0.232884873 container remove 699c9a3467bde5db1a64628752a4c703fb63949896b761eb602b36931b079131 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_burnell, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0)
Sep 30 17:38:21 compute-1 ceph-osd[78006]: load: jerasure load: lrc 
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) close
Sep 30 17:38:21 compute-1 systemd[1]: libpod-conmon-699c9a3467bde5db1a64628752a4c703fb63949896b761eb602b36931b079131.scope: Deactivated successfully.
Sep 30 17:38:21 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:21 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:21 compute-1 podman[78165]: 2025-09-30 17:38:21.733812064 +0000 UTC m=+0.046787635 container create 7a15c9083cd737251f12e9bb0bd4d0cd617e6fc86eaaa9cc806f10c22a16a0e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_greider, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:38:21 compute-1 systemd[1]: Started libpod-conmon-7a15c9083cd737251f12e9bb0bd4d0cd617e6fc86eaaa9cc806f10c22a16a0e0.scope.
Sep 30 17:38:21 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16bea653a9264694e96a18c09acaa6d8f8670d66f19fe34701c8473d560e3524/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16bea653a9264694e96a18c09acaa6d8f8670d66f19fe34701c8473d560e3524/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16bea653a9264694e96a18c09acaa6d8f8670d66f19fe34701c8473d560e3524/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16bea653a9264694e96a18c09acaa6d8f8670d66f19fe34701c8473d560e3524/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:21 compute-1 podman[78165]: 2025-09-30 17:38:21.80967836 +0000 UTC m=+0.122653941 container init 7a15c9083cd737251f12e9bb0bd4d0cd617e6fc86eaaa9cc806f10c22a16a0e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_greider, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Sep 30 17:38:21 compute-1 podman[78165]: 2025-09-30 17:38:21.715239699 +0000 UTC m=+0.028215310 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:21 compute-1 podman[78165]: 2025-09-30 17:38:21.822323595 +0000 UTC m=+0.135299166 container start 7a15c9083cd737251f12e9bb0bd4d0cd617e6fc86eaaa9cc806f10c22a16a0e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_greider, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:38:21 compute-1 podman[78165]: 2025-09-30 17:38:21.834879697 +0000 UTC m=+0.147855288 container attach 7a15c9083cd737251f12e9bb0bd4d0cd617e6fc86eaaa9cc806f10c22a16a0e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_greider, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Sep 30 17:38:21 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) close
Sep 30 17:38:22 compute-1 ceph-osd[78006]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Sep 30 17:38:22 compute-1 ceph-osd[78006]: osd.1:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) close
Sep 30 17:38:22 compute-1 lvm[78268]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Sep 30 17:38:22 compute-1 lvm[78268]: VG ceph_vg0 finished
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) close
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) close
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32374c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32375000 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32375000 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32375000 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount shared_bdev_used = 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: RocksDB version: 7.9.2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Git sha 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Compile date 2025-07-17 03:12:14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: DB SUMMARY
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: DB Session ID:  8D7RONMQCJA4ELL0UAOP
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: CURRENT file:  CURRENT
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: IDENTITY file:  IDENTITY
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                         Options.error_if_exists: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.create_if_missing: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                         Options.paranoid_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.flush_verify_memtable_count: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                                     Options.env: 0x556f315576c0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                                      Options.fs: LegacyFileSystem
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                                Options.info_log: 0x556f32379760
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_file_opening_threads: 16
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                              Options.statistics: (nil)
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.use_fsync: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.max_log_file_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.log_file_time_to_roll: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.keep_log_file_num: 1000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.recycle_log_file_num: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                         Options.allow_fallocate: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.allow_mmap_reads: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.allow_mmap_writes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.use_direct_reads: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.create_missing_column_families: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                              Options.db_log_dir: 
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                                 Options.wal_dir: db.wal
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.table_cache_numshardbits: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                         Options.WAL_ttl_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.WAL_size_limit_MB: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.manifest_preallocation_size: 4194304
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                     Options.is_fd_close_on_exec: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.advise_random_on_open: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.db_write_buffer_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.write_buffer_manager: 0x556f32470a00
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.access_hint_on_compaction_start: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                      Options.use_adaptive_mutex: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                            Options.rate_limiter: (nil)
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.wal_recovery_mode: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.enable_thread_tracking: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.enable_pipelined_write: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.unordered_write: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.write_thread_max_yield_usec: 100
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.row_cache: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                              Options.wal_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.avoid_flush_during_recovery: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.allow_ingest_behind: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.two_write_queues: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.manual_wal_flush: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.wal_compression: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.atomic_flush: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.persist_stats_to_disk: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.write_dbid_to_manifest: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.log_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.best_efforts_recovery: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.allow_data_in_errors: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.db_host_id: __hostname__
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.enforce_single_del_contracts: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.max_background_jobs: 4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.max_background_compactions: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.max_subcompactions: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.writable_file_max_buffer_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.delayed_write_rate : 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.max_total_wal_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.stats_dump_period_sec: 600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.stats_persist_period_sec: 600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.max_open_files: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.bytes_per_sync: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                      Options.wal_bytes_per_sync: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.strict_bytes_per_sync: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.compaction_readahead_size: 2097152
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.max_background_flushes: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Compression algorithms supported:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kZSTD supported: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kXpressCompression supported: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kBZip2Compression supported: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kZSTDNotFinalCompression supported: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kLZ4Compression supported: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kZlibCompression supported: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kLZ4HCCompression supported: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kSnappyCompression supported: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Fast CRC32 supported: Supported on x86
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: DMutex implementation: pthread_mutex_t
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379b20)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379b20)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379b20)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379b20)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379b20)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379b20)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379b20)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379b40)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f315989b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379b40)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f315989b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379b40)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f315989b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 99ea9e38-6596-490a-bcfd-ba29b265991e
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759253902503227, "job": 1, "event": "recovery_started", "wal_files": [31]}
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759253902503540, "job": 1, "event": "recovery_finished"}
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old nid_max 1025
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old blobid_max 10240
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta min_alloc_size 0x1000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: freelist init
Sep 30 17:38:22 compute-1 ceph-osd[78006]: freelist _read_cfg
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs umount
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32375000 /var/lib/ceph/osd/ceph-1/block) close
Sep 30 17:38:22 compute-1 systemd[1]: libpod-7a15c9083cd737251f12e9bb0bd4d0cd617e6fc86eaaa9cc806f10c22a16a0e0.scope: Deactivated successfully.
Sep 30 17:38:22 compute-1 systemd[1]: libpod-7a15c9083cd737251f12e9bb0bd4d0cd617e6fc86eaaa9cc806f10c22a16a0e0.scope: Consumed 1.062s CPU time.
Sep 30 17:38:22 compute-1 podman[78165]: 2025-09-30 17:38:22.521175016 +0000 UTC m=+0.834150577 container died 7a15c9083cd737251f12e9bb0bd4d0cd617e6fc86eaaa9cc806f10c22a16a0e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_greider, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Sep 30 17:38:22 compute-1 systemd[1]: var-lib-containers-storage-overlay-16bea653a9264694e96a18c09acaa6d8f8670d66f19fe34701c8473d560e3524-merged.mount: Deactivated successfully.
Sep 30 17:38:22 compute-1 podman[78165]: 2025-09-30 17:38:22.557441044 +0000 UTC m=+0.870416605 container remove 7a15c9083cd737251f12e9bb0bd4d0cd617e6fc86eaaa9cc806f10c22a16a0e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_greider, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1)
Sep 30 17:38:22 compute-1 systemd[1]: libpod-conmon-7a15c9083cd737251f12e9bb0bd4d0cd617e6fc86eaaa9cc806f10c22a16a0e0.scope: Deactivated successfully.
Sep 30 17:38:22 compute-1 sudo[78049]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32375000 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32375000 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bdev(0x556f32375000 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluefs mount shared_bdev_used = 4718592
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: RocksDB version: 7.9.2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Git sha 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Compile date 2025-07-17 03:12:14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: DB SUMMARY
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: DB Session ID:  8D7RONMQCJA4ELL0UAOO
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: CURRENT file:  CURRENT
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: IDENTITY file:  IDENTITY
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                         Options.error_if_exists: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.create_if_missing: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                         Options.paranoid_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.flush_verify_memtable_count: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                                     Options.env: 0x556f315571f0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                                      Options.fs: LegacyFileSystem
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                                Options.info_log: 0x556f32379900
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_file_opening_threads: 16
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                              Options.statistics: (nil)
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.use_fsync: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.max_log_file_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.log_file_time_to_roll: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.keep_log_file_num: 1000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.recycle_log_file_num: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                         Options.allow_fallocate: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.allow_mmap_reads: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.allow_mmap_writes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.use_direct_reads: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.create_missing_column_families: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                              Options.db_log_dir: 
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                                 Options.wal_dir: db.wal
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.table_cache_numshardbits: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                         Options.WAL_ttl_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.WAL_size_limit_MB: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.manifest_preallocation_size: 4194304
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                     Options.is_fd_close_on_exec: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.advise_random_on_open: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.db_write_buffer_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.write_buffer_manager: 0x556f32470a00
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.access_hint_on_compaction_start: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                      Options.use_adaptive_mutex: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                            Options.rate_limiter: (nil)
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.wal_recovery_mode: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.enable_thread_tracking: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.enable_pipelined_write: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.unordered_write: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.write_thread_max_yield_usec: 100
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.row_cache: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                              Options.wal_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.avoid_flush_during_recovery: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.allow_ingest_behind: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.two_write_queues: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.manual_wal_flush: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.wal_compression: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.atomic_flush: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.persist_stats_to_disk: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.write_dbid_to_manifest: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.log_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.best_efforts_recovery: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.allow_data_in_errors: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.db_host_id: __hostname__
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.enforce_single_del_contracts: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.max_background_jobs: 4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.max_background_compactions: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.max_subcompactions: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.writable_file_max_buffer_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.delayed_write_rate : 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.max_total_wal_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.stats_dump_period_sec: 600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.stats_persist_period_sec: 600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.max_open_files: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.bytes_per_sync: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                      Options.wal_bytes_per_sync: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.strict_bytes_per_sync: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.compaction_readahead_size: 2097152
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.max_background_flushes: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Compression algorithms supported:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kZSTD supported: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kXpressCompression supported: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kBZip2Compression supported: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kZSTDNotFinalCompression supported: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kLZ4Compression supported: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kZlibCompression supported: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kLZ4HCCompression supported: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         kSnappyCompression supported: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Fast CRC32 supported: Supported on x86
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: DMutex implementation: pthread_mutex_t
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379640)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379640)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379640)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379640)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379640)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379640)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379640)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f31599350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379a80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f315989b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379a80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f315989b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:           Options.merge_operator: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.compaction_filter_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.sst_partitioner_factory: None
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.memtable_factory: SkipListFactory
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.table_factory: BlockBasedTable
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f32379a80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x556f315989b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.write_buffer_size: 16777216
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.max_write_buffer_number: 64
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.compression: LZ4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression: Disabled
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.num_levels: 7
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:            Options.compression_opts.window_bits: -14
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.level: 32767
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.compression_opts.strategy: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.parallel_threads: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                  Options.compression_opts.enabled: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:              Options.level0_stop_writes_trigger: 36
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.target_file_size_base: 67108864
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:             Options.target_file_size_multiplier: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.arena_block_size: 1048576
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.disable_auto_compactions: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.inplace_update_support: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                 Options.inplace_update_num_locks: 10000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:               Options.memtable_whole_key_filtering: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:   Options.memtable_huge_page_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.bloom_locality: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                    Options.max_successive_merges: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.optimize_filters_for_hits: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.paranoid_file_checks: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.force_consistency_checks: 1
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.report_bg_io_stats: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                               Options.ttl: 2592000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.periodic_compaction_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:    Options.preserve_internal_time_seconds: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                       Options.enable_blob_files: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                           Options.min_blob_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                          Options.blob_file_size: 268435456
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                   Options.blob_compression_type: NoCompression
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.enable_blob_garbage_collection: false
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:          Options.blob_compaction_readahead_size: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb:                Options.blob_file_starting_level: 0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 99ea9e38-6596-490a-bcfd-ba29b265991e
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759253902767784, "job": 1, "event": "recovery_started", "wal_files": [31]}
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759253902771392, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253902, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "99ea9e38-6596-490a-bcfd-ba29b265991e", "db_session_id": "8D7RONMQCJA4ELL0UAOO", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759253902774013, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253902, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "99ea9e38-6596-490a-bcfd-ba29b265991e", "db_session_id": "8D7RONMQCJA4ELL0UAOO", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759253902776956, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253902, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "99ea9e38-6596-490a-bcfd-ba29b265991e", "db_session_id": "8D7RONMQCJA4ELL0UAOO", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759253902778581, "job": 1, "event": "recovery_finished"}
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x556f32574000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: DB pointer 0x556f3251e000
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super from 4, latest 4
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super done
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 17:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f315989b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f315989b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f315989b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Sep 30 17:38:22 compute-1 ceph-osd[78006]: bluestore.MempoolThread fragmentation_score=0.000017 took=0.000017s
Sep 30 17:38:22 compute-1 ceph-osd[78006]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Sep 30 17:38:22 compute-1 ceph-osd[78006]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Sep 30 17:38:22 compute-1 ceph-osd[78006]: _get_class not permitted to load lua
Sep 30 17:38:22 compute-1 ceph-osd[78006]: _get_class not permitted to load sdk
Sep 30 17:38:22 compute-1 ceph-osd[78006]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Sep 30 17:38:22 compute-1 ceph-osd[78006]: osd.1 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Sep 30 17:38:22 compute-1 ceph-osd[78006]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Sep 30 17:38:22 compute-1 ceph-osd[78006]: osd.1 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Sep 30 17:38:22 compute-1 ceph-osd[78006]: osd.1 0 load_pgs
Sep 30 17:38:22 compute-1 ceph-osd[78006]: osd.1 0 load_pgs opened 0 pgs
Sep 30 17:38:22 compute-1 ceph-osd[78006]: osd.1 0 log_to_monitors true
Sep 30 17:38:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1[78002]: 2025-09-30T17:38:22.801+0000 7fae061d4740 -1 osd.1 0 log_to_monitors true
Sep 30 17:38:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]} v 0)
Sep 30 17:38:23 compute-1 ceph-mon[75484]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.101:6800/4235751489,v1:192.168.122.101:6801/4235751489]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Sep 30 17:38:23 compute-1 ceph-mon[75484]: pgmap v36: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:23 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Sep 30 17:38:23 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Sep 30 17:38:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e6 e6: 2 total, 0 up, 2 in
Sep 30 17:38:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-1", "root=default"]} v 0)
Sep 30 17:38:24 compute-1 ceph-mon[75484]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.101:6800/4235751489,v1:192.168.122.101:6801/4235751489]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]: dispatch
Sep 30 17:38:24 compute-1 ceph-mon[75484]: pgmap v37: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:24 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:24 compute-1 ceph-mon[75484]: from='osd.1 [v2:192.168.122.101:6800/4235751489,v1:192.168.122.101:6801/4235751489]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Sep 30 17:38:24 compute-1 ceph-mon[75484]: from='osd.1 ' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Sep 30 17:38:24 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:24 compute-1 ceph-mon[75484]: from='osd.1 ' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Sep 30 17:38:24 compute-1 ceph-mon[75484]: from='osd.1 [v2:192.168.122.101:6800/4235751489,v1:192.168.122.101:6801/4235751489]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]: dispatch
Sep 30 17:38:24 compute-1 ceph-mon[75484]: osdmap e6: 2 total, 0 up, 2 in
Sep 30 17:38:24 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:24 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Sep 30 17:38:24 compute-1 ceph-mon[75484]: from='osd.1 ' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]: dispatch
Sep 30 17:38:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e6 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:38:25 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e7 e7: 2 total, 0 up, 2 in
Sep 30 17:38:25 compute-1 ceph-osd[78006]: osd.1 0 done with init, starting boot process
Sep 30 17:38:25 compute-1 ceph-osd[78006]: osd.1 0 start_boot
Sep 30 17:38:25 compute-1 ceph-osd[78006]: osd.1 0 maybe_override_options_for_qos osd_max_backfills set to 1
Sep 30 17:38:25 compute-1 ceph-osd[78006]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Sep 30 17:38:25 compute-1 ceph-osd[78006]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Sep 30 17:38:25 compute-1 ceph-osd[78006]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Sep 30 17:38:25 compute-1 ceph-osd[78006]: osd.1 0  bench count 12288000 bsize 4 KiB
Sep 30 17:38:26 compute-1 ceph-mon[75484]: purged_snaps scrub starts
Sep 30 17:38:26 compute-1 ceph-mon[75484]: purged_snaps scrub ok
Sep 30 17:38:26 compute-1 ceph-mon[75484]: pgmap v39: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:26 compute-1 ceph-mon[75484]: from='osd.1 ' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]': finished
Sep 30 17:38:26 compute-1 ceph-mon[75484]: osdmap e7: 2 total, 0 up, 2 in
Sep 30 17:38:26 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Sep 30 17:38:26 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:26 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:27 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Sep 30 17:38:27 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:27 compute-1 ceph-mon[75484]: pgmap v41: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:27 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Sep 30 17:38:27 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]} v 0)
Sep 30 17:38:27 compute-1 ceph-mon[75484]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/129916338,v1:192.168.122.100:6803/129916338]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Sep 30 17:38:28 compute-1 sudo[78700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:38:28 compute-1 sudo[78700]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:28 compute-1 sudo[78700]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:28 compute-1 sudo[78725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:38:28 compute-1 sudo[78725]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:28 compute-1 sudo[78725]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:28 compute-1 sudo[78750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 17:38:28 compute-1 sudo[78750]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:28 compute-1 ceph-mon[75484]: from='osd.0 [v2:192.168.122.100:6802/129916338,v1:192.168.122.100:6803/129916338]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Sep 30 17:38:28 compute-1 ceph-mon[75484]: from='osd.0 ' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Sep 30 17:38:28 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:28 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:28 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Sep 30 17:38:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e8 e8: 2 total, 0 up, 2 in
Sep 30 17:38:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]} v 0)
Sep 30 17:38:28 compute-1 ceph-mon[75484]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/129916338,v1:192.168.122.100:6803/129916338]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Sep 30 17:38:28 compute-1 podman[78846]: 2025-09-30 17:38:28.961070502 +0000 UTC m=+0.089633002 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Sep 30 17:38:29 compute-1 podman[78846]: 2025-09-30 17:38:29.051417622 +0000 UTC m=+0.179980062 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Sep 30 17:38:29 compute-1 ceph-osd[78006]: osd.1 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 38.298 iops: 9804.413 elapsed_sec: 0.306
Sep 30 17:38:29 compute-1 ceph-osd[78006]: log_channel(cluster) log [WRN] : OSD bench result of 9804.413295 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Sep 30 17:38:29 compute-1 ceph-osd[78006]: osd.1 0 waiting for initial osdmap
Sep 30 17:38:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1[78002]: 2025-09-30T17:38:29.202+0000 7fae02157640 -1 osd.1 0 waiting for initial osdmap
Sep 30 17:38:29 compute-1 ceph-osd[78006]: osd.1 8 crush map has features 288514050185494528, adjusting msgr requires for clients
Sep 30 17:38:29 compute-1 ceph-osd[78006]: osd.1 8 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Sep 30 17:38:29 compute-1 ceph-osd[78006]: osd.1 8 crush map has features 3314932999778484224, adjusting msgr requires for osds
Sep 30 17:38:29 compute-1 ceph-osd[78006]: osd.1 8 check_osdmap_features require_osd_release unknown -> squid
Sep 30 17:38:29 compute-1 ceph-osd[78006]: osd.1 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Sep 30 17:38:29 compute-1 ceph-osd[78006]: osd.1 8 set_numa_affinity not setting numa affinity
Sep 30 17:38:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-osd-1[78002]: 2025-09-30T17:38:29.225+0000 7fadfd77f640 -1 osd.1 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Sep 30 17:38:29 compute-1 ceph-osd[78006]: osd.1 8 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Sep 30 17:38:29 compute-1 sudo[78750]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:29 compute-1 sudo[78934]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:38:29 compute-1 sudo[78934]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:29 compute-1 sudo[78934]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:29 compute-1 sudo[78959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b -- inventory --format=json-pretty --filter-for-batch
Sep 30 17:38:29 compute-1 sudo[78959]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e8 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:38:29 compute-1 ceph-mon[75484]: pgmap v42: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Sep 30 17:38:29 compute-1 ceph-mon[75484]: from='osd.0 ' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Sep 30 17:38:29 compute-1 ceph-mon[75484]: osdmap e8: 2 total, 0 up, 2 in
Sep 30 17:38:29 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:29 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Sep 30 17:38:29 compute-1 ceph-mon[75484]: from='osd.0 [v2:192.168.122.100:6802/129916338,v1:192.168.122.100:6803/129916338]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Sep 30 17:38:29 compute-1 ceph-mon[75484]: from='osd.0 ' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Sep 30 17:38:29 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:29 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:29 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Sep 30 17:38:29 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:29 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e9 e9: 2 total, 1 up, 2 in
Sep 30 17:38:29 compute-1 ceph-osd[78006]: osd.1 9 state: booting -> active
Sep 30 17:38:29 compute-1 podman[79023]: 2025-09-30 17:38:29.868283418 +0000 UTC m=+0.044704088 container create 451ca36bcddfe05e681086544e0c3f467d5c28bc8a1c50e9e39c72f05adb5ef8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_davinci, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:38:29 compute-1 systemd[1]: Started libpod-conmon-451ca36bcddfe05e681086544e0c3f467d5c28bc8a1c50e9e39c72f05adb5ef8.scope.
Sep 30 17:38:29 compute-1 podman[79023]: 2025-09-30 17:38:29.850180745 +0000 UTC m=+0.026601435 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:29 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:29 compute-1 podman[79023]: 2025-09-30 17:38:29.96383963 +0000 UTC m=+0.140260310 container init 451ca36bcddfe05e681086544e0c3f467d5c28bc8a1c50e9e39c72f05adb5ef8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_davinci, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:38:29 compute-1 podman[79023]: 2025-09-30 17:38:29.970144562 +0000 UTC m=+0.146565222 container start 451ca36bcddfe05e681086544e0c3f467d5c28bc8a1c50e9e39c72f05adb5ef8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_davinci, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:38:29 compute-1 podman[79023]: 2025-09-30 17:38:29.973466933 +0000 UTC m=+0.149887613 container attach 451ca36bcddfe05e681086544e0c3f467d5c28bc8a1c50e9e39c72f05adb5ef8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_davinci, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Sep 30 17:38:29 compute-1 sleepy_davinci[79039]: 167 167
Sep 30 17:38:29 compute-1 systemd[1]: libpod-451ca36bcddfe05e681086544e0c3f467d5c28bc8a1c50e9e39c72f05adb5ef8.scope: Deactivated successfully.
Sep 30 17:38:29 compute-1 podman[79023]: 2025-09-30 17:38:29.97629702 +0000 UTC m=+0.152717710 container died 451ca36bcddfe05e681086544e0c3f467d5c28bc8a1c50e9e39c72f05adb5ef8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_davinci, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:38:30 compute-1 systemd[1]: var-lib-containers-storage-overlay-b94accd4f7e41cd79eab08440805bb6827d659e6c5400d1d07f00bbd5db47852-merged.mount: Deactivated successfully.
Sep 30 17:38:30 compute-1 podman[79023]: 2025-09-30 17:38:30.022051356 +0000 UTC m=+0.198472016 container remove 451ca36bcddfe05e681086544e0c3f467d5c28bc8a1c50e9e39c72f05adb5ef8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_davinci, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Sep 30 17:38:30 compute-1 systemd[1]: libpod-conmon-451ca36bcddfe05e681086544e0c3f467d5c28bc8a1c50e9e39c72f05adb5ef8.scope: Deactivated successfully.
Sep 30 17:38:30 compute-1 podman[79063]: 2025-09-30 17:38:30.207425454 +0000 UTC m=+0.049668874 container create 00c424ed0488a7fa93e29cc268acd8cd38b6be54b5d2df6ae3c1f0752523fd7a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_lamarr, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Sep 30 17:38:30 compute-1 systemd[1]: Started libpod-conmon-00c424ed0488a7fa93e29cc268acd8cd38b6be54b5d2df6ae3c1f0752523fd7a.scope.
Sep 30 17:38:30 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33f32ab1568fdbf2bfdb22e4becb16c850500c498bdb02a3b4ddc0b38cac4c7c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:30 compute-1 podman[79063]: 2025-09-30 17:38:30.189933648 +0000 UTC m=+0.032177098 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33f32ab1568fdbf2bfdb22e4becb16c850500c498bdb02a3b4ddc0b38cac4c7c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33f32ab1568fdbf2bfdb22e4becb16c850500c498bdb02a3b4ddc0b38cac4c7c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33f32ab1568fdbf2bfdb22e4becb16c850500c498bdb02a3b4ddc0b38cac4c7c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Sep 30 17:38:30 compute-1 podman[79063]: 2025-09-30 17:38:30.29875283 +0000 UTC m=+0.140996270 container init 00c424ed0488a7fa93e29cc268acd8cd38b6be54b5d2df6ae3c1f0752523fd7a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_lamarr, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Sep 30 17:38:30 compute-1 podman[79063]: 2025-09-30 17:38:30.307280162 +0000 UTC m=+0.149523592 container start 00c424ed0488a7fa93e29cc268acd8cd38b6be54b5d2df6ae3c1f0752523fd7a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_lamarr, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Sep 30 17:38:30 compute-1 podman[79063]: 2025-09-30 17:38:30.310847519 +0000 UTC m=+0.153090949 container attach 00c424ed0488a7fa93e29cc268acd8cd38b6be54b5d2df6ae3c1f0752523fd7a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_lamarr, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Sep 30 17:38:30 compute-1 ceph-mon[75484]: OSD bench result of 9804.413295 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Sep 30 17:38:30 compute-1 ceph-mon[75484]: from='osd.0 ' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Sep 30 17:38:30 compute-1 ceph-mon[75484]: osd.1 [v2:192.168.122.101:6800/4235751489,v1:192.168.122.101:6801/4235751489] boot
Sep 30 17:38:30 compute-1 ceph-mon[75484]: osdmap e9: 2 total, 1 up, 2 in
Sep 30 17:38:30 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:30 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Sep 30 17:38:30 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:30 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e10 e10: 2 total, 1 up, 2 in
Sep 30 17:38:31 compute-1 tender_lamarr[79079]: [
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:     {
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:         "available": false,
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:         "being_replaced": false,
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:         "ceph_device_lvm": false,
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:         "device_id": "QEMU_DVD-ROM_QM00001",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:         "lsm_data": {},
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:         "lvs": [],
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:         "path": "/dev/sr0",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:         "rejected_reasons": [
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "Insufficient space (<5GB)",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "Has a FileSystem"
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:         ],
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:         "sys_api": {
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "actuators": null,
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "device_nodes": [
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:                 "sr0"
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             ],
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "devname": "sr0",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "human_readable_size": "482.00 KB",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "id_bus": "ata",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "model": "QEMU DVD-ROM",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "nr_requests": "2",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "parent": "/dev/sr0",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "partitions": {},
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "path": "/dev/sr0",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "removable": "1",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "rev": "2.5+",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "ro": "0",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "rotational": "0",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "sas_address": "",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "sas_device_handle": "",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "scheduler_mode": "mq-deadline",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "sectors": 0,
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "sectorsize": "2048",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "size": 493568.0,
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "support_discard": "2048",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "type": "disk",
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:             "vendor": "QEMU"
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:         }
Sep 30 17:38:31 compute-1 tender_lamarr[79079]:     }
Sep 30 17:38:31 compute-1 tender_lamarr[79079]: ]
Sep 30 17:38:31 compute-1 systemd[1]: libpod-00c424ed0488a7fa93e29cc268acd8cd38b6be54b5d2df6ae3c1f0752523fd7a.scope: Deactivated successfully.
Sep 30 17:38:31 compute-1 podman[79063]: 2025-09-30 17:38:31.142518808 +0000 UTC m=+0.984762298 container died 00c424ed0488a7fa93e29cc268acd8cd38b6be54b5d2df6ae3c1f0752523fd7a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_lamarr, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:38:31 compute-1 systemd[1]: var-lib-containers-storage-overlay-33f32ab1568fdbf2bfdb22e4becb16c850500c498bdb02a3b4ddc0b38cac4c7c-merged.mount: Deactivated successfully.
Sep 30 17:38:31 compute-1 podman[79063]: 2025-09-30 17:38:31.190924976 +0000 UTC m=+1.033168396 container remove 00c424ed0488a7fa93e29cc268acd8cd38b6be54b5d2df6ae3c1f0752523fd7a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_lamarr, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:38:31 compute-1 systemd[1]: libpod-conmon-00c424ed0488a7fa93e29cc268acd8cd38b6be54b5d2df6ae3c1f0752523fd7a.scope: Deactivated successfully.
Sep 30 17:38:31 compute-1 sudo[78959]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:31 compute-1 ceph-mon[75484]: purged_snaps scrub starts
Sep 30 17:38:31 compute-1 ceph-mon[75484]: purged_snaps scrub ok
Sep 30 17:38:31 compute-1 ceph-mon[75484]: pgmap v45: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
Sep 30 17:38:31 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:31 compute-1 ceph-mon[75484]: osdmap e10: 2 total, 1 up, 2 in
Sep 30 17:38:31 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:31 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Sep 30 17:38:31 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:31 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:31 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Sep 30 17:38:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e11 e11: 2 total, 1 up, 2 in
Sep 30 17:38:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e11 crush map has features 3314933000852226048, adjusting msgr requires
Sep 30 17:38:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e11 crush map has features 288514051259236352, adjusting msgr requires
Sep 30 17:38:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e11 crush map has features 288514051259236352, adjusting msgr requires
Sep 30 17:38:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e11 crush map has features 288514051259236352, adjusting msgr requires
Sep 30 17:38:32 compute-1 ceph-osd[78006]: osd.1 11 crush map has features 288514051259236352, adjusting msgr requires for clients
Sep 30 17:38:32 compute-1 ceph-osd[78006]: osd.1 11 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Sep 30 17:38:32 compute-1 ceph-osd[78006]: osd.1 11 crush map has features 3314933000852226048, adjusting msgr requires for osds
Sep 30 17:38:33 compute-1 ceph-mon[75484]: Adjusting osd_memory_target on compute-1 to 127.8M
Sep 30 17:38:33 compute-1 ceph-mon[75484]: Unable to set osd_memory_target on compute-1 to 134071500: error parsing value: Value '134071500' is below minimum 939524096
Sep 30 17:38:33 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:33 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Sep 30 17:38:33 compute-1 ceph-mon[75484]: osdmap e11: 2 total, 1 up, 2 in
Sep 30 17:38:33 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:33 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Sep 30 17:38:33 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:33 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e12 e12: 2 total, 1 up, 2 in
Sep 30 17:38:33 compute-1 sudo[80171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Sep 30 17:38:33 compute-1 sudo[80171]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:33 compute-1 sudo[80171]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:33 compute-1 sudo[80196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph
Sep 30 17:38:33 compute-1 sudo[80196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:33 compute-1 sudo[80196]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:33 compute-1 sudo[80221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:38:33 compute-1 sudo[80221]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:33 compute-1 sudo[80221]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:33 compute-1 sudo[80246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:38:33 compute-1 sudo[80246]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:33 compute-1 sudo[80246]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:33 compute-1 sudo[80271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:38:33 compute-1 sudo[80271]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:33 compute-1 sudo[80271]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:34 compute-1 sudo[80319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:38:34 compute-1 sudo[80319]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:34 compute-1 sudo[80319]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:34 compute-1 sudo[80344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:38:34 compute-1 sudo[80344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:34 compute-1 sudo[80344]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:34 compute-1 sudo[80369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Sep 30 17:38:34 compute-1 sudo[80369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:34 compute-1 sudo[80369]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:34 compute-1 ceph-mon[75484]: pgmap v48: 1 pgs: 1 unknown; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
Sep 30 17:38:34 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:34 compute-1 ceph-mon[75484]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Sep 30 17:38:34 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:34 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Sep 30 17:38:34 compute-1 ceph-mon[75484]: osdmap e12: 2 total, 1 up, 2 in
Sep 30 17:38:34 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:34 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:34 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Sep 30 17:38:34 compute-1 ceph-mon[75484]: Adjusting osd_memory_target on compute-0 to 127.8M
Sep 30 17:38:34 compute-1 sudo[80394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:38:34 compute-1 ceph-mon[75484]: Unable to set osd_memory_target on compute-0 to 134071500: error parsing value: Value '134071500' is below minimum 939524096
Sep 30 17:38:34 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:34 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:38:34 compute-1 ceph-mon[75484]: Updating compute-0:/etc/ceph/ceph.conf
Sep 30 17:38:34 compute-1 ceph-mon[75484]: Updating compute-1:/etc/ceph/ceph.conf
Sep 30 17:38:34 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:34 compute-1 sudo[80394]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:34 compute-1 sudo[80394]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:34 compute-1 sudo[80419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:38:34 compute-1 sudo[80419]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:34 compute-1 sudo[80419]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:34 compute-1 sudo[80444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:38:34 compute-1 sudo[80444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:34 compute-1 sudo[80444]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:34 compute-1 sudo[80469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:38:34 compute-1 sudo[80469]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:34 compute-1 sudo[80469]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e12 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:38:34 compute-1 sudo[80494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:38:34 compute-1 sudo[80494]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:34 compute-1 sudo[80494]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:34 compute-1 sudo[80542]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:38:34 compute-1 sudo[80542]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:34 compute-1 sudo[80542]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:34 compute-1 sudo[80567]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:38:34 compute-1 sudo[80567]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:34 compute-1 sudo[80567]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:34 compute-1 sudo[80592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:38:34 compute-1 sudo[80592]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:34 compute-1 sudo[80592]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:35 compute-1 ceph-mon[75484]: Updating compute-0:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:38:35 compute-1 ceph-mon[75484]: Updating compute-1:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:38:35 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:35 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:35 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:35 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:35 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:35 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:35 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:38:35 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:38:35 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:36 compute-1 ceph-mon[75484]: pgmap v50: 1 pgs: 1 unknown; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
Sep 30 17:38:36 compute-1 ceph-mon[75484]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Sep 30 17:38:36 compute-1 ceph-mon[75484]: Cluster is now healthy
Sep 30 17:38:36 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:37 compute-1 ceph-mon[75484]: pgmap v51: 1 pgs: 1 unknown; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
Sep 30 17:38:37 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3985569669' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Sep 30 17:38:37 compute-1 ceph-mon[75484]: OSD bench result of 3935.746627 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Sep 30 17:38:37 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:37 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e13 e13: 2 total, 2 up, 2 in
Sep 30 17:38:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e14 e14: 2 total, 2 up, 2 in
Sep 30 17:38:39 compute-1 ceph-mon[75484]: osd.0 [v2:192.168.122.100:6802/129916338,v1:192.168.122.100:6803/129916338] boot
Sep 30 17:38:39 compute-1 ceph-mon[75484]: osdmap e13: 2 total, 2 up, 2 in
Sep 30 17:38:39 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:38:39 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4127359789' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Sep 30 17:38:39 compute-1 ceph-mon[75484]: log_channel(audit) log [INF] : from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Sep 30 17:38:39 compute-1 sudo[80618]:     ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vda
Sep 30 17:38:39 compute-1 sudo[80618]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 17:38:39 compute-1 sudo[80618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=167)
Sep 30 17:38:39 compute-1 sudo[80618]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:39 compute-1 ceph-mon[75484]: log_channel(audit) log [INF] : from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Sep 30 17:38:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e14 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:38:40 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e15 e15: 2 total, 2 up, 2 in
Sep 30 17:38:40 compute-1 ceph-mon[75484]: pgmap v53: 1 pgs: 1 unknown; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
Sep 30 17:38:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4127359789' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Sep 30 17:38:40 compute-1 ceph-mon[75484]: osdmap e14: 2 total, 2 up, 2 in
Sep 30 17:38:40 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Sep 30 17:38:40 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Sep 30 17:38:40 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Sep 30 17:38:40 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Sep 30 17:38:40 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Sep 30 17:38:40 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Sep 30 17:38:40 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Sep 30 17:38:40 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Sep 30 17:38:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2260967211' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Sep 30 17:38:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2260967211' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Sep 30 17:38:40 compute-1 ceph-mon[75484]: osdmap e15: 2 total, 2 up, 2 in
Sep 30 17:38:40 compute-1 sudo[80621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:38:40 compute-1 sudo[80621]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:40 compute-1 sudo[80621]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:40 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 15 pg[3.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [1] r=0 lpr=15 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:38:41 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e16 e16: 2 total, 2 up, 2 in
Sep 30 17:38:41 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 16 pg[4.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=0/0 les/c/f=0/0/0 sis=16) [1] r=0 lpr=16 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:38:41 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 16 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [1] r=0 lpr=15 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:38:41 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:41 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:41 compute-1 ceph-mon[75484]: Reconfiguring mon.compute-0 (monmap changed)...
Sep 30 17:38:41 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Sep 30 17:38:41 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Sep 30 17:38:41 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:41 compute-1 ceph-mon[75484]: Reconfiguring daemon mon.compute-0 on compute-0
Sep 30 17:38:41 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:41 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:41 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.efvthf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Sep 30 17:38:41 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mgr services"}]: dispatch
Sep 30 17:38:41 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/411356327' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Sep 30 17:38:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/411356327' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Sep 30 17:38:41 compute-1 ceph-mon[75484]: osdmap e16: 2 total, 2 up, 2 in
Sep 30 17:38:42 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e17 e17: 2 total, 2 up, 2 in
Sep 30 17:38:42 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 17 pg[5.0( empty local-lis/les=0/0 n=0 ec=17/17 lis/c=0/0 les/c/f=0/0/0 sis=17) [1] r=0 lpr=17 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:38:42 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 17 pg[4.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=0/0 les/c/f=0/0/0 sis=16) [1] r=0 lpr=16 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:38:42 compute-1 ceph-mon[75484]: pgmap v56: 3 pgs: 2 unknown, 1 creating+peering; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:38:42 compute-1 ceph-mon[75484]: Reconfiguring mgr.compute-0.efvthf (monmap changed)...
Sep 30 17:38:42 compute-1 ceph-mon[75484]: Reconfiguring daemon mgr.compute-0.efvthf on compute-0
Sep 30 17:38:42 compute-1 ceph-mon[75484]: Health check failed: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Sep 30 17:38:42 compute-1 ceph-mon[75484]: mgrmap e9: compute-0.efvthf(active, since 100s), standbys: compute-1.glbusf
Sep 30 17:38:42 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:42 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:42 compute-1 ceph-mon[75484]: Reconfiguring crash.compute-0 (monmap changed)...
Sep 30 17:38:42 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Sep 30 17:38:42 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:42 compute-1 ceph-mon[75484]: Reconfiguring daemon crash.compute-0 on compute-0
Sep 30 17:38:42 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3276171411' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Sep 30 17:38:42 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3276171411' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Sep 30 17:38:42 compute-1 ceph-mon[75484]: osdmap e17: 2 total, 2 up, 2 in
Sep 30 17:38:42 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:42 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:42 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Sep 30 17:38:42 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Sep 30 17:38:42 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:42 compute-1 sudo[80646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:38:42 compute-1 sudo[80646]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:42 compute-1 sudo[80646]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:42 compute-1 sudo[80671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:38:42 compute-1 sudo[80671]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:42 compute-1 podman[80713]: 2025-09-30 17:38:42.6343316 +0000 UTC m=+0.054415322 container create b750ad304c1742fba678dfed2c6ce2a884279178a874b6d23dac810ae3fb40c3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_wright, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Sep 30 17:38:42 compute-1 systemd[1]: Started libpod-conmon-b750ad304c1742fba678dfed2c6ce2a884279178a874b6d23dac810ae3fb40c3.scope.
Sep 30 17:38:42 compute-1 podman[80713]: 2025-09-30 17:38:42.607587822 +0000 UTC m=+0.027671654 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:42 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:42 compute-1 podman[80713]: 2025-09-30 17:38:42.730821658 +0000 UTC m=+0.150905430 container init b750ad304c1742fba678dfed2c6ce2a884279178a874b6d23dac810ae3fb40c3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_wright, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:38:42 compute-1 podman[80713]: 2025-09-30 17:38:42.737271854 +0000 UTC m=+0.157355596 container start b750ad304c1742fba678dfed2c6ce2a884279178a874b6d23dac810ae3fb40c3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_wright, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:38:42 compute-1 podman[80713]: 2025-09-30 17:38:42.741422257 +0000 UTC m=+0.161505989 container attach b750ad304c1742fba678dfed2c6ce2a884279178a874b6d23dac810ae3fb40c3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_wright, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:38:42 compute-1 quizzical_wright[80730]: 167 167
Sep 30 17:38:42 compute-1 systemd[1]: libpod-b750ad304c1742fba678dfed2c6ce2a884279178a874b6d23dac810ae3fb40c3.scope: Deactivated successfully.
Sep 30 17:38:42 compute-1 podman[80713]: 2025-09-30 17:38:42.742447075 +0000 UTC m=+0.162530837 container died b750ad304c1742fba678dfed2c6ce2a884279178a874b6d23dac810ae3fb40c3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_wright, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Sep 30 17:38:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-8fba7bf8b9b179883c903869c0f869e345ea531cfdfb5363dd87eb0c42106361-merged.mount: Deactivated successfully.
Sep 30 17:38:42 compute-1 podman[80713]: 2025-09-30 17:38:42.782409783 +0000 UTC m=+0.202493535 container remove b750ad304c1742fba678dfed2c6ce2a884279178a874b6d23dac810ae3fb40c3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_wright, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Sep 30 17:38:42 compute-1 systemd[1]: libpod-conmon-b750ad304c1742fba678dfed2c6ce2a884279178a874b6d23dac810ae3fb40c3.scope: Deactivated successfully.
Sep 30 17:38:42 compute-1 sudo[80671]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:42 compute-1 sudo[80746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:38:42 compute-1 sudo[80746]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:42 compute-1 sudo[80746]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:43 compute-1 sudo[80771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:38:43 compute-1 sudo[80771]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e18 e18: 2 total, 2 up, 2 in
Sep 30 17:38:43 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 18 pg[5.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=0/0 les/c/f=0/0/0 sis=17) [1] r=0 lpr=17 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:38:43 compute-1 ceph-mon[75484]: Reconfiguring mon.compute-1 (monmap changed)...
Sep 30 17:38:43 compute-1 ceph-mon[75484]: Reconfiguring daemon mon.compute-1 on compute-1
Sep 30 17:38:43 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:43 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:43 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.glbusf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Sep 30 17:38:43 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mgr services"}]: dispatch
Sep 30 17:38:43 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:43 compute-1 ceph-mon[75484]: osdmap e18: 2 total, 2 up, 2 in
Sep 30 17:38:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1184914315' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Sep 30 17:38:43 compute-1 podman[80811]: 2025-09-30 17:38:43.326172221 +0000 UTC m=+0.047091623 container create a5a08860b251b2e64df3ab6c7d8ee828cbf8aa86f56bf73a97ecfa39a27dd2bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_gould, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:38:43 compute-1 systemd[1]: Started libpod-conmon-a5a08860b251b2e64df3ab6c7d8ee828cbf8aa86f56bf73a97ecfa39a27dd2bd.scope.
Sep 30 17:38:43 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:38:43 compute-1 podman[80811]: 2025-09-30 17:38:43.301824538 +0000 UTC m=+0.022744040 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:38:43 compute-1 podman[80811]: 2025-09-30 17:38:43.415240147 +0000 UTC m=+0.136159569 container init a5a08860b251b2e64df3ab6c7d8ee828cbf8aa86f56bf73a97ecfa39a27dd2bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_gould, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Sep 30 17:38:43 compute-1 podman[80811]: 2025-09-30 17:38:43.422142545 +0000 UTC m=+0.143061947 container start a5a08860b251b2e64df3ab6c7d8ee828cbf8aa86f56bf73a97ecfa39a27dd2bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_gould, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 17:38:43 compute-1 podman[80811]: 2025-09-30 17:38:43.42526473 +0000 UTC m=+0.146184192 container attach a5a08860b251b2e64df3ab6c7d8ee828cbf8aa86f56bf73a97ecfa39a27dd2bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_gould, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:38:43 compute-1 focused_gould[80827]: 167 167
Sep 30 17:38:43 compute-1 systemd[1]: libpod-a5a08860b251b2e64df3ab6c7d8ee828cbf8aa86f56bf73a97ecfa39a27dd2bd.scope: Deactivated successfully.
Sep 30 17:38:43 compute-1 podman[80811]: 2025-09-30 17:38:43.427738247 +0000 UTC m=+0.148657659 container died a5a08860b251b2e64df3ab6c7d8ee828cbf8aa86f56bf73a97ecfa39a27dd2bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_gould, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Sep 30 17:38:43 compute-1 systemd[1]: var-lib-containers-storage-overlay-e5acc5c14b4ed8188125bc1101e9625840fab2293495165c82b870e1fb64bcc9-merged.mount: Deactivated successfully.
Sep 30 17:38:43 compute-1 podman[80811]: 2025-09-30 17:38:43.465089514 +0000 UTC m=+0.186008966 container remove a5a08860b251b2e64df3ab6c7d8ee828cbf8aa86f56bf73a97ecfa39a27dd2bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_gould, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Sep 30 17:38:43 compute-1 systemd[1]: libpod-conmon-a5a08860b251b2e64df3ab6c7d8ee828cbf8aa86f56bf73a97ecfa39a27dd2bd.scope: Deactivated successfully.
Sep 30 17:38:43 compute-1 sudo[80771]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:43 compute-1 sudo[80844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:38:43 compute-1 sudo[80844]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:43 compute-1 sudo[80844]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:43 compute-1 sudo[80869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 17:38:43 compute-1 sudo[80869]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e19 e19: 2 total, 2 up, 2 in
Sep 30 17:38:44 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 19 pg[6.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [1] r=0 lpr=19 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:38:44 compute-1 podman[80967]: 2025-09-30 17:38:44.362308748 +0000 UTC m=+0.071892609 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1)
Sep 30 17:38:44 compute-1 podman[80967]: 2025-09-30 17:38:44.486914822 +0000 UTC m=+0.196498683 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:38:44 compute-1 ceph-mon[75484]: pgmap v59: 5 pgs: 4 unknown, 1 creating+peering; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:38:44 compute-1 ceph-mon[75484]: Reconfiguring mgr.compute-1.glbusf (monmap changed)...
Sep 30 17:38:44 compute-1 ceph-mon[75484]: Reconfiguring daemon mgr.compute-1.glbusf on compute-1
Sep 30 17:38:44 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:44 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1184914315' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Sep 30 17:38:44 compute-1 ceph-mon[75484]: osdmap e19: 2 total, 2 up, 2 in
Sep 30 17:38:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e19 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:38:44 compute-1 sudo[80869]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:45 compute-1 sudo[81054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:38:45 compute-1 sudo[81054]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:45 compute-1 sudo[81054]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:45 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e20 e20: 2 total, 2 up, 2 in
Sep 30 17:38:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 20 pg[6.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [1] r=0 lpr=19 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:38:45 compute-1 sudo[81079]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:38:45 compute-1 sudo[81079]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:45 compute-1 sudo[81079]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:45 compute-1 ceph-mon[75484]: pgmap v62: 6 pgs: 2 active+clean, 4 unknown; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:38:45 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:45 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:45 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4068993854' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Sep 30 17:38:45 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4068993854' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Sep 30 17:38:45 compute-1 ceph-mon[75484]: osdmap e20: 2 total, 2 up, 2 in
Sep 30 17:38:45 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:45 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:38:45 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:45 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:38:45 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:38:45 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:45 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/474181382' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Sep 30 17:38:46 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e21 e21: 2 total, 2 up, 2 in
Sep 30 17:38:47 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/474181382' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Sep 30 17:38:47 compute-1 ceph-mon[75484]: osdmap e21: 2 total, 2 up, 2 in
Sep 30 17:38:47 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e22 e22: 2 total, 2 up, 2 in
Sep 30 17:38:48 compute-1 ceph-mon[75484]: pgmap v65: 7 pgs: 6 active+clean, 1 unknown; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:38:48 compute-1 ceph-mon[75484]: osdmap e22: 2 total, 2 up, 2 in
Sep 30 17:38:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/449678829' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Sep 30 17:38:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e23 e23: 2 total, 2 up, 2 in
Sep 30 17:38:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/449678829' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Sep 30 17:38:49 compute-1 ceph-mon[75484]: osdmap e23: 2 total, 2 up, 2 in
Sep 30 17:38:49 compute-1 ceph-mon[75484]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Sep 30 17:38:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e23 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:38:50 compute-1 ceph-mon[75484]: pgmap v68: 7 pgs: 6 active+clean, 1 unknown; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:38:50 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3927088647' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Sep 30 17:38:50 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e24 e24: 2 total, 2 up, 2 in
Sep 30 17:38:50 compute-1 sudo[81135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:38:50 compute-1 sudo[81135]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:50 compute-1 sudo[81135]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:51 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3927088647' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Sep 30 17:38:51 compute-1 ceph-mon[75484]: osdmap e24: 2 total, 2 up, 2 in
Sep 30 17:38:51 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:51 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:51 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3683221879' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Sep 30 17:38:51 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e25 e25: 2 total, 2 up, 2 in
Sep 30 17:38:52 compute-1 ceph-mon[75484]: pgmap v70: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:38:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3683221879' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Sep 30 17:38:52 compute-1 ceph-mon[75484]: osdmap e25: 2 total, 2 up, 2 in
Sep 30 17:38:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4246258036' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Sep 30 17:38:52 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e26 e26: 2 total, 2 up, 2 in
Sep 30 17:38:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4246258036' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Sep 30 17:38:53 compute-1 ceph-mon[75484]: osdmap e26: 2 total, 2 up, 2 in
Sep 30 17:38:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e27 e27: 2 total, 2 up, 2 in
Sep 30 17:38:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e27 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:38:54 compute-1 ceph-mon[75484]: pgmap v73: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:38:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1098035916' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Sep 30 17:38:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1098035916' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Sep 30 17:38:54 compute-1 ceph-mon[75484]: osdmap e27: 2 total, 2 up, 2 in
Sep 30 17:38:54 compute-1 ceph-mon[75484]: Health check update: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Sep 30 17:38:55 compute-1 ceph-mon[75484]: pgmap v75: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:38:55 compute-1 ceph-mon[75484]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Sep 30 17:38:55 compute-1 ceph-mon[75484]: Cluster is now healthy
Sep 30 17:38:57 compute-1 sudo[81160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:38:57 compute-1 sudo[81160]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:57 compute-1 sudo[81160]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:57 compute-1 sudo[81185]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 17:38:57 compute-1 sudo[81185]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:38:57 compute-1 podman[81282]: 2025-09-30 17:38:57.669547201 +0000 UTC m=+0.052578993 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 17:38:57 compute-1 podman[81282]: 2025-09-30 17:38:57.786064034 +0000 UTC m=+0.169095836 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Sep 30 17:38:57 compute-1 ceph-mon[75484]: pgmap v76: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:38:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3634785233' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Sep 30 17:38:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3634785233' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Sep 30 17:38:58 compute-1 sudo[81185]: pam_unix(sudo:session): session closed for user root
Sep 30 17:38:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3768454336' entity='client.admin' 
Sep 30 17:38:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:38:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:38:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:38:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:38:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:59 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:38:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e27 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:39:00 compute-1 ceph-mon[75484]: pgmap v77: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:00 compute-1 ceph-mon[75484]: from='client.14272 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 17:39:00 compute-1 ceph-mon[75484]: Saving service rgw.rgw spec with placement compute-0;compute-1
Sep 30 17:39:00 compute-1 ceph-mon[75484]: Saving service ingress.rgw.default spec with placement count:2
Sep 30 17:39:01 compute-1 ceph-mon[75484]: pgmap v78: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:01 compute-1 ceph-mon[75484]: from='client.14276 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 17:39:01 compute-1 ceph-mon[75484]: Saving service node-exporter spec with placement *
Sep 30 17:39:01 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:01 compute-1 ceph-mon[75484]: Saving service grafana spec with placement compute-0;count:1
Sep 30 17:39:01 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:01 compute-1 ceph-mon[75484]: Saving service prometheus spec with placement compute-0;count:1
Sep 30 17:39:01 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:01 compute-1 ceph-mon[75484]: Saving service alertmanager spec with placement compute-0;count:1
Sep 30 17:39:01 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:01 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Sep 30 17:39:01 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e28 e28: 2 total, 2 up, 2 in
Sep 30 17:39:02 compute-1 sudo[81369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:39:02 compute-1 sudo[81369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:02 compute-1 sudo[81369]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:02 compute-1 sudo[81394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:39:02 compute-1 sudo[81394]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:02 compute-1 sudo[81394]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:02 compute-1 sudo[81419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 17:39:02 compute-1 sudo[81419]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e29 e29: 2 total, 2 up, 2 in
Sep 30 17:39:02 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Sep 30 17:39:02 compute-1 ceph-mon[75484]: osdmap e28: 2 total, 2 up, 2 in
Sep 30 17:39:02 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Sep 30 17:39:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3197899215' entity='client.admin' 
Sep 30 17:39:02 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:02 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:02 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3474613208' entity='client.admin' 
Sep 30 17:39:03 compute-1 podman[81517]: 2025-09-30 17:39:03.349989321 +0000 UTC m=+0.089266192 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Sep 30 17:39:03 compute-1 podman[81517]: 2025-09-30 17:39:03.460282076 +0000 UTC m=+0.199558887 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Sep 30 17:39:03 compute-1 sudo[81419]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e29 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:39:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e30 e30: 2 total, 2 up, 2 in
Sep 30 17:39:04 compute-1 ceph-mon[75484]: pgmap v80: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:04 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Sep 30 17:39:04 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:39:04 compute-1 ceph-mon[75484]: osdmap e29: 2 total, 2 up, 2 in
Sep 30 17:39:04 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Sep 30 17:39:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2657039316' entity='client.admin' 
Sep 30 17:39:04 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:39:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:39:05 compute-1 ceph-mon[75484]: pgmap v82: 38 pgs: 1 peering, 31 unknown, 6 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Sep 30 17:39:05 compute-1 ceph-mon[75484]: osdmap e30: 2 total, 2 up, 2 in
Sep 30 17:39:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Sep 30 17:39:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:39:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:39:05 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:39:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/252688255' entity='client.admin' 
Sep 30 17:39:06 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e31 e31: 2 total, 2 up, 2 in
Sep 30 17:39:06 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 31 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=31 pruub=14.851297379s) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active pruub 58.251918793s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:06 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 31 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=31 pruub=14.851297379s) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown pruub 58.251918793s@ mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:06 compute-1 sudo[81627]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utguvkjwvahdwucarofwtehhdrerbqdm ; /usr/bin/python3'
Sep 30 17:39:06 compute-1 sudo[81627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:39:06 compute-1 python3[81629]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:39:06 compute-1 sudo[81627]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:07 compute-1 ceph-mon[75484]: 2.1c scrub starts
Sep 30 17:39:07 compute-1 ceph-mon[75484]: 2.1c scrub ok
Sep 30 17:39:07 compute-1 ceph-mon[75484]: 2.1f scrub starts
Sep 30 17:39:07 compute-1 ceph-mon[75484]: 2.1f scrub ok
Sep 30 17:39:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:39:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Sep 30 17:39:07 compute-1 ceph-mon[75484]: osdmap e31: 2 total, 2 up, 2 in
Sep 30 17:39:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Sep 30 17:39:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:07 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e32 e32: 2 total, 2 up, 2 in
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.19( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.18( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.16( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.15( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.17( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.14( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.13( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.12( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.11( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.10( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.e( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.d( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.c( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.b( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.a( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.f( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.7( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[4.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=32 pruub=14.758745193s) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active pruub 59.262397766s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[5.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=32 pruub=15.769513130s) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active pruub 60.273235321s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.6( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.5( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.2( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.3( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.4( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.8( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.9( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1a( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1b( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1d( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1c( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1e( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1f( empty local-lis/les=15/16 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[4.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=32 pruub=14.758745193s) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown pruub 59.262397766s@ mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[5.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=32 pruub=15.769513130s) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown pruub 60.273235321s@ mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.18( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.19( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.15( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.16( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.17( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.14( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.13( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.12( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.11( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.d( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.10( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.e( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.b( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.0( empty local-lis/les=31/32 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.c( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.a( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.7( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.6( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.5( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.2( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.3( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.f( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.4( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1a( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.8( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.9( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1e( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1b( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1c( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1d( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 32 pg[3.1f( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [1] r=0 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:07 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.18 deep-scrub starts
Sep 30 17:39:07 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.18 deep-scrub ok
Sep 30 17:39:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e33 e33: 2 total, 2 up, 2 in
Sep 30 17:39:08 compute-1 ceph-mon[75484]: pgmap v85: 69 pgs: 1 peering, 62 unknown, 6 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:08 compute-1 ceph-mon[75484]: 2.1d scrub starts
Sep 30 17:39:08 compute-1 ceph-mon[75484]: 2.1d scrub ok
Sep 30 17:39:08 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Sep 30 17:39:08 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:39:08 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:39:08 compute-1 ceph-mon[75484]: osdmap e32: 2 total, 2 up, 2 in
Sep 30 17:39:08 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Sep 30 17:39:08 compute-1 ceph-mon[75484]: 3.18 deep-scrub starts
Sep 30 17:39:08 compute-1 ceph-mon[75484]: 3.18 deep-scrub ok
Sep 30 17:39:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2268718532' entity='client.admin' 
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1f( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1e( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1f( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1e( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.11( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.10( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.10( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.11( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.12( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.12( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.13( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.13( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.15( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.14( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.14( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.15( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.17( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.16( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.16( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.17( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.9( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.8( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.8( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.9( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.b( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.a( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.a( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.b( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.d( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.c( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.c( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.6( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.d( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.7( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.3( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.2( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.7( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.6( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.4( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.5( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.5( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.4( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.2( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.3( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.e( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.f( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.e( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1c( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1d( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.f( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1d( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1a( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1c( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1b( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1b( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1a( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.18( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.19( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.19( empty local-lis/les=17/18 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.18( empty local-lis/les=16/17 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1f( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.11( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1e( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.10( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1e( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.10( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1f( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.11( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.12( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.12( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.13( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.15( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.13( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.17( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.14( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.15( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.16( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.16( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.14( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.17( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.8( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.9( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.8( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.9( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.b( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.a( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.c( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.a( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.d( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.c( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.6( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.0( empty local-lis/les=32/33 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.7( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.d( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.0( empty local-lis/les=32/33 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.3( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.2( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.7( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.6( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.5( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.4( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.4( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.b( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.5( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.3( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.e( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.f( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1d( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1c( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.2( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.f( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.e( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1c( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1a( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1a( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.1b( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.18( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1b( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.19( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.19( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[4.18( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=16/16 les/c/f=17/17/0 sis=32) [1] r=0 lpr=32 pi=[16,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 33 pg[5.1d( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=17/17 les/c/f=18/18/0 sis=32) [1] r=0 lpr=32 pi=[17,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:08 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Sep 30 17:39:08 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Sep 30 17:39:09 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Sep 30 17:39:09 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Sep 30 17:39:10 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Sep 30 17:39:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:39:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e34 e34: 2 total, 2 up, 2 in
Sep 30 17:39:14 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Sep 30 17:39:14 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Sep 30 17:39:15 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.510864735s, txc = 0x556f3301b500
Sep 30 17:39:15 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Sep 30 17:39:15 compute-1 ceph-mon[75484]: 2.1b scrub starts
Sep 30 17:39:15 compute-1 ceph-mon[75484]: 2.1b scrub ok
Sep 30 17:39:15 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Sep 30 17:39:15 compute-1 ceph-mon[75484]: osdmap e33: 2 total, 2 up, 2 in
Sep 30 17:39:15 compute-1 ceph-mon[75484]: 3.15 scrub starts
Sep 30 17:39:15 compute-1 ceph-mon[75484]: 3.15 scrub ok
Sep 30 17:39:15 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:15 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:15 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/98408612' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Sep 30 17:39:15 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Sep 30 17:39:15 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 34 pg[6.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=34 pruub=9.501920700s) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active pruub 62.288440704s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:15 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Sep 30 17:39:15 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 34 pg[6.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=34 pruub=9.501920700s) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown pruub 62.288440704s@ mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Sep 30 17:39:16 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr handle_mgr_map respawning because set of enabled modules changed!
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn  e: '/usr/bin/ceph-mgr'
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn  0: '/usr/bin/ceph-mgr'
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn  1: '-n'
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn  2: 'mgr.compute-1.glbusf'
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn  3: '-f'
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn  4: '--setuser'
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn  5: 'ceph'
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn  6: '--setgroup'
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn  7: 'ceph'
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn  8: '--default-log-to-file=false'
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn  9: '--default-log-to-journald=true'
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn  10: '--default-log-to-stderr=false'
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr respawn  exe_path /proc/self/exe
Sep 30 17:39:16 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e35 e35: 2 total, 2 up, 2 in
Sep 30 17:39:16 compute-1 ceph-mon[75484]: pgmap v88: 131 pgs: 1 peering, 124 unknown, 6 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 2.a scrub starts
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 2.a scrub ok
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 3.16 scrub starts
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 3.16 scrub ok
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 2.9 scrub starts
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 2.9 scrub ok
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 3.19 scrub starts
Sep 30 17:39:16 compute-1 ceph-mon[75484]: pgmap v89: 131 pgs: 131 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 2.6 scrub starts
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 2.6 scrub ok
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 2.8 scrub starts
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 2.8 scrub ok
Sep 30 17:39:16 compute-1 ceph-mon[75484]: pgmap v90: 131 pgs: 131 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 2.7 scrub starts
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 2.7 scrub ok
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 2.1e deep-scrub starts
Sep 30 17:39:16 compute-1 ceph-mon[75484]: osdmap e34: 2 total, 2 up, 2 in
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 2.1e deep-scrub ok
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 3.19 scrub ok
Sep 30 17:39:16 compute-1 ceph-mon[75484]: 3.17 scrub starts
Sep 30 17:39:16 compute-1 ceph-mon[75484]: pgmap v92: 193 pgs: 62 unknown, 131 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/98408612' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:16 compute-1 ceph-mon[75484]: mgrmap e10: compute-0.efvthf(active, since 2m), standbys: compute-1.glbusf
Sep 30 17:39:16 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/577280954' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Sep 30 17:39:16 compute-1 sshd-session[72699]: Connection closed by 192.168.122.100 port 43740
Sep 30 17:39:16 compute-1 sshd-session[72498]: Connection closed by 192.168.122.100 port 43678
Sep 30 17:39:16 compute-1 sshd-session[72614]: Connection closed by 192.168.122.100 port 43716
Sep 30 17:39:16 compute-1 sshd-session[72672]: Connection closed by 192.168.122.100 port 43728
Sep 30 17:39:16 compute-1 sshd-session[72728]: Connection closed by 192.168.122.100 port 43748
Sep 30 17:39:16 compute-1 sshd-session[72527]: Connection closed by 192.168.122.100 port 43686
Sep 30 17:39:16 compute-1 sshd-session[72439]: Connection closed by 192.168.122.100 port 43640
Sep 30 17:39:16 compute-1 sshd-session[72469]: Connection closed by 192.168.122.100 port 43664
Sep 30 17:39:16 compute-1 sshd-session[72643]: Connection closed by 192.168.122.100 port 43722
Sep 30 17:39:16 compute-1 sshd-session[72440]: Connection closed by 192.168.122.100 port 43650
Sep 30 17:39:16 compute-1 sshd-session[72585]: Connection closed by 192.168.122.100 port 43714
Sep 30 17:39:16 compute-1 sshd-session[72556]: Connection closed by 192.168.122.100 port 43702
Sep 30 17:39:16 compute-1 sshd-session[72640]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:39:16 compute-1 sshd-session[72696]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:39:16 compute-1 sshd-session[72429]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:39:16 compute-1 sshd-session[72553]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:39:16 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Sep 30 17:39:16 compute-1 sshd-session[72466]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:39:16 compute-1 sshd-session[72725]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:39:16 compute-1 sshd-session[72669]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:39:16 compute-1 systemd[1]: session-33.scope: Deactivated successfully.
Sep 30 17:39:16 compute-1 sshd-session[72611]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:39:16 compute-1 systemd[1]: session-33.scope: Consumed 1min 8.438s CPU time.
Sep 30 17:39:16 compute-1 systemd[1]: session-32.scope: Deactivated successfully.
Sep 30 17:39:16 compute-1 systemd[1]: session-31.scope: Deactivated successfully.
Sep 30 17:39:16 compute-1 systemd[1]: session-30.scope: Deactivated successfully.
Sep 30 17:39:16 compute-1 systemd[1]: session-29.scope: Deactivated successfully.
Sep 30 17:39:16 compute-1 sshd-session[72495]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:39:16 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Sep 30 17:39:16 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Sep 30 17:39:16 compute-1 systemd-logind[789]: Session 27 logged out. Waiting for processes to exit.
Sep 30 17:39:16 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Sep 30 17:39:16 compute-1 systemd-logind[789]: Session 24 logged out. Waiting for processes to exit.
Sep 30 17:39:16 compute-1 systemd-logind[789]: Session 30 logged out. Waiting for processes to exit.
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.18( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.571019173s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.610008240s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.18( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.570968628s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.610008240s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.18( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.570459366s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609642029s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.18( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.570419312s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609642029s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.1b( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.1d( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.561338425s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.600799561s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.1a( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.570161819s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609649658s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.1d( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.561317444s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.600799561s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 systemd-logind[789]: Session 33 logged out. Waiting for processes to exit.
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.1a( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.570138931s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609649658s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.18( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.1b( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.570113182s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609786987s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 systemd-logind[789]: Session 25 logged out. Waiting for processes to exit.
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.1b( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.570093155s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609786987s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.1c( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.561097145s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.600952148s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.1b( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.569800377s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609664917s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.1c( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.561076164s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.600952148s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.1b( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.569776535s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609664917s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.19( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.1a( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.569576263s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609619141s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.1a( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.569461823s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609619141s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.1e( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 systemd-logind[789]: Session 23 logged out. Waiting for processes to exit.
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.1f( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.1a( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.560160637s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.600585938s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.1a( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.560134888s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.600585938s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.1c( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.568977356s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609466553s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.1c( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.568956375s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609466553s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.e( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.569030762s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609657288s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.9( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.560141563s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.600791931s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.e( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.569008827s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609657288s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.9( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.560122490s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.600791931s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 systemd-logind[789]: Session 29 logged out. Waiting for processes to exit.
Sep 30 17:39:16 compute-1 systemd-logind[789]: Session 31 logged out. Waiting for processes to exit.
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.c( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.f( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.568645477s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609481812s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.f( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.568624496s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609481812s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.d( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.e( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.568283081s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609367371s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.e( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.568261147s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609367371s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.2( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.567934036s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609260559s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.2( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.567913055s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609260559s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.1( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.3( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.558706284s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.600326538s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.3( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.558683395s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.600326538s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.5( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.567421913s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609321594s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.5( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.567399025s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609321594s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.4( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.567291260s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609252930s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.4( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.567269325s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609252930s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.6( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.7( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.4( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.7( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.566841125s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609191895s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.7( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.566791534s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609191895s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.5( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.557697296s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.600151062s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.1( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.566636086s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.609153748s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.5( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.557656288s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.600151062s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.1( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.566618919s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.609153748s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.3( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 systemd-logind[789]: Session 32 logged out. Waiting for processes to exit.
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.1( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.566084862s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.608901978s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.2( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.1( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.566064835s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.608901978s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.5( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.d( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.565700531s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.608924866s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.d( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.565673828s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.608924866s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.f( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.c( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.565211296s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.608627319s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.c( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.565189362s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.608627319s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.e( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.a( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.556656837s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.600006104s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.a( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.556459427s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.600006104s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.9( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.c( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.556047440s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.599815369s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 sshd-session[72582]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.c( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.556021690s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.599815369s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.a( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.564799309s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.608627319s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.d( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.555809021s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.599678040s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.a( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.564766884s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.608627319s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.d( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.555783272s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.599678040s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.e( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.555558205s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.599700928s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.b( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.e( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.555540085s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.599700928s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.8( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.564016342s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.608322144s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.f( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.556052208s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.600379944s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.8( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.563995361s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.608322144s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.f( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.556030273s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.600379944s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.9( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.563932419s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.608428955s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.9( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.563910484s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.608428955s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.8( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.a( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.10( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.554189682s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.599693298s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.10( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.554169655s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.599693298s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.9( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.561868668s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.608421326s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.9( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.561831474s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.608421326s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.16( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.561516762s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.608177185s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.16( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.561496735s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.608177185s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.15( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.14( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.11( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.552731514s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.599617004s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.11( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.552710533s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.599617004s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.17( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.15( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.560948372s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.607940674s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.15( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.560926437s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.607940674s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.13( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.552369118s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.599502563s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.13( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.552347183s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.599502563s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.15( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.560671806s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.607902527s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.15( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.560649872s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.607902527s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.16( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.13( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.560523033s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.607940674s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.14( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.552032471s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.599479675s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.13( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.560499191s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.607940674s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.14( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.552009583s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.599479675s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.11( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.15( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.551610947s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.599235535s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.10( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.15( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.551585197s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.599235535s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.10( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.559574127s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.607429504s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.10( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.559553146s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.607429504s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.13( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.11( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.559057236s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.607276917s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.12( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.11( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.559032440s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.607276917s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 sshd-session[72416]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.1f( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.559081078s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.607490540s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[4.1f( empty local-lis/les=32/33 n=0 ec=32/16 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.559060097s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.607490540s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.16( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.550704956s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 active pruub 68.599098206s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[3.16( empty local-lis/les=31/32 n=0 ec=31/15 lis/c=31/31 les/c/f=32/32/0 sis=35 pruub=14.550539970s) [0] r=-1 lpr=35 pi=[31,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 68.599098206s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.1f( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.497525215s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 69.546150208s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[5.1f( empty local-lis/les=32/33 n=0 ec=32/17 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=15.497505188s) [0] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 69.546150208s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.1c( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.1a( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[6.1d( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 systemd-logind[789]: Removed session 24.
Sep 30 17:39:16 compute-1 systemd[1]: session-28.scope: Deactivated successfully.
Sep 30 17:39:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: ignoring --setuser ceph since I am not root
Sep 30 17:39:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: ignoring --setgroup ceph since I am not root
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.19( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 systemd-logind[789]: Removed session 33.
Sep 30 17:39:16 compute-1 systemd[1]: session-21.scope: Deactivated successfully.
Sep 30 17:39:16 compute-1 systemd-logind[789]: Session 28 logged out. Waiting for processes to exit.
Sep 30 17:39:16 compute-1 systemd-logind[789]: Session 21 logged out. Waiting for processes to exit.
Sep 30 17:39:16 compute-1 systemd-logind[789]: Removed session 32.
Sep 30 17:39:16 compute-1 systemd-logind[789]: Removed session 31.
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: pidfile_write: ignore empty --pid-file
Sep 30 17:39:16 compute-1 systemd-logind[789]: Removed session 30.
Sep 30 17:39:16 compute-1 sshd-session[72524]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:39:16 compute-1 systemd-logind[789]: Removed session 29.
Sep 30 17:39:16 compute-1 systemd-logind[789]: Removed session 27.
Sep 30 17:39:16 compute-1 systemd[1]: session-26.scope: Deactivated successfully.
Sep 30 17:39:16 compute-1 systemd-logind[789]: Session 26 logged out. Waiting for processes to exit.
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.15( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 systemd-logind[789]: Removed session 23.
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.13( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.10( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.e( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.d( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 systemd-logind[789]: Removed session 25.
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.c( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.1( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 systemd-logind[789]: Removed session 28.
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.4( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.6( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.a( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.1b( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 systemd-logind[789]: Removed session 21.
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.1e( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.9( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 35 pg[2.1f( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:16 compute-1 systemd-logind[789]: Removed session 26.
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'alerts'
Sep 30 17:39:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:16.995+0000 7f0aabb7e140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Sep 30 17:39:16 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'balancer'
Sep 30 17:39:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:17.074+0000 7f0aabb7e140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Sep 30 17:39:17 compute-1 ceph-mgr[75792]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Sep 30 17:39:17 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'cephadm'
Sep 30 17:39:17 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Sep 30 17:39:17 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Sep 30 17:39:17 compute-1 ceph-mon[75484]: 2.4 scrub starts
Sep 30 17:39:17 compute-1 ceph-mon[75484]: 2.4 scrub ok
Sep 30 17:39:17 compute-1 ceph-mon[75484]: 3.17 scrub ok
Sep 30 17:39:17 compute-1 ceph-mon[75484]: 3.14 scrub starts
Sep 30 17:39:17 compute-1 ceph-mon[75484]: 3.14 scrub ok
Sep 30 17:39:17 compute-1 ceph-mon[75484]: 2.2 scrub starts
Sep 30 17:39:17 compute-1 ceph-mon[75484]: 2.2 scrub ok
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:17 compute-1 ceph-mon[75484]: 3.11 scrub starts
Sep 30 17:39:17 compute-1 ceph-mon[75484]: 3.11 scrub ok
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/577280954' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Sep 30 17:39:17 compute-1 ceph-mon[75484]: from='mgr.14120 192.168.122.100:0/490895231' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:17 compute-1 ceph-mon[75484]: osdmap e35: 2 total, 2 up, 2 in
Sep 30 17:39:17 compute-1 ceph-mon[75484]: mgrmap e11: compute-0.efvthf(active, since 2m), standbys: compute-1.glbusf
Sep 30 17:39:17 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'crash'
Sep 30 17:39:17 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e36 e36: 2 total, 2 up, 2 in
Sep 30 17:39:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:17.901+0000 7f0aabb7e140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Sep 30 17:39:17 compute-1 ceph-mgr[75792]: mgr[py] Module crash has missing NOTIFY_TYPES member
Sep 30 17:39:17 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'dashboard'
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.15( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.19( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.10( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.e( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.d( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.c( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.13( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.1( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.6( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.4( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.a( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.1f( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.9( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.1e( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.1a( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.18( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.1b( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.19( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.1e( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.1f( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.c( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.d( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.6( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.1( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.7( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.4( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.0( empty local-lis/les=34/36 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.2( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.5( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.e( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.f( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.9( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.3( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.8( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.a( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.15( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.b( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.14( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.16( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.17( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.11( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.13( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.12( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.10( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.1d( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[6.1c( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 36 pg[2.1b( empty local-lis/les=35/36 n=0 ec=29/14 lis/c=29/29 les/c/f=30/30/0 sis=35) [1] r=0 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:18 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'devicehealth'
Sep 30 17:39:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:18.545+0000 7f0aabb7e140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Sep 30 17:39:18 compute-1 ceph-mgr[75792]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Sep 30 17:39:18 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'diskprediction_local'
Sep 30 17:39:18 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Sep 30 17:39:18 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Sep 30 17:39:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Sep 30 17:39:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Sep 30 17:39:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]:   from numpy import show_config as show_numpy_config
Sep 30 17:39:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:18.709+0000 7f0aabb7e140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Sep 30 17:39:18 compute-1 ceph-mgr[75792]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Sep 30 17:39:18 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'influx'
Sep 30 17:39:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:18.776+0000 7f0aabb7e140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Sep 30 17:39:18 compute-1 ceph-mgr[75792]: mgr[py] Module influx has missing NOTIFY_TYPES member
Sep 30 17:39:18 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'insights'
Sep 30 17:39:18 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'iostat'
Sep 30 17:39:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:18.906+0000 7f0aabb7e140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Sep 30 17:39:18 compute-1 ceph-mgr[75792]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Sep 30 17:39:18 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'k8sevents'
Sep 30 17:39:18 compute-1 ceph-mon[75484]: 2.1a scrub starts
Sep 30 17:39:18 compute-1 ceph-mon[75484]: 2.1a scrub ok
Sep 30 17:39:18 compute-1 ceph-mon[75484]: 5.19 scrub starts
Sep 30 17:39:18 compute-1 ceph-mon[75484]: 5.19 scrub ok
Sep 30 17:39:18 compute-1 ceph-mon[75484]: osdmap e36: 2 total, 2 up, 2 in
Sep 30 17:39:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:39:19 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'localpool'
Sep 30 17:39:19 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'mds_autoscaler'
Sep 30 17:39:19 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.1f deep-scrub starts
Sep 30 17:39:19 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.1f deep-scrub ok
Sep 30 17:39:19 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'mirroring'
Sep 30 17:39:19 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'nfs'
Sep 30 17:39:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:19.838+0000 7f0aabb7e140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Sep 30 17:39:19 compute-1 ceph-mgr[75792]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Sep 30 17:39:19 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'orchestrator'
Sep 30 17:39:19 compute-1 ceph-mon[75484]: 2.18 scrub starts
Sep 30 17:39:19 compute-1 ceph-mon[75484]: 2.18 scrub ok
Sep 30 17:39:19 compute-1 ceph-mon[75484]: 3.1e scrub starts
Sep 30 17:39:19 compute-1 ceph-mon[75484]: 3.1e scrub ok
Sep 30 17:39:19 compute-1 ceph-mon[75484]: 2.17 scrub starts
Sep 30 17:39:19 compute-1 ceph-mon[75484]: 2.17 scrub ok
Sep 30 17:39:19 compute-1 ceph-mon[75484]: 3.1f deep-scrub starts
Sep 30 17:39:19 compute-1 ceph-mon[75484]: 3.1f deep-scrub ok
Sep 30 17:39:19 compute-1 ceph-mon[75484]: 2.16 deep-scrub starts
Sep 30 17:39:19 compute-1 ceph-mon[75484]: 2.16 deep-scrub ok
Sep 30 17:39:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:20.041+0000 7f0aabb7e140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'osd_perf_query'
Sep 30 17:39:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:20.113+0000 7f0aabb7e140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'osd_support'
Sep 30 17:39:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:20.173+0000 7f0aabb7e140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'pg_autoscaler'
Sep 30 17:39:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:20.243+0000 7f0aabb7e140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'progress'
Sep 30 17:39:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:20.313+0000 7f0aabb7e140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Module progress has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'prometheus'
Sep 30 17:39:20 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.19 deep-scrub starts
Sep 30 17:39:20 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.19 deep-scrub ok
Sep 30 17:39:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:20.628+0000 7f0aabb7e140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rbd_support'
Sep 30 17:39:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:20.718+0000 7f0aabb7e140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'restful'
Sep 30 17:39:20 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rgw'
Sep 30 17:39:20 compute-1 ceph-mon[75484]: 4.19 deep-scrub starts
Sep 30 17:39:20 compute-1 ceph-mon[75484]: 4.19 deep-scrub ok
Sep 30 17:39:20 compute-1 ceph-mon[75484]: 2.14 scrub starts
Sep 30 17:39:20 compute-1 ceph-mon[75484]: 2.14 scrub ok
Sep 30 17:39:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:21.147+0000 7f0aabb7e140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Sep 30 17:39:21 compute-1 ceph-mgr[75792]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Sep 30 17:39:21 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rook'
Sep 30 17:39:21 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Sep 30 17:39:21 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Sep 30 17:39:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:21.696+0000 7f0aabb7e140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Sep 30 17:39:21 compute-1 ceph-mgr[75792]: mgr[py] Module rook has missing NOTIFY_TYPES member
Sep 30 17:39:21 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'selftest'
Sep 30 17:39:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:21.764+0000 7f0aabb7e140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Sep 30 17:39:21 compute-1 ceph-mgr[75792]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Sep 30 17:39:21 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'snap_schedule'
Sep 30 17:39:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:21.842+0000 7f0aabb7e140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Sep 30 17:39:21 compute-1 ceph-mgr[75792]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Sep 30 17:39:21 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'stats'
Sep 30 17:39:21 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'status'
Sep 30 17:39:21 compute-1 ceph-mon[75484]: 3.1b scrub starts
Sep 30 17:39:21 compute-1 ceph-mon[75484]: 3.1b scrub ok
Sep 30 17:39:21 compute-1 ceph-mon[75484]: 2.12 scrub starts
Sep 30 17:39:21 compute-1 ceph-mon[75484]: 2.12 scrub ok
Sep 30 17:39:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:21.980+0000 7f0aabb7e140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Sep 30 17:39:21 compute-1 ceph-mgr[75792]: mgr[py] Module status has missing NOTIFY_TYPES member
Sep 30 17:39:21 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'telegraf'
Sep 30 17:39:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:22.049+0000 7f0aabb7e140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'telemetry'
Sep 30 17:39:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:22.205+0000 7f0aabb7e140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'test_orchestrator'
Sep 30 17:39:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:22.418+0000 7f0aabb7e140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'volumes'
Sep 30 17:39:22 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.1c deep-scrub starts
Sep 30 17:39:22 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.1c deep-scrub ok
Sep 30 17:39:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:22.683+0000 7f0aabb7e140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'zabbix'
Sep 30 17:39:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:22.750+0000 7f0aabb7e140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: mgr load Constructed class from module: dashboard
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: [dashboard INFO root] Configured CherryPy, starting engine...
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: [dashboard INFO root] Starting engine...
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: ms_deliver_dispatch: unhandled message 0x555d8ec7e340 mon_map magic: 0 from mon.1 v2:192.168.122.101:3300/0
Sep 30 17:39:22 compute-1 ceph-mgr[75792]: [dashboard INFO root] Engine started...
Sep 30 17:39:22 compute-1 ceph-mon[75484]: 4.1c deep-scrub starts
Sep 30 17:39:22 compute-1 ceph-mon[75484]: 4.1c deep-scrub ok
Sep 30 17:39:22 compute-1 ceph-mon[75484]: Standby manager daemon compute-1.glbusf restarted
Sep 30 17:39:22 compute-1 ceph-mon[75484]: Standby manager daemon compute-1.glbusf started
Sep 30 17:39:22 compute-1 ceph-mon[75484]: 2.11 scrub starts
Sep 30 17:39:22 compute-1 ceph-mon[75484]: 2.11 scrub ok
Sep 30 17:39:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e37 e37: 2 total, 2 up, 2 in
Sep 30 17:39:23 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Sep 30 17:39:23 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Sep 30 17:39:23 compute-1 sshd-session[81686]: Accepted publickey for ceph-admin from 192.168.122.100 port 52446 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:39:23 compute-1 systemd-logind[789]: New session 34 of user ceph-admin.
Sep 30 17:39:23 compute-1 systemd[1]: Started Session 34 of User ceph-admin.
Sep 30 17:39:23 compute-1 sshd-session[81686]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:39:23 compute-1 sudo[81690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:39:23 compute-1 sudo[81690]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:23 compute-1 sudo[81690]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:23 compute-1 sudo[81715]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 17:39:23 compute-1 sudo[81715]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:23 compute-1 ceph-mon[75484]: mgrmap e12: compute-0.efvthf(active, since 2m), standbys: compute-1.glbusf
Sep 30 17:39:23 compute-1 ceph-mon[75484]: Active manager daemon compute-0.efvthf restarted
Sep 30 17:39:23 compute-1 ceph-mon[75484]: Activating manager daemon compute-0.efvthf
Sep 30 17:39:23 compute-1 ceph-mon[75484]: osdmap e37: 2 total, 2 up, 2 in
Sep 30 17:39:23 compute-1 ceph-mon[75484]: mgrmap e13: compute-0.efvthf(active, starting, since 0.019525s), standbys: compute-1.glbusf
Sep 30 17:39:23 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Sep 30 17:39:23 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Sep 30 17:39:23 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mgr metadata", "who": "compute-0.efvthf", "id": "compute-0.efvthf"}]: dispatch
Sep 30 17:39:23 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mgr metadata", "who": "compute-1.glbusf", "id": "compute-1.glbusf"}]: dispatch
Sep 30 17:39:23 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:39:23 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Sep 30 17:39:23 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mds metadata"}]: dispatch
Sep 30 17:39:23 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata"}]: dispatch
Sep 30 17:39:23 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata"}]: dispatch
Sep 30 17:39:23 compute-1 ceph-mon[75484]: Manager daemon compute-0.efvthf is now available
Sep 30 17:39:23 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.efvthf/mirror_snapshot_schedule"}]: dispatch
Sep 30 17:39:23 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.efvthf/trash_purge_schedule"}]: dispatch
Sep 30 17:39:23 compute-1 ceph-mon[75484]: 5.1d scrub starts
Sep 30 17:39:23 compute-1 ceph-mon[75484]: 5.1d scrub ok
Sep 30 17:39:23 compute-1 ceph-mon[75484]: 2.f deep-scrub starts
Sep 30 17:39:23 compute-1 ceph-mon[75484]: 2.f deep-scrub ok
Sep 30 17:39:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:39:24 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.1d deep-scrub starts
Sep 30 17:39:24 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.1d deep-scrub ok
Sep 30 17:39:24 compute-1 podman[81814]: 2025-09-30 17:39:24.552308617 +0000 UTC m=+0.052457330 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Sep 30 17:39:24 compute-1 podman[81814]: 2025-09-30 17:39:24.637575689 +0000 UTC m=+0.137724422 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:39:25 compute-1 sudo[81715]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:25 compute-1 sudo[81901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:39:25 compute-1 sudo[81901]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:25 compute-1 sudo[81901]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:25 compute-1 sudo[81926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:39:25 compute-1 sudo[81926]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:25 compute-1 ceph-mon[75484]: mgrmap e14: compute-0.efvthf(active, since 1.03711s), standbys: compute-1.glbusf
Sep 30 17:39:25 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:25 compute-1 ceph-mon[75484]: 4.1d deep-scrub starts
Sep 30 17:39:25 compute-1 ceph-mon[75484]: 4.1d deep-scrub ok
Sep 30 17:39:25 compute-1 ceph-mon[75484]: [30/Sep/2025:17:39:24] ENGINE Bus STARTING
Sep 30 17:39:25 compute-1 ceph-mon[75484]: [30/Sep/2025:17:39:24] ENGINE Serving on http://192.168.122.100:8765
Sep 30 17:39:25 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:25 compute-1 ceph-mon[75484]: [30/Sep/2025:17:39:24] ENGINE Serving on https://192.168.122.100:7150
Sep 30 17:39:25 compute-1 ceph-mon[75484]: [30/Sep/2025:17:39:24] ENGINE Bus STARTED
Sep 30 17:39:25 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:25 compute-1 ceph-mon[75484]: [30/Sep/2025:17:39:24] ENGINE Client ('192.168.122.100', 34486) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Sep 30 17:39:25 compute-1 ceph-mon[75484]: 2.b scrub starts
Sep 30 17:39:25 compute-1 ceph-mon[75484]: 2.b scrub ok
Sep 30 17:39:25 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:25 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:25 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:25 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:25 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:39:25 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.f scrub starts
Sep 30 17:39:25 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.f scrub ok
Sep 30 17:39:25 compute-1 sudo[81926]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:25 compute-1 sudo[81983]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:39:25 compute-1 sudo[81983]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:25 compute-1 sudo[81983]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:26 compute-1 sudo[82008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Sep 30 17:39:26 compute-1 sudo[82008]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:26 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e38 e38: 2 total, 2 up, 2 in
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.1a( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688711166s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.169685364s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.1a( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688570023s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.169685364s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.1e( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688528061s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.169692993s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.19( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688467026s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.169662476s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.1e( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688481331s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.169692993s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.19( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688415527s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.169662476s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.d( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688345909s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.169754028s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.d( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688324928s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.169754028s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.7( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688423157s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.169952393s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.7( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688396454s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.169952393s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.2( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688424110s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.170120239s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.2( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688405991s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170120239s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.3( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688389778s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.170158386s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.5( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688360214s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.170158386s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.3( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688339233s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170158386s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.5( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688334465s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170158386s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.e( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688281059s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.170173645s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.e( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688266754s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170173645s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.8( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688674927s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.170639038s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.8( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688660622s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170639038s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.15( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688631058s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.170745850s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.15( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688615799s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170745850s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.a( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688560486s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.170715332s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.a( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688543320s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170715332s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.17( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688682556s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.170936584s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.17( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688665390s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170936584s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.12( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688595772s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.171043396s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.12( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688574791s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.171043396s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.1c( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688563347s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active pruub 79.171112061s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[6.1c( empty local-lis/les=34/36 n=0 ec=34/19 lis/c=34/34 les/c/f=36/36/0 sis=38 pruub=15.688549042s) [0] r=-1 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.171112061s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.1d( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.13( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.10( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.14( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.a( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.b( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.8( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.9( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.e( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.6( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.4( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.3( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.2( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.1e( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.f( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.18( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 38 pg[7.1b( empty local-lis/les=0/0 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:26 compute-1 ceph-mon[75484]: from='client.14330 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-password", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 17:39:26 compute-1 ceph-mon[75484]: pgmap v4: 193 pgs: 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:26 compute-1 ceph-mon[75484]: Health check failed: 1 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Sep 30 17:39:26 compute-1 ceph-mon[75484]: 4.f scrub starts
Sep 30 17:39:26 compute-1 ceph-mon[75484]: 4.f scrub ok
Sep 30 17:39:26 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:26 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:26 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Sep 30 17:39:26 compute-1 ceph-mon[75484]: Adjusting osd_memory_target on compute-0 to 127.8M
Sep 30 17:39:26 compute-1 ceph-mon[75484]: Unable to set osd_memory_target on compute-0 to 134071500: error parsing value: Value '134071500' is below minimum 939524096
Sep 30 17:39:26 compute-1 ceph-mon[75484]: 2.3 deep-scrub starts
Sep 30 17:39:26 compute-1 ceph-mon[75484]: 2.3 deep-scrub ok
Sep 30 17:39:26 compute-1 ceph-mon[75484]: mgrmap e15: compute-0.efvthf(active, since 2s), standbys: compute-1.glbusf
Sep 30 17:39:26 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:26 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:39:26 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:39:26 compute-1 ceph-mon[75484]: osdmap e38: 2 total, 2 up, 2 in
Sep 30 17:39:26 compute-1 sudo[82008]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:26 compute-1 sudo[82051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Sep 30 17:39:26 compute-1 sudo[82051]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:26 compute-1 sudo[82051]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:26 compute-1 sudo[82076]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph
Sep 30 17:39:26 compute-1 sudo[82076]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:26 compute-1 sudo[82076]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:26 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Sep 30 17:39:26 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Sep 30 17:39:26 compute-1 sudo[82101]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:39:26 compute-1 sudo[82101]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:26 compute-1 sudo[82101]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:26 compute-1 sudo[82126]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:39:26 compute-1 sudo[82126]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:26 compute-1 sudo[82126]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:26 compute-1 sudo[82151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:39:26 compute-1 sudo[82151]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:26 compute-1 sudo[82151]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:26 compute-1 sudo[82199]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:39:26 compute-1 sudo[82199]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:26 compute-1 sudo[82199]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:26 compute-1 sudo[82224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:39:26 compute-1 sudo[82224]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:26 compute-1 sudo[82224]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:26 compute-1 sudo[82249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Sep 30 17:39:26 compute-1 sudo[82249]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:26 compute-1 sudo[82249]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:27 compute-1 sudo[82274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:39:27 compute-1 sudo[82274]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:27 compute-1 sudo[82274]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:27 compute-1 sudo[82299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:39:27 compute-1 sudo[82299]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:27 compute-1 sudo[82299]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:27 compute-1 sudo[82324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:39:27 compute-1 sudo[82324]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:27 compute-1 sudo[82324]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:27 compute-1 sudo[82349]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:39:27 compute-1 sudo[82349]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:27 compute-1 sudo[82349]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:27 compute-1 sudo[82374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:39:27 compute-1 sudo[82374]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:27 compute-1 sudo[82374]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:27 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e39 e39: 2 total, 2 up, 2 in
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.1b( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.18( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.1e( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.6( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.4( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.2( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.e( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.3( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.f( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.8( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.9( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.14( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.a( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.b( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.10( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.1d( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 39 pg[7.13( empty local-lis/les=38/39 n=0 ec=34/20 lis/c=34/34 les/c/f=36/36/0 sis=38) [1] r=0 lpr=38 pi=[34,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:27 compute-1 ceph-mon[75484]: from='client.14338 -' entity='client.admin' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://192.168.122.100:9093", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 17:39:27 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:27 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:27 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Sep 30 17:39:27 compute-1 ceph-mon[75484]: Adjusting osd_memory_target on compute-1 to 127.8M
Sep 30 17:39:27 compute-1 ceph-mon[75484]: Unable to set osd_memory_target on compute-1 to 134071500: error parsing value: Value '134071500' is below minimum 939524096
Sep 30 17:39:27 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:39:27 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:39:27 compute-1 ceph-mon[75484]: Updating compute-0:/etc/ceph/ceph.conf
Sep 30 17:39:27 compute-1 ceph-mon[75484]: Updating compute-1:/etc/ceph/ceph.conf
Sep 30 17:39:27 compute-1 ceph-mon[75484]: 3.8 scrub starts
Sep 30 17:39:27 compute-1 ceph-mon[75484]: 3.8 scrub ok
Sep 30 17:39:27 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:27 compute-1 ceph-mon[75484]: osdmap e39: 2 total, 2 up, 2 in
Sep 30 17:39:27 compute-1 sudo[82422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:39:27 compute-1 sudo[82422]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:27 compute-1 sudo[82422]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:27 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Sep 30 17:39:27 compute-1 sudo[82447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:39:27 compute-1 sudo[82447]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:27 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Sep 30 17:39:27 compute-1 sudo[82447]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:27 compute-1 sudo[82472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:39:27 compute-1 sudo[82472]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:27 compute-1 sudo[82472]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:27 compute-1 sudo[82497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Sep 30 17:39:27 compute-1 sudo[82497]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:27 compute-1 sudo[82497]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:27 compute-1 sudo[82522]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph
Sep 30 17:39:27 compute-1 sudo[82522]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:27 compute-1 sudo[82522]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:27 compute-1 sudo[82547]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:39:27 compute-1 sudo[82547]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:27 compute-1 sudo[82547]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:28 compute-1 sudo[82572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:39:28 compute-1 sudo[82572]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:28 compute-1 sudo[82572]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:28 compute-1 sudo[82597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:39:28 compute-1 sudo[82597]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:28 compute-1 sudo[82597]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:28 compute-1 sudo[82645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:39:28 compute-1 sudo[82645]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:28 compute-1 sudo[82645]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:28 compute-1 sudo[82670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:39:28 compute-1 sudo[82670]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:28 compute-1 sudo[82670]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:28 compute-1 ceph-mon[75484]: Updating compute-0:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:39:28 compute-1 ceph-mon[75484]: 2.0 scrub starts
Sep 30 17:39:28 compute-1 ceph-mon[75484]: 2.0 scrub ok
Sep 30 17:39:28 compute-1 ceph-mon[75484]: Updating compute-1:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:39:28 compute-1 ceph-mon[75484]: from='client.14342 -' entity='client.admin' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://192.168.122.100:9092", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 17:39:28 compute-1 ceph-mon[75484]: pgmap v6: 193 pgs: 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:28 compute-1 ceph-mon[75484]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Sep 30 17:39:28 compute-1 ceph-mon[75484]: 4.3 scrub starts
Sep 30 17:39:28 compute-1 ceph-mon[75484]: 4.3 scrub ok
Sep 30 17:39:28 compute-1 ceph-mon[75484]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Sep 30 17:39:28 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:28 compute-1 ceph-mon[75484]: mgrmap e16: compute-0.efvthf(active, since 4s), standbys: compute-1.glbusf
Sep 30 17:39:28 compute-1 sudo[82695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Sep 30 17:39:28 compute-1 sudo[82695]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:28 compute-1 sudo[82695]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:28 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Sep 30 17:39:28 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Sep 30 17:39:28 compute-1 sudo[82720]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:39:28 compute-1 sudo[82720]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:28 compute-1 sudo[82720]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:28 compute-1 sudo[82745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:39:28 compute-1 sudo[82745]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:28 compute-1 sudo[82745]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:28 compute-1 sudo[82770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:39:28 compute-1 sudo[82770]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:28 compute-1 sudo[82770]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:28 compute-1 sudo[82795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:39:28 compute-1 sudo[82795]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:28 compute-1 sudo[82795]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:29 compute-1 sudo[82820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:39:29 compute-1 sudo[82820]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:29 compute-1 sudo[82820]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:39:29 compute-1 sudo[82868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:39:29 compute-1 sudo[82868]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:29 compute-1 sudo[82868]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:29 compute-1 sudo[82893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:39:29 compute-1 sudo[82893]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:29 compute-1 sudo[82893]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:29 compute-1 sudo[82918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring
Sep 30 17:39:29 compute-1 sudo[82918]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:29 compute-1 sudo[82918]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:29 compute-1 ceph-mon[75484]: Updating compute-0:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring
Sep 30 17:39:29 compute-1 ceph-mon[75484]: 2.5 scrub starts
Sep 30 17:39:29 compute-1 ceph-mon[75484]: 2.5 scrub ok
Sep 30 17:39:29 compute-1 ceph-mon[75484]: from='client.14346 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "http://192.168.122.100:3100", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 17:39:29 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:29 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:29 compute-1 ceph-mon[75484]: Updating compute-1:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring
Sep 30 17:39:29 compute-1 ceph-mon[75484]: 3.4 scrub starts
Sep 30 17:39:29 compute-1 ceph-mon[75484]: 3.4 scrub ok
Sep 30 17:39:29 compute-1 ceph-mon[75484]: 4.1f scrub starts
Sep 30 17:39:29 compute-1 ceph-mon[75484]: 4.1f scrub ok
Sep 30 17:39:29 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2417511691' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Sep 30 17:39:29 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:29 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr handle_mgr_map respawning because set of enabled modules changed!
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn  e: '/usr/bin/ceph-mgr'
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn  0: '/usr/bin/ceph-mgr'
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn  1: '-n'
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn  2: 'mgr.compute-1.glbusf'
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn  3: '-f'
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn  4: '--setuser'
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn  5: 'ceph'
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn  6: '--setgroup'
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn  7: 'ceph'
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn  8: '--default-log-to-file=false'
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn  9: '--default-log-to-journald=true'
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn  10: '--default-log-to-stderr=false'
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr respawn  exe_path /proc/self/exe
Sep 30 17:39:29 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Sep 30 17:39:29 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Sep 30 17:39:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: ignoring --setuser ceph since I am not root
Sep 30 17:39:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: ignoring --setgroup ceph since I am not root
Sep 30 17:39:29 compute-1 sshd-session[81689]: Connection closed by 192.168.122.100 port 52446
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: pidfile_write: ignore empty --pid-file
Sep 30 17:39:29 compute-1 sshd-session[81686]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:39:29 compute-1 systemd[1]: session-34.scope: Deactivated successfully.
Sep 30 17:39:29 compute-1 systemd[1]: session-34.scope: Consumed 5.907s CPU time.
Sep 30 17:39:29 compute-1 systemd-logind[789]: Session 34 logged out. Waiting for processes to exit.
Sep 30 17:39:29 compute-1 systemd-logind[789]: Removed session 34.
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'alerts'
Sep 30 17:39:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:29.751+0000 7f25c9a9c140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'balancer'
Sep 30 17:39:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:29.826+0000 7f25c9a9c140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Sep 30 17:39:29 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'cephadm'
Sep 30 17:39:30 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Sep 30 17:39:30 compute-1 ceph-mon[75484]: from='mgr.14308 192.168.122.100:0/32643731' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:30 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2417511691' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Sep 30 17:39:30 compute-1 ceph-mon[75484]: mgrmap e17: compute-0.efvthf(active, since 6s), standbys: compute-1.glbusf
Sep 30 17:39:30 compute-1 ceph-mon[75484]: 4.4 scrub starts
Sep 30 17:39:30 compute-1 ceph-mon[75484]: 4.4 scrub ok
Sep 30 17:39:30 compute-1 ceph-mon[75484]: 5.11 scrub starts
Sep 30 17:39:30 compute-1 ceph-mon[75484]: 5.11 scrub ok
Sep 30 17:39:30 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1732324391' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Sep 30 17:39:30 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Sep 30 17:39:30 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'crash'
Sep 30 17:39:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:30.696+0000 7f25c9a9c140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Sep 30 17:39:30 compute-1 ceph-mgr[75792]: mgr[py] Module crash has missing NOTIFY_TYPES member
Sep 30 17:39:30 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'dashboard'
Sep 30 17:39:31 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'devicehealth'
Sep 30 17:39:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:31.303+0000 7f25c9a9c140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Sep 30 17:39:31 compute-1 ceph-mgr[75792]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Sep 30 17:39:31 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'diskprediction_local'
Sep 30 17:39:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Sep 30 17:39:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Sep 30 17:39:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]:   from numpy import show_config as show_numpy_config
Sep 30 17:39:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:31.465+0000 7f25c9a9c140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Sep 30 17:39:31 compute-1 ceph-mgr[75792]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Sep 30 17:39:31 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'influx'
Sep 30 17:39:31 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.2 scrub starts
Sep 30 17:39:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:31.535+0000 7f25c9a9c140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Sep 30 17:39:31 compute-1 ceph-mgr[75792]: mgr[py] Module influx has missing NOTIFY_TYPES member
Sep 30 17:39:31 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'insights'
Sep 30 17:39:31 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.2 scrub ok
Sep 30 17:39:31 compute-1 ceph-mon[75484]: 5.5 scrub starts
Sep 30 17:39:31 compute-1 ceph-mon[75484]: 5.5 scrub ok
Sep 30 17:39:31 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1732324391' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Sep 30 17:39:31 compute-1 ceph-mon[75484]: mgrmap e18: compute-0.efvthf(active, since 7s), standbys: compute-1.glbusf
Sep 30 17:39:31 compute-1 ceph-mon[75484]: 5.1f scrub starts
Sep 30 17:39:31 compute-1 ceph-mon[75484]: 5.1f scrub ok
Sep 30 17:39:31 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'iostat'
Sep 30 17:39:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:31.686+0000 7f25c9a9c140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Sep 30 17:39:31 compute-1 ceph-mgr[75792]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Sep 30 17:39:31 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'k8sevents'
Sep 30 17:39:32 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'localpool'
Sep 30 17:39:32 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'mds_autoscaler'
Sep 30 17:39:32 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'mirroring'
Sep 30 17:39:32 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'nfs'
Sep 30 17:39:32 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Sep 30 17:39:32 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Sep 30 17:39:32 compute-1 ceph-mon[75484]: 3.2 scrub starts
Sep 30 17:39:32 compute-1 ceph-mon[75484]: 3.2 scrub ok
Sep 30 17:39:32 compute-1 ceph-mon[75484]: 4.15 scrub starts
Sep 30 17:39:32 compute-1 ceph-mon[75484]: 4.15 scrub ok
Sep 30 17:39:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:32.708+0000 7f25c9a9c140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Sep 30 17:39:32 compute-1 ceph-mgr[75792]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Sep 30 17:39:32 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'orchestrator'
Sep 30 17:39:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:32.931+0000 7f25c9a9c140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Sep 30 17:39:32 compute-1 ceph-mgr[75792]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Sep 30 17:39:32 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'osd_perf_query'
Sep 30 17:39:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:33.006+0000 7f25c9a9c140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Sep 30 17:39:33 compute-1 ceph-mgr[75792]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Sep 30 17:39:33 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'osd_support'
Sep 30 17:39:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:33.077+0000 7f25c9a9c140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Sep 30 17:39:33 compute-1 ceph-mgr[75792]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Sep 30 17:39:33 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'pg_autoscaler'
Sep 30 17:39:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:33.157+0000 7f25c9a9c140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Sep 30 17:39:33 compute-1 ceph-mgr[75792]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Sep 30 17:39:33 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'progress'
Sep 30 17:39:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:33.227+0000 7f25c9a9c140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Sep 30 17:39:33 compute-1 ceph-mgr[75792]: mgr[py] Module progress has missing NOTIFY_TYPES member
Sep 30 17:39:33 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'prometheus'
Sep 30 17:39:33 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Sep 30 17:39:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:33.581+0000 7f25c9a9c140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Sep 30 17:39:33 compute-1 ceph-mgr[75792]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Sep 30 17:39:33 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rbd_support'
Sep 30 17:39:33 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Sep 30 17:39:33 compute-1 ceph-mon[75484]: 3.1 scrub starts
Sep 30 17:39:33 compute-1 ceph-mon[75484]: 3.1 scrub ok
Sep 30 17:39:33 compute-1 ceph-mon[75484]: 3.13 scrub starts
Sep 30 17:39:33 compute-1 ceph-mon[75484]: 3.13 scrub ok
Sep 30 17:39:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:33.674+0000 7f25c9a9c140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Sep 30 17:39:33 compute-1 ceph-mgr[75792]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Sep 30 17:39:33 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'restful'
Sep 30 17:39:33 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rgw'
Sep 30 17:39:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:39:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:34.091+0000 7f25c9a9c140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Sep 30 17:39:34 compute-1 ceph-mgr[75792]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Sep 30 17:39:34 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rook'
Sep 30 17:39:34 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Sep 30 17:39:34 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Sep 30 17:39:34 compute-1 ceph-mon[75484]: 4.6 scrub starts
Sep 30 17:39:34 compute-1 ceph-mon[75484]: 4.6 scrub ok
Sep 30 17:39:34 compute-1 ceph-mon[75484]: 5.10 scrub starts
Sep 30 17:39:34 compute-1 ceph-mon[75484]: 5.10 scrub ok
Sep 30 17:39:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:34.710+0000 7f25c9a9c140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Sep 30 17:39:34 compute-1 ceph-mgr[75792]: mgr[py] Module rook has missing NOTIFY_TYPES member
Sep 30 17:39:34 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'selftest'
Sep 30 17:39:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:34.785+0000 7f25c9a9c140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Sep 30 17:39:34 compute-1 ceph-mgr[75792]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Sep 30 17:39:34 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'snap_schedule'
Sep 30 17:39:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:34.865+0000 7f25c9a9c140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Sep 30 17:39:34 compute-1 ceph-mgr[75792]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Sep 30 17:39:34 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'stats'
Sep 30 17:39:34 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'status'
Sep 30 17:39:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:35.012+0000 7f25c9a9c140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr[py] Module status has missing NOTIFY_TYPES member
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'telegraf'
Sep 30 17:39:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:35.081+0000 7f25c9a9c140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'telemetry'
Sep 30 17:39:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:35.235+0000 7f25c9a9c140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'test_orchestrator'
Sep 30 17:39:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:35.460+0000 7f25c9a9c140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'volumes'
Sep 30 17:39:35 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Sep 30 17:39:35 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Sep 30 17:39:35 compute-1 ceph-mon[75484]: 4.2 scrub starts
Sep 30 17:39:35 compute-1 ceph-mon[75484]: 4.2 scrub ok
Sep 30 17:39:35 compute-1 ceph-mon[75484]: 5.16 scrub starts
Sep 30 17:39:35 compute-1 ceph-mon[75484]: 5.16 scrub ok
Sep 30 17:39:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:35.714+0000 7f25c9a9c140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'zabbix'
Sep 30 17:39:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:35.788+0000 7f25c9a9c140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: ms_deliver_dispatch: unhandled message 0x563d4f23e340 mon_map magic: 0 from mon.1 v2:192.168.122.101:3300/0
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr handle_mgr_map respawning because set of enabled modules changed!
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn  e: '/usr/bin/ceph-mgr'
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn  0: '/usr/bin/ceph-mgr'
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn  1: '-n'
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn  2: 'mgr.compute-1.glbusf'
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn  3: '-f'
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn  4: '--setuser'
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn  5: 'ceph'
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn  6: '--setgroup'
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn  7: 'ceph'
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn  8: '--default-log-to-file=false'
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn  9: '--default-log-to-journald=true'
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn  10: '--default-log-to-stderr=false'
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr respawn  exe_path /proc/self/exe
Sep 30 17:39:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: ignoring --setuser ceph since I am not root
Sep 30 17:39:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: ignoring --setgroup ceph since I am not root
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: pidfile_write: ignore empty --pid-file
Sep 30 17:39:35 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'alerts'
Sep 30 17:39:35 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e40 e40: 2 total, 2 up, 2 in
Sep 30 17:39:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:36.040+0000 7f856caa5140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Sep 30 17:39:36 compute-1 ceph-mgr[75792]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Sep 30 17:39:36 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'balancer'
Sep 30 17:39:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:36.119+0000 7f856caa5140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Sep 30 17:39:36 compute-1 ceph-mgr[75792]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Sep 30 17:39:36 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'cephadm'
Sep 30 17:39:36 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.6 scrub starts
Sep 30 17:39:36 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.6 scrub ok
Sep 30 17:39:36 compute-1 ceph-mon[75484]: 5.3 scrub starts
Sep 30 17:39:36 compute-1 ceph-mon[75484]: 5.3 scrub ok
Sep 30 17:39:36 compute-1 ceph-mon[75484]: 3.10 scrub starts
Sep 30 17:39:36 compute-1 ceph-mon[75484]: 3.10 scrub ok
Sep 30 17:39:36 compute-1 ceph-mon[75484]: Standby manager daemon compute-1.glbusf restarted
Sep 30 17:39:36 compute-1 ceph-mon[75484]: Standby manager daemon compute-1.glbusf started
Sep 30 17:39:36 compute-1 ceph-mon[75484]: Active manager daemon compute-0.efvthf restarted
Sep 30 17:39:36 compute-1 ceph-mon[75484]: Activating manager daemon compute-0.efvthf
Sep 30 17:39:36 compute-1 ceph-mon[75484]: osdmap e40: 2 total, 2 up, 2 in
Sep 30 17:39:36 compute-1 ceph-mon[75484]: mgrmap e19: compute-0.efvthf(active, starting, since 0.0221987s), standbys: compute-1.glbusf
Sep 30 17:39:36 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'crash'
Sep 30 17:39:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:36.915+0000 7f856caa5140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Sep 30 17:39:36 compute-1 ceph-mgr[75792]: mgr[py] Module crash has missing NOTIFY_TYPES member
Sep 30 17:39:36 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'dashboard'
Sep 30 17:39:37 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'devicehealth'
Sep 30 17:39:37 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.0 scrub starts
Sep 30 17:39:37 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.0 scrub ok
Sep 30 17:39:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:37.539+0000 7f856caa5140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Sep 30 17:39:37 compute-1 ceph-mgr[75792]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Sep 30 17:39:37 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'diskprediction_local'
Sep 30 17:39:37 compute-1 ceph-mon[75484]: 3.6 scrub starts
Sep 30 17:39:37 compute-1 ceph-mon[75484]: 3.6 scrub ok
Sep 30 17:39:37 compute-1 ceph-mon[75484]: 3.e scrub starts
Sep 30 17:39:37 compute-1 ceph-mon[75484]: 3.e scrub ok
Sep 30 17:39:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Sep 30 17:39:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Sep 30 17:39:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]:   from numpy import show_config as show_numpy_config
Sep 30 17:39:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:37.703+0000 7f856caa5140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Sep 30 17:39:37 compute-1 ceph-mgr[75792]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Sep 30 17:39:37 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'influx'
Sep 30 17:39:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:37.772+0000 7f856caa5140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Sep 30 17:39:37 compute-1 ceph-mgr[75792]: mgr[py] Module influx has missing NOTIFY_TYPES member
Sep 30 17:39:37 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'insights'
Sep 30 17:39:37 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'iostat'
Sep 30 17:39:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:37.904+0000 7f856caa5140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Sep 30 17:39:37 compute-1 ceph-mgr[75792]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Sep 30 17:39:37 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'k8sevents'
Sep 30 17:39:38 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'localpool'
Sep 30 17:39:38 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'mds_autoscaler'
Sep 30 17:39:38 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Sep 30 17:39:38 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Sep 30 17:39:38 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'mirroring'
Sep 30 17:39:38 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'nfs'
Sep 30 17:39:38 compute-1 ceph-mon[75484]: 5.0 scrub starts
Sep 30 17:39:38 compute-1 ceph-mon[75484]: 5.0 scrub ok
Sep 30 17:39:38 compute-1 ceph-mon[75484]: 4.8 scrub starts
Sep 30 17:39:38 compute-1 ceph-mon[75484]: 4.8 scrub ok
Sep 30 17:39:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:38.827+0000 7f856caa5140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Sep 30 17:39:38 compute-1 ceph-mgr[75792]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Sep 30 17:39:38 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'orchestrator'
Sep 30 17:39:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:39:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:39.063+0000 7f856caa5140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'osd_perf_query'
Sep 30 17:39:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:39.140+0000 7f856caa5140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'osd_support'
Sep 30 17:39:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:39.209+0000 7f856caa5140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'pg_autoscaler'
Sep 30 17:39:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:39.295+0000 7f856caa5140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'progress'
Sep 30 17:39:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:39.369+0000 7f856caa5140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Module progress has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'prometheus'
Sep 30 17:39:39 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.0 scrub starts
Sep 30 17:39:39 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.0 scrub ok
Sep 30 17:39:39 compute-1 systemd[1]: Stopping User Manager for UID 42477...
Sep 30 17:39:39 compute-1 systemd[72420]: Activating special unit Exit the Session...
Sep 30 17:39:39 compute-1 systemd[72420]: Stopped target Main User Target.
Sep 30 17:39:39 compute-1 systemd[72420]: Stopped target Basic System.
Sep 30 17:39:39 compute-1 systemd[72420]: Stopped target Paths.
Sep 30 17:39:39 compute-1 systemd[72420]: Stopped target Sockets.
Sep 30 17:39:39 compute-1 systemd[72420]: Stopped target Timers.
Sep 30 17:39:39 compute-1 systemd[72420]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 17:39:39 compute-1 systemd[72420]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 17:39:39 compute-1 systemd[72420]: Closed D-Bus User Message Bus Socket.
Sep 30 17:39:39 compute-1 systemd[72420]: Stopped Create User's Volatile Files and Directories.
Sep 30 17:39:39 compute-1 systemd[72420]: Removed slice User Application Slice.
Sep 30 17:39:39 compute-1 systemd[72420]: Reached target Shutdown.
Sep 30 17:39:39 compute-1 systemd[72420]: Finished Exit the Session.
Sep 30 17:39:39 compute-1 systemd[72420]: Reached target Exit the Session.
Sep 30 17:39:39 compute-1 systemd[1]: user@42477.service: Deactivated successfully.
Sep 30 17:39:39 compute-1 systemd[1]: Stopped User Manager for UID 42477.
Sep 30 17:39:39 compute-1 ceph-mon[75484]: 3.7 scrub starts
Sep 30 17:39:39 compute-1 ceph-mon[75484]: 3.7 scrub ok
Sep 30 17:39:39 compute-1 ceph-mon[75484]: 5.9 scrub starts
Sep 30 17:39:39 compute-1 ceph-mon[75484]: 5.9 scrub ok
Sep 30 17:39:39 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Sep 30 17:39:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:39.724+0000 7f856caa5140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rbd_support'
Sep 30 17:39:39 compute-1 systemd[1]: run-user-42477.mount: Deactivated successfully.
Sep 30 17:39:39 compute-1 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Sep 30 17:39:39 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Sep 30 17:39:39 compute-1 systemd[1]: Removed slice User Slice of UID 42477.
Sep 30 17:39:39 compute-1 systemd[1]: user-42477.slice: Consumed 1min 15.975s CPU time.
Sep 30 17:39:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:39.856+0000 7f856caa5140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Sep 30 17:39:39 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'restful'
Sep 30 17:39:40 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rgw'
Sep 30 17:39:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:40.275+0000 7f856caa5140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Sep 30 17:39:40 compute-1 ceph-mgr[75792]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Sep 30 17:39:40 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rook'
Sep 30 17:39:40 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Sep 30 17:39:40 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Sep 30 17:39:40 compute-1 ceph-mon[75484]: 4.0 scrub starts
Sep 30 17:39:40 compute-1 ceph-mon[75484]: 4.0 scrub ok
Sep 30 17:39:40 compute-1 ceph-mon[75484]: 4.9 scrub starts
Sep 30 17:39:40 compute-1 ceph-mon[75484]: 4.9 scrub ok
Sep 30 17:39:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:40.816+0000 7f856caa5140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Sep 30 17:39:40 compute-1 ceph-mgr[75792]: mgr[py] Module rook has missing NOTIFY_TYPES member
Sep 30 17:39:40 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'selftest'
Sep 30 17:39:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:40.884+0000 7f856caa5140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Sep 30 17:39:40 compute-1 ceph-mgr[75792]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Sep 30 17:39:40 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'snap_schedule'
Sep 30 17:39:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:40.964+0000 7f856caa5140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Sep 30 17:39:40 compute-1 ceph-mgr[75792]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Sep 30 17:39:40 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'stats'
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'status'
Sep 30 17:39:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:41.116+0000 7f856caa5140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: mgr[py] Module status has missing NOTIFY_TYPES member
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'telegraf'
Sep 30 17:39:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:41.184+0000 7f856caa5140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'telemetry'
Sep 30 17:39:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:41.335+0000 7f856caa5140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'test_orchestrator'
Sep 30 17:39:41 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Sep 30 17:39:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:41.540+0000 7f856caa5140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'volumes'
Sep 30 17:39:41 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Sep 30 17:39:41 compute-1 ceph-mon[75484]: 4.7 scrub starts
Sep 30 17:39:41 compute-1 ceph-mon[75484]: 4.7 scrub ok
Sep 30 17:39:41 compute-1 ceph-mon[75484]: 4.13 scrub starts
Sep 30 17:39:41 compute-1 ceph-mon[75484]: 4.13 scrub ok
Sep 30 17:39:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:41.799+0000 7f856caa5140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'zabbix'
Sep 30 17:39:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:39:41.869+0000 7f856caa5140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: mgr load Constructed class from module: dashboard
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: [dashboard INFO root] Configured CherryPy, starting engine...
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: [dashboard INFO root] Starting engine...
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: ms_deliver_dispatch: unhandled message 0x558d74f40340 mon_map magic: 0 from mon.1 v2:192.168.122.101:3300/0
Sep 30 17:39:41 compute-1 ceph-mgr[75792]: [dashboard INFO root] Engine started...
Sep 30 17:39:42 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e41 e41: 2 total, 2 up, 2 in
Sep 30 17:39:42 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.6 scrub starts
Sep 30 17:39:42 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.6 scrub ok
Sep 30 17:39:42 compute-1 ceph-mon[75484]: 3.0 scrub starts
Sep 30 17:39:42 compute-1 ceph-mon[75484]: 3.0 scrub ok
Sep 30 17:39:42 compute-1 ceph-mon[75484]: 3.f scrub starts
Sep 30 17:39:42 compute-1 ceph-mon[75484]: 3.f scrub ok
Sep 30 17:39:42 compute-1 ceph-mon[75484]: Standby manager daemon compute-1.glbusf restarted
Sep 30 17:39:42 compute-1 ceph-mon[75484]: Standby manager daemon compute-1.glbusf started
Sep 30 17:39:42 compute-1 ceph-mon[75484]: Active manager daemon compute-0.efvthf restarted
Sep 30 17:39:42 compute-1 ceph-mon[75484]: Activating manager daemon compute-0.efvthf
Sep 30 17:39:42 compute-1 ceph-mon[75484]: osdmap e41: 2 total, 2 up, 2 in
Sep 30 17:39:42 compute-1 ceph-mon[75484]: mgrmap e20: compute-0.efvthf(active, starting, since 0.0247954s), standbys: compute-1.glbusf
Sep 30 17:39:42 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Sep 30 17:39:42 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Sep 30 17:39:42 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mgr metadata", "who": "compute-0.efvthf", "id": "compute-0.efvthf"}]: dispatch
Sep 30 17:39:42 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mgr metadata", "who": "compute-1.glbusf", "id": "compute-1.glbusf"}]: dispatch
Sep 30 17:39:42 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:39:42 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Sep 30 17:39:42 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mds metadata"}]: dispatch
Sep 30 17:39:42 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata"}]: dispatch
Sep 30 17:39:42 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata"}]: dispatch
Sep 30 17:39:42 compute-1 ceph-mon[75484]: Manager daemon compute-0.efvthf is now available
Sep 30 17:39:42 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.efvthf/mirror_snapshot_schedule"}]: dispatch
Sep 30 17:39:42 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.efvthf/trash_purge_schedule"}]: dispatch
Sep 30 17:39:42 compute-1 sshd-session[83019]: Accepted publickey for ceph-admin from 192.168.122.100 port 51802 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:39:42 compute-1 systemd[1]: Created slice User Slice of UID 42477.
Sep 30 17:39:42 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42477...
Sep 30 17:39:42 compute-1 systemd-logind[789]: New session 35 of user ceph-admin.
Sep 30 17:39:43 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42477.
Sep 30 17:39:43 compute-1 systemd[1]: Starting User Manager for UID 42477...
Sep 30 17:39:43 compute-1 systemd[83023]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:39:43 compute-1 systemd[83023]: Queued start job for default target Main User Target.
Sep 30 17:39:43 compute-1 systemd[83023]: Created slice User Application Slice.
Sep 30 17:39:43 compute-1 systemd[83023]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 17:39:43 compute-1 systemd[83023]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 17:39:43 compute-1 systemd[83023]: Reached target Paths.
Sep 30 17:39:43 compute-1 systemd[83023]: Reached target Timers.
Sep 30 17:39:43 compute-1 systemd[83023]: Starting D-Bus User Message Bus Socket...
Sep 30 17:39:43 compute-1 systemd[83023]: Starting Create User's Volatile Files and Directories...
Sep 30 17:39:43 compute-1 systemd[83023]: Listening on D-Bus User Message Bus Socket.
Sep 30 17:39:43 compute-1 systemd[83023]: Reached target Sockets.
Sep 30 17:39:43 compute-1 systemd[83023]: Finished Create User's Volatile Files and Directories.
Sep 30 17:39:43 compute-1 systemd[83023]: Reached target Basic System.
Sep 30 17:39:43 compute-1 systemd[83023]: Reached target Main User Target.
Sep 30 17:39:43 compute-1 systemd[83023]: Startup finished in 165ms.
Sep 30 17:39:43 compute-1 systemd[1]: Started User Manager for UID 42477.
Sep 30 17:39:43 compute-1 systemd[1]: Started Session 35 of User ceph-admin.
Sep 30 17:39:43 compute-1 sshd-session[83019]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:39:43 compute-1 sudo[83040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:39:43 compute-1 sudo[83040]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:43 compute-1 sudo[83040]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:43 compute-1 sudo[83065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 17:39:43 compute-1 sudo[83065]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e2 new map
Sep 30 17:39:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e2 print_map
                                           e2
                                           btime 2025-09-30T17:39:43:520234+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-09-30T17:39:43.520183+0000
                                           modified        2025-09-30T17:39:43.520183+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
Sep 30 17:39:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e42 e42: 2 total, 2 up, 2 in
Sep 30 17:39:43 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.c scrub starts
Sep 30 17:39:43 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.c scrub ok
Sep 30 17:39:43 compute-1 ceph-mon[75484]: 5.6 scrub starts
Sep 30 17:39:43 compute-1 ceph-mon[75484]: 5.6 scrub ok
Sep 30 17:39:43 compute-1 ceph-mon[75484]: 3.a scrub starts
Sep 30 17:39:43 compute-1 ceph-mon[75484]: 3.a scrub ok
Sep 30 17:39:43 compute-1 ceph-mon[75484]: mgrmap e21: compute-0.efvthf(active, since 1.04281s), standbys: compute-1.glbusf
Sep 30 17:39:43 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Sep 30 17:39:43 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Sep 30 17:39:43 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Sep 30 17:39:43 compute-1 ceph-mon[75484]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Sep 30 17:39:43 compute-1 ceph-mon[75484]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Sep 30 17:39:43 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Sep 30 17:39:43 compute-1 ceph-mon[75484]: osdmap e42: 2 total, 2 up, 2 in
Sep 30 17:39:43 compute-1 ceph-mon[75484]: fsmap cephfs:0
Sep 30 17:39:43 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:44 compute-1 podman[83164]: 2025-09-30 17:39:44.025910829 +0000 UTC m=+0.075629781 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Sep 30 17:39:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:39:44 compute-1 podman[83164]: 2025-09-30 17:39:44.133221993 +0000 UTC m=+0.182940895 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:39:44 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.b scrub starts
Sep 30 17:39:44 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.b scrub ok
Sep 30 17:39:44 compute-1 sudo[83065]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:44 compute-1 sudo[83249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:39:44 compute-1 sudo[83249]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:44 compute-1 sudo[83249]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:44 compute-1 sudo[83274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:39:44 compute-1 sudo[83274]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:44 compute-1 ceph-mon[75484]: Saving service mds.cephfs spec with placement compute-0;compute-1
Sep 30 17:39:44 compute-1 ceph-mon[75484]: 5.c scrub starts
Sep 30 17:39:44 compute-1 ceph-mon[75484]: 5.c scrub ok
Sep 30 17:39:44 compute-1 ceph-mon[75484]: 4.a scrub starts
Sep 30 17:39:44 compute-1 ceph-mon[75484]: 4.a scrub ok
Sep 30 17:39:44 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:44 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:44 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:44 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:44 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:45 compute-1 sudo[83274]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:45 compute-1 sudo[83330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:39:45 compute-1 sudo[83330]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:45 compute-1 sudo[83330]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:45 compute-1 sudo[83355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Sep 30 17:39:45 compute-1 sudo[83355]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:45 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.d scrub starts
Sep 30 17:39:45 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.d scrub ok
Sep 30 17:39:45 compute-1 sudo[83355]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:45 compute-1 ceph-mon[75484]: [30/Sep/2025:17:39:44] ENGINE Bus STARTING
Sep 30 17:39:45 compute-1 ceph-mon[75484]: from='client.24125 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 17:39:45 compute-1 ceph-mon[75484]: Saving service mds.cephfs spec with placement compute-0;compute-1
Sep 30 17:39:45 compute-1 ceph-mon[75484]: pgmap v5: 193 pgs: 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:45 compute-1 ceph-mon[75484]: [30/Sep/2025:17:39:44] ENGINE Serving on http://192.168.122.100:8765
Sep 30 17:39:45 compute-1 ceph-mon[75484]: 3.b scrub starts
Sep 30 17:39:45 compute-1 ceph-mon[75484]: 3.b scrub ok
Sep 30 17:39:45 compute-1 ceph-mon[75484]: [30/Sep/2025:17:39:44] ENGINE Serving on https://192.168.122.100:7150
Sep 30 17:39:45 compute-1 ceph-mon[75484]: [30/Sep/2025:17:39:44] ENGINE Bus STARTED
Sep 30 17:39:45 compute-1 ceph-mon[75484]: [30/Sep/2025:17:39:44] ENGINE Client ('192.168.122.100', 42244) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Sep 30 17:39:45 compute-1 ceph-mon[75484]: 4.d scrub starts
Sep 30 17:39:45 compute-1 ceph-mon[75484]: 4.d scrub ok
Sep 30 17:39:45 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:45 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:45 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Sep 30 17:39:45 compute-1 ceph-mon[75484]: mgrmap e22: compute-0.efvthf(active, since 2s), standbys: compute-1.glbusf
Sep 30 17:39:45 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Sep 30 17:39:45 compute-1 sudo[83398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Sep 30 17:39:45 compute-1 sudo[83398]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:45 compute-1 sudo[83398]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:46 compute-1 sudo[83423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph
Sep 30 17:39:46 compute-1 sudo[83423]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:46 compute-1 sudo[83423]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:46 compute-1 sudo[83448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:39:46 compute-1 sudo[83448]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:46 compute-1 sudo[83448]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:46 compute-1 sudo[83473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:39:46 compute-1 sudo[83473]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:46 compute-1 sudo[83473]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:46 compute-1 sudo[83498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:39:46 compute-1 sudo[83498]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:46 compute-1 sudo[83498]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:46 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e43 e43: 2 total, 2 up, 2 in
Sep 30 17:39:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 43 pg[8.0( empty local-lis/les=0/0 n=0 ec=43/43 lis/c=0/0 les/c/f=0/0/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:39:46 compute-1 sudo[83546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:39:46 compute-1 sudo[83546]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:46 compute-1 sudo[83546]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:46 compute-1 sudo[83571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:39:46 compute-1 sudo[83571]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:46 compute-1 sudo[83571]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:46 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.b scrub starts
Sep 30 17:39:46 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.b scrub ok
Sep 30 17:39:46 compute-1 sudo[83596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Sep 30 17:39:46 compute-1 sudo[83596]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:46 compute-1 sudo[83596]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:46 compute-1 sudo[83621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:39:46 compute-1 sudo[83621]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:46 compute-1 sudo[83621]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:46 compute-1 sudo[83646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:39:46 compute-1 sudo[83646]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:46 compute-1 sudo[83646]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:46 compute-1 sudo[83671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:39:46 compute-1 sudo[83671]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:46 compute-1 sudo[83671]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:46 compute-1 ceph-mon[75484]: Adjusting osd_memory_target on compute-0 to 127.8M
Sep 30 17:39:46 compute-1 ceph-mon[75484]: Unable to set osd_memory_target on compute-0 to 134071500: error parsing value: Value '134071500' is below minimum 939524096
Sep 30 17:39:46 compute-1 ceph-mon[75484]: from='client.14394 -' entity='client.admin' cmd=[{"prefix": "nfs cluster create", "cluster_id": "cephfs", "ingress": true, "virtual_ip": "192.168.122.2/24", "ingress_mode": "haproxy-protocol", "placement": "compute-0 compute-1 ", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 17:39:46 compute-1 ceph-mon[75484]: 5.d scrub starts
Sep 30 17:39:46 compute-1 ceph-mon[75484]: 5.d scrub ok
Sep 30 17:39:46 compute-1 ceph-mon[75484]: 4.5 scrub starts
Sep 30 17:39:46 compute-1 ceph-mon[75484]: 4.5 scrub ok
Sep 30 17:39:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Sep 30 17:39:46 compute-1 ceph-mon[75484]: Adjusting osd_memory_target on compute-1 to 127.8M
Sep 30 17:39:46 compute-1 ceph-mon[75484]: Unable to set osd_memory_target on compute-1 to 134071500: error parsing value: Value '134071500' is below minimum 939524096
Sep 30 17:39:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:39:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:39:46 compute-1 ceph-mon[75484]: Updating compute-0:/etc/ceph/ceph.conf
Sep 30 17:39:46 compute-1 ceph-mon[75484]: Updating compute-1:/etc/ceph/ceph.conf
Sep 30 17:39:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Sep 30 17:39:46 compute-1 ceph-mon[75484]: osdmap e43: 2 total, 2 up, 2 in
Sep 30 17:39:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Sep 30 17:39:46 compute-1 sudo[83696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:39:46 compute-1 sudo[83696]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:46 compute-1 sudo[83696]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:46 compute-1 sudo[83721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:39:46 compute-1 sudo[83721]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:46 compute-1 sudo[83721]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:47 compute-1 sudo[83769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:39:47 compute-1 sudo[83769]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:47 compute-1 sudo[83769]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:47 compute-1 sudo[83794]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:39:47 compute-1 sudo[83794]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:47 compute-1 sudo[83794]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:47 compute-1 sudo[83819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:39:47 compute-1 sudo[83819]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:47 compute-1 sudo[83819]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:47 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e44 e44: 2 total, 2 up, 2 in
Sep 30 17:39:47 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 44 pg[8.0( empty local-lis/les=43/44 n=0 ec=43/43 lis/c=0/0 les/c/f=0/0/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:39:47 compute-1 sudo[83844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Sep 30 17:39:47 compute-1 sudo[83844]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:47 compute-1 sudo[83844]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:47 compute-1 sudo[83869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph
Sep 30 17:39:47 compute-1 sudo[83869]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:47 compute-1 sudo[83869]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:47 compute-1 sudo[83894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:39:47 compute-1 sudo[83894]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:47 compute-1 sudo[83894]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:47 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.a scrub starts
Sep 30 17:39:47 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.a scrub ok
Sep 30 17:39:47 compute-1 sudo[83919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:39:47 compute-1 sudo[83919]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:47 compute-1 sudo[83919]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:47 compute-1 sudo[83944]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:39:47 compute-1 sudo[83944]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:47 compute-1 sudo[83944]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:47 compute-1 ceph-mon[75484]: Updating compute-0:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:39:47 compute-1 ceph-mon[75484]: pgmap v7: 194 pgs: 1 unknown, 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:47 compute-1 ceph-mon[75484]: 4.b scrub starts
Sep 30 17:39:47 compute-1 ceph-mon[75484]: 4.b scrub ok
Sep 30 17:39:47 compute-1 ceph-mon[75484]: Updating compute-1:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:39:47 compute-1 ceph-mon[75484]: 5.7 scrub starts
Sep 30 17:39:47 compute-1 ceph-mon[75484]: 5.7 scrub ok
Sep 30 17:39:47 compute-1 ceph-mon[75484]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Sep 30 17:39:47 compute-1 ceph-mon[75484]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Sep 30 17:39:47 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Sep 30 17:39:47 compute-1 ceph-mon[75484]: osdmap e44: 2 total, 2 up, 2 in
Sep 30 17:39:47 compute-1 ceph-mon[75484]: mgrmap e23: compute-0.efvthf(active, since 4s), standbys: compute-1.glbusf
Sep 30 17:39:47 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:47 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:47 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:47 compute-1 sudo[83992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:39:47 compute-1 sudo[83992]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:47 compute-1 sudo[83992]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:47 compute-1 sudo[84017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:39:47 compute-1 sudo[84017]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:47 compute-1 sudo[84017]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:48 compute-1 sudo[84042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Sep 30 17:39:48 compute-1 sudo[84042]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:48 compute-1 sudo[84042]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:48 compute-1 sudo[84067]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:39:48 compute-1 sudo[84067]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:48 compute-1 sudo[84067]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:48 compute-1 sudo[84092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:39:48 compute-1 sudo[84092]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:48 compute-1 sudo[84092]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:48 compute-1 sudo[84117]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:39:48 compute-1 sudo[84117]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:48 compute-1 sudo[84117]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:48 compute-1 sudo[84142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:39:48 compute-1 sudo[84142]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:48 compute-1 sudo[84142]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e45 e45: 2 total, 2 up, 2 in
Sep 30 17:39:48 compute-1 sudo[84167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:39:48 compute-1 sudo[84167]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:48 compute-1 sudo[84167]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:48 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.b scrub starts
Sep 30 17:39:48 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.b scrub ok
Sep 30 17:39:48 compute-1 sudo[84215]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:39:48 compute-1 sudo[84215]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:48 compute-1 sudo[84215]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:48 compute-1 sudo[84240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:39:48 compute-1 sudo[84240]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:48 compute-1 sudo[84240]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:48 compute-1 sudo[84265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring
Sep 30 17:39:48 compute-1 sudo[84265]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:48 compute-1 sudo[84265]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:48 compute-1 ceph-mon[75484]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Sep 30 17:39:48 compute-1 ceph-mon[75484]: Updating compute-0:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring
Sep 30 17:39:48 compute-1 ceph-mon[75484]: Saving service nfs.cephfs spec with placement compute-0;compute-1
Sep 30 17:39:48 compute-1 ceph-mon[75484]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1
Sep 30 17:39:48 compute-1 ceph-mon[75484]: 5.a scrub starts
Sep 30 17:39:48 compute-1 ceph-mon[75484]: 5.a scrub ok
Sep 30 17:39:48 compute-1 ceph-mon[75484]: 5.2 scrub starts
Sep 30 17:39:48 compute-1 ceph-mon[75484]: 5.2 scrub ok
Sep 30 17:39:48 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:48 compute-1 ceph-mon[75484]: Updating compute-1:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring
Sep 30 17:39:48 compute-1 ceph-mon[75484]: osdmap e45: 2 total, 2 up, 2 in
Sep 30 17:39:48 compute-1 ceph-mon[75484]: 5.b scrub starts
Sep 30 17:39:48 compute-1 ceph-mon[75484]: 5.b scrub ok
Sep 30 17:39:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:39:49 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Sep 30 17:39:49 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Sep 30 17:39:49 compute-1 ceph-mon[75484]: pgmap v10: 194 pgs: 1 unknown, 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:39:49 compute-1 ceph-mon[75484]: 3.d scrub starts
Sep 30 17:39:49 compute-1 ceph-mon[75484]: 3.d scrub ok
Sep 30 17:39:49 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:49 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:49 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:49 compute-1 ceph-mon[75484]: Deploying daemon node-exporter.compute-0 on compute-0
Sep 30 17:39:49 compute-1 ceph-mon[75484]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Sep 30 17:39:49 compute-1 ceph-mon[75484]: mgrmap e24: compute-0.efvthf(active, since 6s), standbys: compute-1.glbusf
Sep 30 17:39:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1719518272' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Sep 30 17:39:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1719518272' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Sep 30 17:39:49 compute-1 ceph-mon[75484]: 5.8 scrub starts
Sep 30 17:39:49 compute-1 ceph-mon[75484]: 5.8 scrub ok
Sep 30 17:39:50 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.17 scrub starts
Sep 30 17:39:50 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.17 scrub ok
Sep 30 17:39:50 compute-1 ceph-mon[75484]: 3.3 deep-scrub starts
Sep 30 17:39:50 compute-1 ceph-mon[75484]: 3.3 deep-scrub ok
Sep 30 17:39:50 compute-1 ceph-mon[75484]: 4.17 scrub starts
Sep 30 17:39:50 compute-1 ceph-mon[75484]: 4.17 scrub ok
Sep 30 17:39:51 compute-1 sudo[84290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:39:51 compute-1 sudo[84290]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:51 compute-1 sudo[84290]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:51 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Sep 30 17:39:51 compute-1 sudo[84315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/prometheus/node-exporter:v1.7.0 --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:39:51 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Sep 30 17:39:51 compute-1 sudo[84315]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:52 compute-1 ceph-mon[75484]: pgmap v11: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 29 KiB/s rd, 0 B/s wr, 12 op/s
Sep 30 17:39:52 compute-1 ceph-mon[75484]: 5.15 deep-scrub starts
Sep 30 17:39:52 compute-1 ceph-mon[75484]: 5.15 deep-scrub ok
Sep 30 17:39:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1765872891' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Sep 30 17:39:52 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:52 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:52 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:52 compute-1 ceph-mon[75484]: 3.c scrub starts
Sep 30 17:39:52 compute-1 ceph-mon[75484]: 3.c scrub ok
Sep 30 17:39:52 compute-1 ceph-mon[75484]: 4.16 scrub starts
Sep 30 17:39:52 compute-1 ceph-mon[75484]: 4.16 scrub ok
Sep 30 17:39:52 compute-1 systemd[1]: Reloading.
Sep 30 17:39:52 compute-1 systemd-rc-local-generator[84404]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:39:52 compute-1 systemd-sysv-generator[84409]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:39:52 compute-1 systemd[1]: Reloading.
Sep 30 17:39:52 compute-1 systemd-sysv-generator[84455]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:39:52 compute-1 systemd-rc-local-generator[84451]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:39:52 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Sep 30 17:39:52 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Sep 30 17:39:52 compute-1 systemd[1]: Starting Ceph node-exporter.compute-1 for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:39:52 compute-1 bash[84505]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Sep 30 17:39:53 compute-1 ceph-mon[75484]: Deploying daemon node-exporter.compute-1 on compute-1
Sep 30 17:39:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2559034079' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 17:39:53 compute-1 ceph-mon[75484]: 5.17 scrub starts
Sep 30 17:39:53 compute-1 ceph-mon[75484]: 5.1 scrub starts
Sep 30 17:39:53 compute-1 ceph-mon[75484]: 5.17 scrub ok
Sep 30 17:39:53 compute-1 ceph-mon[75484]: 5.1 scrub ok
Sep 30 17:39:53 compute-1 bash[84505]: Getting image source signatures
Sep 30 17:39:53 compute-1 bash[84505]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Sep 30 17:39:53 compute-1 bash[84505]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Sep 30 17:39:53 compute-1 bash[84505]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Sep 30 17:39:53 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Sep 30 17:39:53 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Sep 30 17:39:54 compute-1 bash[84505]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Sep 30 17:39:54 compute-1 bash[84505]: Writing manifest to image destination
Sep 30 17:39:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:39:54 compute-1 ceph-mon[75484]: pgmap v12: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 29 KiB/s rd, 0 B/s wr, 11 op/s
Sep 30 17:39:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/659304667' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Sep 30 17:39:54 compute-1 ceph-mon[75484]: 5.4 scrub starts
Sep 30 17:39:54 compute-1 ceph-mon[75484]: 3.12 scrub starts
Sep 30 17:39:54 compute-1 ceph-mon[75484]: 5.4 scrub ok
Sep 30 17:39:54 compute-1 ceph-mon[75484]: 3.12 scrub ok
Sep 30 17:39:54 compute-1 podman[84505]: 2025-09-30 17:39:54.112111 +0000 UTC m=+1.259749589 container create 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 17:39:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d33423474cdf8768b8c7a522dee38b33a6b53f4d7e02b070a55f7ae60c1897d/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Sep 30 17:39:54 compute-1 podman[84505]: 2025-09-30 17:39:54.086251995 +0000 UTC m=+1.233890564 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Sep 30 17:39:54 compute-1 podman[84505]: 2025-09-30 17:39:54.186491816 +0000 UTC m=+1.334130445 container init 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 17:39:54 compute-1 podman[84505]: 2025-09-30 17:39:54.19544046 +0000 UTC m=+1.343079049 container start 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 17:39:54 compute-1 bash[84505]: 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.207Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.207Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.209Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.209Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.210Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.210Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.211Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.211Z caller=node_exporter.go:117 level=info collector=arp
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.211Z caller=node_exporter.go:117 level=info collector=bcache
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.211Z caller=node_exporter.go:117 level=info collector=bonding
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.211Z caller=node_exporter.go:117 level=info collector=btrfs
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.211Z caller=node_exporter.go:117 level=info collector=conntrack
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.211Z caller=node_exporter.go:117 level=info collector=cpu
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.211Z caller=node_exporter.go:117 level=info collector=cpufreq
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.211Z caller=node_exporter.go:117 level=info collector=diskstats
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.211Z caller=node_exporter.go:117 level=info collector=dmi
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.211Z caller=node_exporter.go:117 level=info collector=edac
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.211Z caller=node_exporter.go:117 level=info collector=entropy
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=fibrechannel
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=filefd
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=filesystem
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=hwmon
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=infiniband
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=ipvs
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=loadavg
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=mdadm
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=meminfo
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=netclass
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=netdev
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=netstat
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=nfs
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=nfsd
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=nvme
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=os
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=pressure
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=rapl
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=schedstat
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=selinux
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=sockstat
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=softnet
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=stat
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=tapestats
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=textfile
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=thermal_zone
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=time
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=udp_queues
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=uname
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=vmstat
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=xfs
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.212Z caller=node_exporter.go:117 level=info collector=zfs
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.213Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Sep 30 17:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1[84583]: ts=2025-09-30T17:39:54.213Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Sep 30 17:39:54 compute-1 systemd[1]: Started Ceph node-exporter.compute-1 for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:39:54 compute-1 sudo[84315]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:54 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Sep 30 17:39:54 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Sep 30 17:39:55 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:55 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:55 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:55 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:55 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:39:55 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:39:55 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:39:55 compute-1 ceph-mon[75484]: pgmap v13: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 22 KiB/s rd, 0 B/s wr, 9 op/s
Sep 30 17:39:55 compute-1 ceph-mon[75484]: 5.14 scrub starts
Sep 30 17:39:55 compute-1 ceph-mon[75484]: 5.14 scrub ok
Sep 30 17:39:55 compute-1 ceph-mon[75484]: 3.5 scrub starts
Sep 30 17:39:55 compute-1 ceph-mon[75484]: 3.5 scrub ok
Sep 30 17:39:55 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Sep 30 17:39:55 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Sep 30 17:39:56 compute-1 ceph-mon[75484]: from='client.14418 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Sep 30 17:39:56 compute-1 ceph-mon[75484]: 4.14 scrub starts
Sep 30 17:39:56 compute-1 ceph-mon[75484]: 4.14 scrub ok
Sep 30 17:39:56 compute-1 ceph-mon[75484]: 3.9 scrub starts
Sep 30 17:39:56 compute-1 ceph-mon[75484]: 3.9 scrub ok
Sep 30 17:39:56 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Sep 30 17:39:56 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Sep 30 17:39:57 compute-1 ceph-mon[75484]: pgmap v14: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 0 B/s wr, 7 op/s
Sep 30 17:39:57 compute-1 ceph-mon[75484]: 5.12 scrub starts
Sep 30 17:39:57 compute-1 ceph-mon[75484]: 5.12 scrub ok
Sep 30 17:39:57 compute-1 ceph-mon[75484]: 5.f scrub starts
Sep 30 17:39:57 compute-1 ceph-mon[75484]: 5.f scrub ok
Sep 30 17:39:57 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Sep 30 17:39:57 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Sep 30 17:39:58 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Sep 30 17:39:58 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Sep 30 17:39:58 compute-1 ceph-mon[75484]: from='client.14422 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Sep 30 17:39:58 compute-1 ceph-mon[75484]: 4.12 scrub starts
Sep 30 17:39:58 compute-1 ceph-mon[75484]: 4.12 scrub ok
Sep 30 17:39:58 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:58 compute-1 ceph-mon[75484]: 4.1 scrub starts
Sep 30 17:39:58 compute-1 ceph-mon[75484]: 4.1 scrub ok
Sep 30 17:39:58 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:58 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:58 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.csizwd", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Sep 30 17:39:58 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.csizwd", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Sep 30 17:39:58 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:39:58 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:39:58 compute-1 sudo[84592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:39:58 compute-1 sudo[84592]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:58 compute-1 sudo[84592]: pam_unix(sudo:session): session closed for user root
Sep 30 17:39:58 compute-1 sudo[84617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:39:58 compute-1 sudo[84617]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:39:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:39:59 compute-1 podman[84684]: 2025-09-30 17:39:59.325066547 +0000 UTC m=+0.071133919 container create af913881f8e57d3cd717f872a2336994c61c2fcb71716fc6f1d351ffc88e5a80 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hofstadter, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325)
Sep 30 17:39:59 compute-1 podman[84684]: 2025-09-30 17:39:59.295101411 +0000 UTC m=+0.041168843 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:39:59 compute-1 systemd[1]: Started libpod-conmon-af913881f8e57d3cd717f872a2336994c61c2fcb71716fc6f1d351ffc88e5a80.scope.
Sep 30 17:39:59 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:39:59 compute-1 podman[84684]: 2025-09-30 17:39:59.448702125 +0000 UTC m=+0.194769557 container init af913881f8e57d3cd717f872a2336994c61c2fcb71716fc6f1d351ffc88e5a80 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hofstadter, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True)
Sep 30 17:39:59 compute-1 podman[84684]: 2025-09-30 17:39:59.45808261 +0000 UTC m=+0.204149952 container start af913881f8e57d3cd717f872a2336994c61c2fcb71716fc6f1d351ffc88e5a80 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hofstadter, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:39:59 compute-1 podman[84684]: 2025-09-30 17:39:59.461517134 +0000 UTC m=+0.207584536 container attach af913881f8e57d3cd717f872a2336994c61c2fcb71716fc6f1d351ffc88e5a80 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hofstadter, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:39:59 compute-1 systemd[1]: libpod-af913881f8e57d3cd717f872a2336994c61c2fcb71716fc6f1d351ffc88e5a80.scope: Deactivated successfully.
Sep 30 17:39:59 compute-1 hungry_hofstadter[84700]: 167 167
Sep 30 17:39:59 compute-1 conmon[84700]: conmon af913881f8e57d3cd717 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-af913881f8e57d3cd717f872a2336994c61c2fcb71716fc6f1d351ffc88e5a80.scope/container/memory.events
Sep 30 17:39:59 compute-1 podman[84684]: 2025-09-30 17:39:59.467820386 +0000 UTC m=+0.213887758 container died af913881f8e57d3cd717f872a2336994c61c2fcb71716fc6f1d351ffc88e5a80 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hofstadter, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:39:59 compute-1 systemd[1]: var-lib-containers-storage-overlay-314911106d1a2a46d86d269a189fff879dca42df6f58e1b2179814a8429e6c7d-merged.mount: Deactivated successfully.
Sep 30 17:39:59 compute-1 podman[84684]: 2025-09-30 17:39:59.514269631 +0000 UTC m=+0.260336973 container remove af913881f8e57d3cd717f872a2336994c61c2fcb71716fc6f1d351ffc88e5a80 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hofstadter, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 17:39:59 compute-1 systemd[1]: libpod-conmon-af913881f8e57d3cd717f872a2336994c61c2fcb71716fc6f1d351ffc88e5a80.scope: Deactivated successfully.
Sep 30 17:39:59 compute-1 systemd[1]: Reloading.
Sep 30 17:39:59 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.11 deep-scrub starts
Sep 30 17:39:59 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.11 deep-scrub ok
Sep 30 17:39:59 compute-1 ceph-mon[75484]: pgmap v15: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 0 B/s wr, 7 op/s
Sep 30 17:39:59 compute-1 ceph-mon[75484]: Deploying daemon rgw.rgw.compute-1.csizwd on compute-1
Sep 30 17:39:59 compute-1 ceph-mon[75484]: 5.13 scrub starts
Sep 30 17:39:59 compute-1 ceph-mon[75484]: 5.13 scrub ok
Sep 30 17:39:59 compute-1 ceph-mon[75484]: 5.e scrub starts
Sep 30 17:39:59 compute-1 ceph-mon[75484]: 5.e scrub ok
Sep 30 17:39:59 compute-1 systemd-sysv-generator[84747]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:39:59 compute-1 systemd-rc-local-generator[84743]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:39:59 compute-1 systemd[1]: Reloading.
Sep 30 17:39:59 compute-1 systemd-rc-local-generator[84779]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:39:59 compute-1 systemd-sysv-generator[84785]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:40:00 compute-1 systemd[1]: Starting Ceph rgw.rgw.compute-1.csizwd for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:40:00 compute-1 podman[84844]: 2025-09-30 17:40:00.494072631 +0000 UTC m=+0.064824267 container create 19bcb4b8f3d778b7a18a7d236044fed0f56fc5d48f74dc81e69aec903cea78ed (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-rgw-rgw-compute-1-csizwd, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:40:00 compute-1 podman[84844]: 2025-09-30 17:40:00.461992647 +0000 UTC m=+0.032744333 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:40:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/811e926e6bab05171e836e42ca2e84be170236cc4dcba8cbdfbcc5c1d07703f8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/811e926e6bab05171e836e42ca2e84be170236cc4dcba8cbdfbcc5c1d07703f8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/811e926e6bab05171e836e42ca2e84be170236cc4dcba8cbdfbcc5c1d07703f8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/811e926e6bab05171e836e42ca2e84be170236cc4dcba8cbdfbcc5c1d07703f8/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.csizwd supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:00 compute-1 podman[84844]: 2025-09-30 17:40:00.578285025 +0000 UTC m=+0.149036711 container init 19bcb4b8f3d778b7a18a7d236044fed0f56fc5d48f74dc81e69aec903cea78ed (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-rgw-rgw-compute-1-csizwd, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:40:00 compute-1 podman[84844]: 2025-09-30 17:40:00.58800581 +0000 UTC m=+0.158757446 container start 19bcb4b8f3d778b7a18a7d236044fed0f56fc5d48f74dc81e69aec903cea78ed (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-rgw-rgw-compute-1-csizwd, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Sep 30 17:40:00 compute-1 bash[84844]: 19bcb4b8f3d778b7a18a7d236044fed0f56fc5d48f74dc81e69aec903cea78ed
Sep 30 17:40:00 compute-1 systemd[1]: Started Ceph rgw.rgw.compute-1.csizwd for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:40:00 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Sep 30 17:40:00 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Sep 30 17:40:00 compute-1 ceph-mon[75484]: from='client.14426 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Sep 30 17:40:00 compute-1 ceph-mon[75484]: 4.11 deep-scrub starts
Sep 30 17:40:00 compute-1 ceph-mon[75484]: 4.11 deep-scrub ok
Sep 30 17:40:00 compute-1 ceph-mon[75484]: 4.c scrub starts
Sep 30 17:40:00 compute-1 ceph-mon[75484]: 4.c scrub ok
Sep 30 17:40:00 compute-1 ceph-mon[75484]: Health detail: HEALTH_ERR 1 OSD(s) experiencing slow operations in BlueStore; 1 filesystem is offline; 1 filesystem is online with fewer MDS than max_mds
Sep 30 17:40:00 compute-1 ceph-mon[75484]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Sep 30 17:40:00 compute-1 ceph-mon[75484]:      osd.1 observed slow operation indications in BlueStore
Sep 30 17:40:00 compute-1 ceph-mon[75484]: [ERR] MDS_ALL_DOWN: 1 filesystem is offline
Sep 30 17:40:00 compute-1 ceph-mon[75484]:     fs cephfs is offline because no MDS is active for it.
Sep 30 17:40:00 compute-1 ceph-mon[75484]: [WRN] MDS_UP_LESS_THAN_MAX: 1 filesystem is online with fewer MDS than max_mds
Sep 30 17:40:00 compute-1 ceph-mon[75484]:     fs cephfs has 0 MDS online, but wants 1
Sep 30 17:40:00 compute-1 radosgw[84864]: deferred set uid:gid to 167:167 (ceph:ceph)
Sep 30 17:40:00 compute-1 radosgw[84864]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Sep 30 17:40:00 compute-1 radosgw[84864]: framework: beast
Sep 30 17:40:00 compute-1 radosgw[84864]: framework conf key: endpoint, val: 192.168.122.101:8082
Sep 30 17:40:00 compute-1 radosgw[84864]: init_numa not setting numa affinity
Sep 30 17:40:00 compute-1 sudo[84617]: pam_unix(sudo:session): session closed for user root
Sep 30 17:40:01 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Sep 30 17:40:01 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Sep 30 17:40:01 compute-1 ceph-mon[75484]: pgmap v16: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 15 KiB/s rd, 0 B/s wr, 6 op/s
Sep 30 17:40:01 compute-1 ceph-mon[75484]: from='client.14430 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Sep 30 17:40:01 compute-1 ceph-mon[75484]: 4.10 scrub starts
Sep 30 17:40:01 compute-1 ceph-mon[75484]: 4.10 scrub ok
Sep 30 17:40:01 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:01 compute-1 ceph-mon[75484]: 4.e scrub starts
Sep 30 17:40:01 compute-1 ceph-mon[75484]: 4.e scrub ok
Sep 30 17:40:01 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:01 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:01 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.mewauo", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Sep 30 17:40:01 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.mewauo", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Sep 30 17:40:01 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:01 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:40:01 compute-1 ceph-mon[75484]: Deploying daemon rgw.rgw.compute-0.mewauo on compute-0
Sep 30 17:40:01 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e46 e46: 2 total, 2 up, 2 in
Sep 30 17:40:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 46 pg[9.0( empty local-lis/les=0/0 n=0 ec=46/46 lis/c=0/0 les/c/f=0/0/0 sis=46) [1] r=0 lpr=46 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:01 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0)
Sep 30 17:40:01 compute-1 ceph-mon[75484]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/423346448' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Sep 30 17:40:02 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.1e deep-scrub starts
Sep 30 17:40:02 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 4.1e deep-scrub ok
Sep 30 17:40:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e47 e47: 2 total, 2 up, 2 in
Sep 30 17:40:02 compute-1 ceph-mon[75484]: 5.1e scrub starts
Sep 30 17:40:02 compute-1 ceph-mon[75484]: 5.1e scrub ok
Sep 30 17:40:02 compute-1 ceph-mon[75484]: 3.1a deep-scrub starts
Sep 30 17:40:02 compute-1 ceph-mon[75484]: 3.1a deep-scrub ok
Sep 30 17:40:02 compute-1 ceph-mon[75484]: osdmap e46: 2 total, 2 up, 2 in
Sep 30 17:40:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/423346448' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Sep 30 17:40:02 compute-1 ceph-mon[75484]: from='client.? ' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Sep 30 17:40:02 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:02 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:02 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:02 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:02 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:02 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.vrwlru", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Sep 30 17:40:02 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.vrwlru", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Sep 30 17:40:02 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:40:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2117407937' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Sep 30 17:40:02 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:02 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 47 pg[9.0( empty local-lis/les=46/47 n=0 ec=46/46 lis/c=0/0 les/c/f=0/0/0 sis=46) [1] r=0 lpr=46 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:02 compute-1 radosgw[84864]: rgw main: failed to create zone with (17) File exists
Sep 30 17:40:02 compute-1 radosgw[84864]: rgw main: failed to create zonegroup with (17) File exists
Sep 30 17:40:03 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Sep 30 17:40:03 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Sep 30 17:40:03 compute-1 ceph-mon[75484]: Saving service rgw.rgw spec with placement compute-0;compute-1
Sep 30 17:40:03 compute-1 ceph-mon[75484]: Deploying daemon mds.cephfs.compute-0.vrwlru on compute-0
Sep 30 17:40:03 compute-1 ceph-mon[75484]: pgmap v18: 195 pgs: 1 unknown, 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:40:03 compute-1 ceph-mon[75484]: 4.1e deep-scrub starts
Sep 30 17:40:03 compute-1 ceph-mon[75484]: 4.1e deep-scrub ok
Sep 30 17:40:03 compute-1 ceph-mon[75484]: 3.1d scrub starts
Sep 30 17:40:03 compute-1 ceph-mon[75484]: from='client.? ' entity='client.rgw.rgw.compute-1.csizwd' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Sep 30 17:40:03 compute-1 ceph-mon[75484]: osdmap e47: 2 total, 2 up, 2 in
Sep 30 17:40:03 compute-1 ceph-mon[75484]: 3.1d scrub ok
Sep 30 17:40:03 compute-1 ceph-mon[75484]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Sep 30 17:40:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e48 e48: 2 total, 2 up, 2 in
Sep 30 17:40:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Sep 30 17:40:03 compute-1 ceph-mon[75484]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/3982366829' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Sep 30 17:40:03 compute-1 sudo[85451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:40:03 compute-1 sudo[85451]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:40:03 compute-1 sudo[85451]: pam_unix(sudo:session): session closed for user root
Sep 30 17:40:04 compute-1 sudo[85476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:40:04 compute-1 sudo[85476]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:40:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e48 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:40:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e3 new map
Sep 30 17:40:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e3 print_map
                                           e3
                                           btime 2025-09-30T17:40:04:127296+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        3
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-09-30T17:39:43.520183+0000
                                           modified        2025-09-30T17:40:04.127286+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=14458}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                           [mds.cephfs.compute-0.vrwlru{0:14458} state up:creating seq 1 addr [v2:192.168.122.100:6806/311246388,v1:192.168.122.100:6807/311246388] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Sep 30 17:40:04 compute-1 podman[85542]: 2025-09-30 17:40:04.497246202 +0000 UTC m=+0.045698076 container create 9f0fa7e14988d7316607708ff1d01770c03861ac6ac26c33690bbc7f1cb3616a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_stonebraker, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:40:04 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Sep 30 17:40:04 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Sep 30 17:40:04 compute-1 systemd[1]: Started libpod-conmon-9f0fa7e14988d7316607708ff1d01770c03861ac6ac26c33690bbc7f1cb3616a.scope.
Sep 30 17:40:04 compute-1 podman[85542]: 2025-09-30 17:40:04.481207045 +0000 UTC m=+0.029658919 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:40:04 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:40:04 compute-1 podman[85542]: 2025-09-30 17:40:04.601776189 +0000 UTC m=+0.150228083 container init 9f0fa7e14988d7316607708ff1d01770c03861ac6ac26c33690bbc7f1cb3616a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_stonebraker, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Sep 30 17:40:04 compute-1 podman[85542]: 2025-09-30 17:40:04.612587244 +0000 UTC m=+0.161039118 container start 9f0fa7e14988d7316607708ff1d01770c03861ac6ac26c33690bbc7f1cb3616a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_stonebraker, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS)
Sep 30 17:40:04 compute-1 podman[85542]: 2025-09-30 17:40:04.616686886 +0000 UTC m=+0.165138780 container attach 9f0fa7e14988d7316607708ff1d01770c03861ac6ac26c33690bbc7f1cb3616a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_stonebraker, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Sep 30 17:40:04 compute-1 amazing_stonebraker[85558]: 167 167
Sep 30 17:40:04 compute-1 systemd[1]: libpod-9f0fa7e14988d7316607708ff1d01770c03861ac6ac26c33690bbc7f1cb3616a.scope: Deactivated successfully.
Sep 30 17:40:04 compute-1 conmon[85558]: conmon 9f0fa7e14988d7316607 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9f0fa7e14988d7316607708ff1d01770c03861ac6ac26c33690bbc7f1cb3616a.scope/container/memory.events
Sep 30 17:40:04 compute-1 podman[85542]: 2025-09-30 17:40:04.622859204 +0000 UTC m=+0.171311088 container died 9f0fa7e14988d7316607708ff1d01770c03861ac6ac26c33690bbc7f1cb3616a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_stonebraker, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 17:40:04 compute-1 systemd[1]: var-lib-containers-storage-overlay-4022c3ce7c31dd69f6fb1d9de85b02f92b33ad0b25cfe141457b2d88217e926a-merged.mount: Deactivated successfully.
Sep 30 17:40:04 compute-1 podman[85542]: 2025-09-30 17:40:04.67225896 +0000 UTC m=+0.220710864 container remove 9f0fa7e14988d7316607708ff1d01770c03861ac6ac26c33690bbc7f1cb3616a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_stonebraker, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 17:40:04 compute-1 systemd[1]: libpod-conmon-9f0fa7e14988d7316607708ff1d01770c03861ac6ac26c33690bbc7f1cb3616a.scope: Deactivated successfully.
Sep 30 17:40:04 compute-1 systemd[1]: Reloading.
Sep 30 17:40:04 compute-1 systemd-rc-local-generator[85603]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:40:04 compute-1 systemd-sysv-generator[85607]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:40:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e49 e49: 2 total, 2 up, 2 in
Sep 30 17:40:04 compute-1 ceph-mon[75484]: 2.15 scrub starts
Sep 30 17:40:04 compute-1 ceph-mon[75484]: 2.15 scrub ok
Sep 30 17:40:04 compute-1 ceph-mon[75484]: 4.1a deep-scrub starts
Sep 30 17:40:04 compute-1 ceph-mon[75484]: 4.1a deep-scrub ok
Sep 30 17:40:04 compute-1 ceph-mon[75484]: osdmap e48: 2 total, 2 up, 2 in
Sep 30 17:40:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3220603395' entity='client.rgw.rgw.compute-0.mewauo' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Sep 30 17:40:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3982366829' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Sep 30 17:40:04 compute-1 ceph-mon[75484]: from='client.? ' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Sep 30 17:40:04 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:04 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:04 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:04 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.wibdub", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Sep 30 17:40:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2089148134' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Sep 30 17:40:04 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.wibdub", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Sep 30 17:40:04 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:40:04 compute-1 ceph-mon[75484]: Deploying daemon mds.cephfs.compute-1.wibdub on compute-1
Sep 30 17:40:04 compute-1 ceph-mon[75484]: daemon mds.cephfs.compute-0.vrwlru assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Sep 30 17:40:04 compute-1 ceph-mon[75484]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Sep 30 17:40:04 compute-1 ceph-mon[75484]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Sep 30 17:40:04 compute-1 ceph-mon[75484]: mds.? [v2:192.168.122.100:6806/311246388,v1:192.168.122.100:6807/311246388] up:boot
Sep 30 17:40:04 compute-1 ceph-mon[75484]: fsmap cephfs:1 {0=cephfs.compute-0.vrwlru=up:creating}
Sep 30 17:40:04 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.vrwlru"}]: dispatch
Sep 30 17:40:04 compute-1 ceph-mon[75484]: daemon mds.cephfs.compute-0.vrwlru is now active in filesystem cephfs as rank 0
Sep 30 17:40:05 compute-1 systemd[1]: Reloading.
Sep 30 17:40:05 compute-1 systemd-rc-local-generator[85641]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:40:05 compute-1 systemd-sysv-generator[85646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:40:05 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e4 new map
Sep 30 17:40:05 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e4 print_map
                                           e4
                                           btime 2025-09-30T17:40:05:134309+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-09-30T17:39:43.520183+0000
                                           modified        2025-09-30T17:40:05.134306+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=14458}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 14458 members: 14458
                                           [mds.cephfs.compute-0.vrwlru{0:14458} state up:active seq 2 addr [v2:192.168.122.100:6806/311246388,v1:192.168.122.100:6807/311246388] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Sep 30 17:40:05 compute-1 systemd[1]: Starting Ceph mds.cephfs.compute-1.wibdub for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:40:05 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.e scrub starts
Sep 30 17:40:05 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.e scrub ok
Sep 30 17:40:05 compute-1 podman[85705]: 2025-09-30 17:40:05.6670827 +0000 UTC m=+0.056218073 container create fad9a5d85a2e742fef6a7498b7acdf75b69c2bc096128e191965e17eb6c95151 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mds-cephfs-compute-1-wibdub, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:40:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a38b248f6b7d2d92e9fa76cdf6ed2a980004c7e1aaf4d3dcc80b9932a7a43a4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a38b248f6b7d2d92e9fa76cdf6ed2a980004c7e1aaf4d3dcc80b9932a7a43a4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a38b248f6b7d2d92e9fa76cdf6ed2a980004c7e1aaf4d3dcc80b9932a7a43a4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a38b248f6b7d2d92e9fa76cdf6ed2a980004c7e1aaf4d3dcc80b9932a7a43a4/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.wibdub supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:05 compute-1 podman[85705]: 2025-09-30 17:40:05.636368033 +0000 UTC m=+0.025503446 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:40:05 compute-1 podman[85705]: 2025-09-30 17:40:05.745410654 +0000 UTC m=+0.134546057 container init fad9a5d85a2e742fef6a7498b7acdf75b69c2bc096128e191965e17eb6c95151 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mds-cephfs-compute-1-wibdub, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Sep 30 17:40:05 compute-1 podman[85705]: 2025-09-30 17:40:05.757250656 +0000 UTC m=+0.146386019 container start fad9a5d85a2e742fef6a7498b7acdf75b69c2bc096128e191965e17eb6c95151 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mds-cephfs-compute-1-wibdub, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:40:05 compute-1 bash[85705]: fad9a5d85a2e742fef6a7498b7acdf75b69c2bc096128e191965e17eb6c95151
Sep 30 17:40:05 compute-1 systemd[1]: Started Ceph mds.cephfs.compute-1.wibdub for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:40:05 compute-1 ceph-mds[85725]: set uid:gid to 167:167 (ceph:ceph)
Sep 30 17:40:05 compute-1 ceph-mds[85725]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Sep 30 17:40:05 compute-1 ceph-mds[85725]: main not setting numa affinity
Sep 30 17:40:05 compute-1 ceph-mds[85725]: pidfile_write: ignore empty --pid-file
Sep 30 17:40:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mds-cephfs-compute-1-wibdub[85721]: starting mds.cephfs.compute-1.wibdub at 
Sep 30 17:40:05 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Updating MDS map to version 4 from mon.1
Sep 30 17:40:05 compute-1 sudo[85476]: pam_unix(sudo:session): session closed for user root
Sep 30 17:40:05 compute-1 ceph-mon[75484]: pgmap v21: 196 pgs: 1 unknown, 195 active+clean; 450 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 3.0 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Sep 30 17:40:05 compute-1 ceph-mon[75484]: 2.19 scrub starts
Sep 30 17:40:05 compute-1 ceph-mon[75484]: 2.19 scrub ok
Sep 30 17:40:05 compute-1 ceph-mon[75484]: 5.1a scrub starts
Sep 30 17:40:05 compute-1 ceph-mon[75484]: 5.1a scrub ok
Sep 30 17:40:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3220603395' entity='client.rgw.rgw.compute-0.mewauo' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Sep 30 17:40:05 compute-1 ceph-mon[75484]: from='client.? ' entity='client.rgw.rgw.compute-1.csizwd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Sep 30 17:40:05 compute-1 ceph-mon[75484]: osdmap e49: 2 total, 2 up, 2 in
Sep 30 17:40:05 compute-1 ceph-mon[75484]: mds.? [v2:192.168.122.100:6806/311246388,v1:192.168.122.100:6807/311246388] up:active
Sep 30 17:40:05 compute-1 ceph-mon[75484]: fsmap cephfs:1 {0=cephfs.compute-0.vrwlru=up:active}
Sep 30 17:40:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2030319904' entity='client.admin' cmd=[{"prefix": "osd get-require-min-compat-client"}]: dispatch
Sep 30 17:40:05 compute-1 ceph-mon[75484]: 2.e scrub starts
Sep 30 17:40:05 compute-1 ceph-mon[75484]: 2.e scrub ok
Sep 30 17:40:05 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e50 e50: 2 total, 2 up, 2 in
Sep 30 17:40:05 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Sep 30 17:40:05 compute-1 ceph-mon[75484]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/3982366829' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Sep 30 17:40:05 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 50 pg[11.0( empty local-lis/les=0/0 n=0 ec=50/50 lis/c=0/0 les/c/f=0/0/0 sis=50) [1] r=0 lpr=50 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:06 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e5 new map
Sep 30 17:40:06 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e5 print_map
                                           e5
                                           btime 2025-09-30T17:40:06:167091+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-09-30T17:39:43.520183+0000
                                           modified        2025-09-30T17:40:05.134306+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=14458}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 14458 members: 14458
                                           [mds.cephfs.compute-0.vrwlru{0:14458} state up:active seq 2 addr [v2:192.168.122.100:6806/311246388,v1:192.168.122.100:6807/311246388] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-1.wibdub{-1:24145} state up:standby seq 1 addr [v2:192.168.122.101:6804/1687530668,v1:192.168.122.101:6805/1687530668] compat {c=[1],r=[1],i=[1fff]}]
Sep 30 17:40:06 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Updating MDS map to version 5 from mon.1
Sep 30 17:40:06 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Monitors have assigned me to become a standby
Sep 30 17:40:06 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e6 new map
Sep 30 17:40:06 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e6 print_map
                                           e6
                                           btime 2025-09-30T17:40:06:178958+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-09-30T17:39:43.520183+0000
                                           modified        2025-09-30T17:40:05.134306+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=14458}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 14458 members: 14458
                                           [mds.cephfs.compute-0.vrwlru{0:14458} state up:active seq 2 addr [v2:192.168.122.100:6806/311246388,v1:192.168.122.100:6807/311246388] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-1.wibdub{-1:24145} state up:standby seq 1 addr [v2:192.168.122.101:6804/1687530668,v1:192.168.122.101:6805/1687530668] compat {c=[1],r=[1],i=[1fff]}]
Sep 30 17:40:06 compute-1 sudo[85744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:40:06 compute-1 sudo[85744]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:40:06 compute-1 sudo[85744]: pam_unix(sudo:session): session closed for user root
Sep 30 17:40:06 compute-1 sudo[85770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:40:06 compute-1 sudo[85770]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:40:06 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.d scrub starts
Sep 30 17:40:06 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.d scrub ok
Sep 30 17:40:06 compute-1 podman[85837]: 2025-09-30 17:40:06.876215895 +0000 UTC m=+0.065176113 container create 1c3e2def36c492a4caac7b6777dab455f7f9668ae6a573e5080cbbc6d9879da8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_kepler, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Sep 30 17:40:06 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e51 e51: 2 total, 2 up, 2 in
Sep 30 17:40:06 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 51 pg[11.0( empty local-lis/les=50/51 n=0 ec=50/50 lis/c=0/0 les/c/f=0/0/0 sis=50) [1] r=0 lpr=50 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:06 compute-1 ceph-mon[75484]: 3.1c scrub starts
Sep 30 17:40:06 compute-1 ceph-mon[75484]: 3.1c scrub ok
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:06 compute-1 ceph-mon[75484]: osdmap e50: 2 total, 2 up, 2 in
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3220603395' entity='client.rgw.rgw.compute-0.mewauo' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3982366829' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='client.? ' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:06 compute-1 ceph-mon[75484]: Creating key for client.nfs.cephfs.0.0.compute-1.bsnzkg
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bsnzkg", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bsnzkg", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Sep 30 17:40:06 compute-1 ceph-mon[75484]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bsnzkg-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bsnzkg-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:40:06 compute-1 ceph-mon[75484]: mds.? [v2:192.168.122.101:6804/1687530668,v1:192.168.122.101:6805/1687530668] up:boot
Sep 30 17:40:06 compute-1 ceph-mon[75484]: fsmap cephfs:1 {0=cephfs.compute-0.vrwlru=up:active} 1 up:standby
Sep 30 17:40:06 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.wibdub"}]: dispatch
Sep 30 17:40:06 compute-1 ceph-mon[75484]: fsmap cephfs:1 {0=cephfs.compute-0.vrwlru=up:active} 1 up:standby
Sep 30 17:40:06 compute-1 ceph-mon[75484]: 2.d scrub starts
Sep 30 17:40:06 compute-1 ceph-mon[75484]: 2.d scrub ok
Sep 30 17:40:06 compute-1 systemd[1]: Started libpod-conmon-1c3e2def36c492a4caac7b6777dab455f7f9668ae6a573e5080cbbc6d9879da8.scope.
Sep 30 17:40:06 compute-1 podman[85837]: 2025-09-30 17:40:06.842990513 +0000 UTC m=+0.031950791 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:40:06 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:40:06 compute-1 podman[85837]: 2025-09-30 17:40:06.985591094 +0000 UTC m=+0.174551352 container init 1c3e2def36c492a4caac7b6777dab455f7f9668ae6a573e5080cbbc6d9879da8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_kepler, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Sep 30 17:40:07 compute-1 podman[85837]: 2025-09-30 17:40:06.999838411 +0000 UTC m=+0.188798589 container start 1c3e2def36c492a4caac7b6777dab455f7f9668ae6a573e5080cbbc6d9879da8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_kepler, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0)
Sep 30 17:40:07 compute-1 podman[85837]: 2025-09-30 17:40:07.003397318 +0000 UTC m=+0.192357576 container attach 1c3e2def36c492a4caac7b6777dab455f7f9668ae6a573e5080cbbc6d9879da8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_kepler, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Sep 30 17:40:07 compute-1 festive_kepler[85853]: 167 167
Sep 30 17:40:07 compute-1 systemd[1]: libpod-1c3e2def36c492a4caac7b6777dab455f7f9668ae6a573e5080cbbc6d9879da8.scope: Deactivated successfully.
Sep 30 17:40:07 compute-1 conmon[85853]: conmon 1c3e2def36c492a4caac <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-1c3e2def36c492a4caac7b6777dab455f7f9668ae6a573e5080cbbc6d9879da8.scope/container/memory.events
Sep 30 17:40:07 compute-1 podman[85837]: 2025-09-30 17:40:07.011162178 +0000 UTC m=+0.200122396 container died 1c3e2def36c492a4caac7b6777dab455f7f9668ae6a573e5080cbbc6d9879da8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_kepler, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Sep 30 17:40:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-87bc066dce3c80b29cf66a461a3a2d9f1c1f72b26ceaca9395db094cbe4235be-merged.mount: Deactivated successfully.
Sep 30 17:40:07 compute-1 podman[85837]: 2025-09-30 17:40:07.059058719 +0000 UTC m=+0.248018937 container remove 1c3e2def36c492a4caac7b6777dab455f7f9668ae6a573e5080cbbc6d9879da8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_kepler, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 17:40:07 compute-1 systemd[1]: libpod-conmon-1c3e2def36c492a4caac7b6777dab455f7f9668ae6a573e5080cbbc6d9879da8.scope: Deactivated successfully.
Sep 30 17:40:07 compute-1 systemd[1]: Reloading.
Sep 30 17:40:07 compute-1 systemd-rc-local-generator[85893]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:40:07 compute-1 systemd-sysv-generator[85896]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:40:07 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Sep 30 17:40:07 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Sep 30 17:40:07 compute-1 systemd[1]: Reloading.
Sep 30 17:40:07 compute-1 systemd-sysv-generator[85936]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:40:07 compute-1 systemd-rc-local-generator[85930]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:40:07 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:40:07 compute-1 ceph-mon[75484]: Rados config object exists: conf-nfs.cephfs
Sep 30 17:40:07 compute-1 ceph-mon[75484]: Creating key for client.nfs.cephfs.0.0.compute-1.bsnzkg-rgw
Sep 30 17:40:07 compute-1 ceph-mon[75484]: Bind address in nfs.cephfs.0.0.compute-1.bsnzkg's ganesha conf is defaulting to empty
Sep 30 17:40:07 compute-1 ceph-mon[75484]: Deploying daemon nfs.cephfs.0.0.compute-1.bsnzkg on compute-1
Sep 30 17:40:07 compute-1 ceph-mon[75484]: pgmap v24: 197 pgs: 2 unknown, 195 active+clean; 450 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 4.5 KiB/s rd, 2.0 KiB/s wr, 8 op/s
Sep 30 17:40:07 compute-1 ceph-mon[75484]: 5.1b deep-scrub starts
Sep 30 17:40:07 compute-1 ceph-mon[75484]: 5.1b deep-scrub ok
Sep 30 17:40:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3220603395' entity='client.rgw.rgw.compute-0.mewauo' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Sep 30 17:40:07 compute-1 ceph-mon[75484]: from='client.? ' entity='client.rgw.rgw.compute-1.csizwd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Sep 30 17:40:07 compute-1 ceph-mon[75484]: osdmap e51: 2 total, 2 up, 2 in
Sep 30 17:40:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/654377670' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Sep 30 17:40:07 compute-1 ceph-mon[75484]: 2.10 scrub starts
Sep 30 17:40:07 compute-1 ceph-mon[75484]: 2.10 scrub ok
Sep 30 17:40:07 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e52 e52: 2 total, 2 up, 2 in
Sep 30 17:40:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Sep 30 17:40:07 compute-1 ceph-mon[75484]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/3982366829' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Sep 30 17:40:08 compute-1 podman[85994]: 2025-09-30 17:40:08.132253323 +0000 UTC m=+0.051060498 container create a2c50f357f6fb33c1e67fb0f3db8a2d1059e8e7cde2694b4c0415d513f83b70b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:40:08 compute-1 podman[85994]: 2025-09-30 17:40:08.11225939 +0000 UTC m=+0.031066545 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:40:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2cda0b17b54b5bce24502617a6d6723de9dd36b9e2d0a1d7df85cd62cc7b0d9/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2cda0b17b54b5bce24502617a6d6723de9dd36b9e2d0a1d7df85cd62cc7b0d9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2cda0b17b54b5bce24502617a6d6723de9dd36b9e2d0a1d7df85cd62cc7b0d9/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2cda0b17b54b5bce24502617a6d6723de9dd36b9e2d0a1d7df85cd62cc7b0d9/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:08 compute-1 podman[85994]: 2025-09-30 17:40:08.258581633 +0000 UTC m=+0.177388858 container init a2c50f357f6fb33c1e67fb0f3db8a2d1059e8e7cde2694b4c0415d513f83b70b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 17:40:08 compute-1 podman[85994]: 2025-09-30 17:40:08.268541974 +0000 UTC m=+0.187349149 container start a2c50f357f6fb33c1e67fb0f3db8a2d1059e8e7cde2694b4c0415d513f83b70b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default)
Sep 30 17:40:08 compute-1 bash[85994]: a2c50f357f6fb33c1e67fb0f3db8a2d1059e8e7cde2694b4c0415d513f83b70b
Sep 30 17:40:08 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 17:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 17:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 17:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 17:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 17:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 17:40:08 compute-1 sudo[85770]: pam_unix(sudo:session): session closed for user root
Sep 30 17:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 17:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Sep 30 17:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Sep 30 17:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:40:08 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.c scrub starts
Sep 30 17:40:08 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.c scrub ok
Sep 30 17:40:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e53 e53: 2 total, 2 up, 2 in
Sep 30 17:40:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e7 new map
Sep 30 17:40:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e7 print_map
                                           e7
                                           btime 2025-09-30T17:40:08:924517+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        7
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-09-30T17:39:43.520183+0000
                                           modified        2025-09-30T17:40:08.164750+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=14458}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 14458 members: 14458
                                           [mds.cephfs.compute-0.vrwlru{0:14458} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.100:6806/311246388,v1:192.168.122.100:6807/311246388] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-1.wibdub{-1:24145} state up:standby seq 1 addr [v2:192.168.122.101:6804/1687530668,v1:192.168.122.101:6805/1687530668] compat {c=[1],r=[1],i=[1fff]}]
Sep 30 17:40:08 compute-1 ceph-mon[75484]: 5.1c deep-scrub starts
Sep 30 17:40:08 compute-1 ceph-mon[75484]: 5.1c deep-scrub ok
Sep 30 17:40:08 compute-1 ceph-mon[75484]: osdmap e52: 2 total, 2 up, 2 in
Sep 30 17:40:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3220603395' entity='client.rgw.rgw.compute-0.mewauo' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Sep 30 17:40:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3982366829' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Sep 30 17:40:08 compute-1 ceph-mon[75484]: from='client.? ' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Sep 30 17:40:08 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:08 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:08 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:08 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-0.syzvbh", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Sep 30 17:40:08 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-0.syzvbh", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Sep 30 17:40:08 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Sep 30 17:40:08 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Sep 30 17:40:08 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:40:08 compute-1 ceph-mon[75484]: 2.c scrub starts
Sep 30 17:40:08 compute-1 ceph-mon[75484]: 2.c scrub ok
Sep 30 17:40:08 compute-1 ceph-mon[75484]: 5.18 scrub starts
Sep 30 17:40:08 compute-1 ceph-mon[75484]: 5.18 scrub ok
Sep 30 17:40:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Sep 30 17:40:08 compute-1 ceph-mon[75484]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/3982366829' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Sep 30 17:40:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:40:09 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Sep 30 17:40:09 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Sep 30 17:40:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e54 e54: 2 total, 2 up, 2 in
Sep 30 17:40:09 compute-1 ceph-mon[75484]: Creating key for client.nfs.cephfs.1.0.compute-0.syzvbh
Sep 30 17:40:09 compute-1 ceph-mon[75484]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Sep 30 17:40:09 compute-1 ceph-mon[75484]: pgmap v27: 198 pgs: 3 unknown, 195 active+clean; 450 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:40:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3220603395' entity='client.rgw.rgw.compute-0.mewauo' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Sep 30 17:40:09 compute-1 ceph-mon[75484]: from='client.? ' entity='client.rgw.rgw.compute-1.csizwd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Sep 30 17:40:09 compute-1 ceph-mon[75484]: osdmap e53: 2 total, 2 up, 2 in
Sep 30 17:40:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3220603395' entity='client.rgw.rgw.compute-0.mewauo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Sep 30 17:40:09 compute-1 ceph-mon[75484]: mds.? [v2:192.168.122.100:6806/311246388,v1:192.168.122.100:6807/311246388] up:active
Sep 30 17:40:09 compute-1 ceph-mon[75484]: fsmap cephfs:1 {0=cephfs.compute-0.vrwlru=up:active} 1 up:standby
Sep 30 17:40:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3982366829' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Sep 30 17:40:09 compute-1 ceph-mon[75484]: from='client.? ' entity='client.rgw.rgw.compute-1.csizwd' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Sep 30 17:40:09 compute-1 ceph-mon[75484]: 2.13 scrub starts
Sep 30 17:40:09 compute-1 ceph-mon[75484]: 2.13 scrub ok
Sep 30 17:40:09 compute-1 ceph-mon[75484]: 4.1b scrub starts
Sep 30 17:40:09 compute-1 ceph-mon[75484]: 4.1b scrub ok
Sep 30 17:40:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3220603395' entity='client.rgw.rgw.compute-0.mewauo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Sep 30 17:40:09 compute-1 ceph-mon[75484]: from='client.? ' entity='client.rgw.rgw.compute-1.csizwd' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Sep 30 17:40:09 compute-1 ceph-mon[75484]: osdmap e54: 2 total, 2 up, 2 in
Sep 30 17:40:10 compute-1 radosgw[84864]: v1 topic migration: starting v1 topic migration..
Sep 30 17:40:10 compute-1 radosgw[84864]: LDAP not started since no server URIs were provided in the configuration.
Sep 30 17:40:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-rgw-rgw-compute-1-csizwd[84860]: 2025-09-30T17:40:10.172+0000 7fdb902fd980 -1 LDAP not started since no server URIs were provided in the configuration.
Sep 30 17:40:10 compute-1 radosgw[84864]: v1 topic migration: finished v1 topic migration
Sep 30 17:40:10 compute-1 radosgw[84864]: framework: beast
Sep 30 17:40:10 compute-1 radosgw[84864]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Sep 30 17:40:10 compute-1 radosgw[84864]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Sep 30 17:40:10 compute-1 radosgw[84864]: starting handler: beast
Sep 30 17:40:10 compute-1 radosgw[84864]: set uid:gid to 167:167 (ceph:ceph)
Sep 30 17:40:10 compute-1 radosgw[84864]: mgrc service_daemon_register rgw.24137 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.csizwd,kernel_description=#1 SMP PREEMPT_DYNAMIC Mon Sep 15 21:46:13 UTC 2025,kernel_version=5.14.0-617.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864116,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=cf56360a-b45c-423e-ade6-44203ee5bb4f,zone_name=default,zonegroup_id=db97795f-33ec-4db5-9ea0-da39adf34835,zonegroup_name=default}
Sep 30 17:40:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e8 new map
Sep 30 17:40:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).mds e8 print_map
                                           e8
                                           btime 2025-09-30T17:40:10:422879+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        7
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-09-30T17:39:43.520183+0000
                                           modified        2025-09-30T17:40:08.164750+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=14458}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 14458 members: 14458
                                           [mds.cephfs.compute-0.vrwlru{0:14458} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.100:6806/311246388,v1:192.168.122.100:6807/311246388] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-1.wibdub{-1:24145} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/1687530668,v1:192.168.122.101:6805/1687530668] compat {c=[1],r=[1],i=[1fff]}]
Sep 30 17:40:10 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Updating MDS map to version 8 from mon.1
Sep 30 17:40:10 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.1 scrub starts
Sep 30 17:40:10 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 2.1 scrub ok
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000001:nfs.cephfs.0: -2
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 17:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:11 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:40:11 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.1b scrub starts
Sep 30 17:40:11 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.1b scrub ok
Sep 30 17:40:11 compute-1 ceph-mon[75484]: mds.? [v2:192.168.122.101:6804/1687530668,v1:192.168.122.101:6805/1687530668] up:standby
Sep 30 17:40:11 compute-1 ceph-mon[75484]: fsmap cephfs:1 {0=cephfs.compute-0.vrwlru=up:active} 1 up:standby
Sep 30 17:40:11 compute-1 ceph-mon[75484]: 2.1 scrub starts
Sep 30 17:40:11 compute-1 ceph-mon[75484]: 2.1 scrub ok
Sep 30 17:40:11 compute-1 ceph-mon[75484]: pgmap v30: 198 pgs: 1 creating+peering, 197 active+clean; 453 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 4.2 KiB/s rd, 6.2 KiB/s wr, 19 op/s
Sep 30 17:40:12 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.18 scrub starts
Sep 30 17:40:12 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.18 scrub ok
Sep 30 17:40:12 compute-1 ceph-mon[75484]: 4.18 scrub starts
Sep 30 17:40:12 compute-1 ceph-mon[75484]: 4.18 scrub ok
Sep 30 17:40:12 compute-1 ceph-mon[75484]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Sep 30 17:40:12 compute-1 ceph-mon[75484]: 6.1b scrub starts
Sep 30 17:40:12 compute-1 ceph-mon[75484]: 6.1b scrub ok
Sep 30 17:40:12 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Sep 30 17:40:12 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Sep 30 17:40:12 compute-1 ceph-mon[75484]: Rados config object exists: conf-nfs.cephfs
Sep 30 17:40:12 compute-1 ceph-mon[75484]: Creating key for client.nfs.cephfs.1.0.compute-0.syzvbh-rgw
Sep 30 17:40:12 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-0.syzvbh-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Sep 30 17:40:12 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-0.syzvbh-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Sep 30 17:40:12 compute-1 ceph-mon[75484]: Bind address in nfs.cephfs.1.0.compute-0.syzvbh's ganesha conf is defaulting to empty
Sep 30 17:40:12 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:40:12 compute-1 ceph-mon[75484]: Deploying daemon nfs.cephfs.1.0.compute-0.syzvbh on compute-0
Sep 30 17:40:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:13 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:40:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:13 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:40:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:13 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:40:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:13 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:40:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:13 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:40:13 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.1b scrub starts
Sep 30 17:40:13 compute-1 sudo[86098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:40:13 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.1b scrub ok
Sep 30 17:40:13 compute-1 sudo[86098]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:40:13 compute-1 sudo[86098]: pam_unix(sudo:session): session closed for user root
Sep 30 17:40:13 compute-1 sudo[86123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/haproxy:2.3 --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:40:13 compute-1 sudo[86123]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:40:13 compute-1 ceph-mon[75484]: 7.1f scrub starts
Sep 30 17:40:13 compute-1 ceph-mon[75484]: 7.1f scrub ok
Sep 30 17:40:13 compute-1 ceph-mon[75484]: pgmap v31: 198 pgs: 1 creating+peering, 197 active+clean; 453 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 3.0 KiB/s rd, 4.4 KiB/s wr, 14 op/s
Sep 30 17:40:13 compute-1 ceph-mon[75484]: 7.18 scrub starts
Sep 30 17:40:13 compute-1 ceph-mon[75484]: 7.18 scrub ok
Sep 30 17:40:13 compute-1 ceph-mon[75484]: 6.1e scrub starts
Sep 30 17:40:13 compute-1 ceph-mon[75484]: 6.1e scrub ok
Sep 30 17:40:13 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:13 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:13 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:13 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:13 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:40:14 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.1e scrub starts
Sep 30 17:40:14 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.1e scrub ok
Sep 30 17:40:14 compute-1 ceph-mon[75484]: Deploying daemon haproxy.nfs.cephfs.compute-1.iacknv on compute-1
Sep 30 17:40:14 compute-1 ceph-mon[75484]: 7.1b scrub starts
Sep 30 17:40:14 compute-1 ceph-mon[75484]: 7.1b scrub ok
Sep 30 17:40:15 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.18 scrub starts
Sep 30 17:40:15 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.18 scrub ok
Sep 30 17:40:15 compute-1 ceph-mon[75484]: 7.1c deep-scrub starts
Sep 30 17:40:15 compute-1 ceph-mon[75484]: 7.1c deep-scrub ok
Sep 30 17:40:15 compute-1 ceph-mon[75484]: pgmap v32: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 169 KiB/s rd, 10 KiB/s wr, 332 op/s
Sep 30 17:40:15 compute-1 ceph-mon[75484]: 7.1e scrub starts
Sep 30 17:40:15 compute-1 ceph-mon[75484]: 7.1e scrub ok
Sep 30 17:40:16 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.1f scrub starts
Sep 30 17:40:16 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.1f scrub ok
Sep 30 17:40:16 compute-1 ceph-mon[75484]: 6.1c deep-scrub starts
Sep 30 17:40:16 compute-1 ceph-mon[75484]: 6.1c deep-scrub ok
Sep 30 17:40:16 compute-1 ceph-mon[75484]: 6.18 scrub starts
Sep 30 17:40:16 compute-1 ceph-mon[75484]: 6.18 scrub ok
Sep 30 17:40:17 compute-1 podman[86190]: 2025-09-30 17:40:17.032688859 +0000 UTC m=+2.905129163 container create f74d1b936c4ec8483b6c2839d8b889927dffc0430a0120e70f6d50467e1f1591 (image=quay.io/ceph/haproxy:2.3, name=confident_boyd)
Sep 30 17:40:17 compute-1 podman[86190]: 2025-09-30 17:40:17.011315258 +0000 UTC m=+2.883755562 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Sep 30 17:40:17 compute-1 systemd[1]: Started libpod-conmon-f74d1b936c4ec8483b6c2839d8b889927dffc0430a0120e70f6d50467e1f1591.scope.
Sep 30 17:40:17 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:40:17 compute-1 podman[86190]: 2025-09-30 17:40:17.134932935 +0000 UTC m=+3.007373329 container init f74d1b936c4ec8483b6c2839d8b889927dffc0430a0120e70f6d50467e1f1591 (image=quay.io/ceph/haproxy:2.3, name=confident_boyd)
Sep 30 17:40:17 compute-1 podman[86190]: 2025-09-30 17:40:17.144725911 +0000 UTC m=+3.017166245 container start f74d1b936c4ec8483b6c2839d8b889927dffc0430a0120e70f6d50467e1f1591 (image=quay.io/ceph/haproxy:2.3, name=confident_boyd)
Sep 30 17:40:17 compute-1 podman[86190]: 2025-09-30 17:40:17.148858463 +0000 UTC m=+3.021298847 container attach f74d1b936c4ec8483b6c2839d8b889927dffc0430a0120e70f6d50467e1f1591 (image=quay.io/ceph/haproxy:2.3, name=confident_boyd)
Sep 30 17:40:17 compute-1 confident_boyd[86306]: 0 0
Sep 30 17:40:17 compute-1 systemd[1]: libpod-f74d1b936c4ec8483b6c2839d8b889927dffc0430a0120e70f6d50467e1f1591.scope: Deactivated successfully.
Sep 30 17:40:17 compute-1 podman[86190]: 2025-09-30 17:40:17.152561784 +0000 UTC m=+3.025002118 container died f74d1b936c4ec8483b6c2839d8b889927dffc0430a0120e70f6d50467e1f1591 (image=quay.io/ceph/haproxy:2.3, name=confident_boyd)
Sep 30 17:40:17 compute-1 systemd[1]: var-lib-containers-storage-overlay-37767063f3a989c98507f65c029f571a6412db52f5bbce2495907fff82a76d7f-merged.mount: Deactivated successfully.
Sep 30 17:40:17 compute-1 podman[86190]: 2025-09-30 17:40:17.206039406 +0000 UTC m=+3.078479740 container remove f74d1b936c4ec8483b6c2839d8b889927dffc0430a0120e70f6d50467e1f1591 (image=quay.io/ceph/haproxy:2.3, name=confident_boyd)
Sep 30 17:40:17 compute-1 systemd[1]: libpod-conmon-f74d1b936c4ec8483b6c2839d8b889927dffc0430a0120e70f6d50467e1f1591.scope: Deactivated successfully.
Sep 30 17:40:17 compute-1 systemd[1]: Reloading.
Sep 30 17:40:17 compute-1 systemd-rc-local-generator[86354]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:40:17 compute-1 systemd-sysv-generator[86358]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:40:17 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.c scrub starts
Sep 30 17:40:17 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.c scrub ok
Sep 30 17:40:17 compute-1 systemd[1]: Reloading.
Sep 30 17:40:17 compute-1 systemd-rc-local-generator[86394]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:40:17 compute-1 systemd-sysv-generator[86397]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:40:17 compute-1 ceph-mon[75484]: 7.12 scrub starts
Sep 30 17:40:17 compute-1 ceph-mon[75484]: 7.12 scrub ok
Sep 30 17:40:17 compute-1 ceph-mon[75484]: 6.1f scrub starts
Sep 30 17:40:17 compute-1 ceph-mon[75484]: 6.1f scrub ok
Sep 30 17:40:17 compute-1 ceph-mon[75484]: pgmap v33: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 139 KiB/s rd, 8.2 KiB/s wr, 273 op/s
Sep 30 17:40:17 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:17 compute-1 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-1.iacknv for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:40:18 compute-1 podman[86451]: 2025-09-30 17:40:18.369023827 +0000 UTC m=+0.078168724 container create 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 17:40:18 compute-1 podman[86451]: 2025-09-30 17:40:18.333835471 +0000 UTC m=+0.042980428 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Sep 30 17:40:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18c0d78b38bca83346f149607af55cddbc2bbe72b133f751827736c2c79dfbe6/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:18 compute-1 podman[86451]: 2025-09-30 17:40:18.448003322 +0000 UTC m=+0.157148259 container init 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 17:40:18 compute-1 podman[86451]: 2025-09-30 17:40:18.456662287 +0000 UTC m=+0.165807174 container start 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 17:40:18 compute-1 bash[86451]: 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842
Sep 30 17:40:18 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.1 deep-scrub starts
Sep 30 17:40:18 compute-1 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-1.iacknv for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:40:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [NOTICE] 272/174018 (2) : New worker #1 (4) forked
Sep 30 17:40:18 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.1 deep-scrub ok
Sep 30 17:40:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:18 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69e0000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:18 compute-1 sudo[86123]: pam_unix(sudo:session): session closed for user root
Sep 30 17:40:18 compute-1 ceph-mon[75484]: 6.12 deep-scrub starts
Sep 30 17:40:18 compute-1 ceph-mon[75484]: 6.12 deep-scrub ok
Sep 30 17:40:18 compute-1 ceph-mon[75484]: 6.c scrub starts
Sep 30 17:40:18 compute-1 ceph-mon[75484]: 6.c scrub ok
Sep 30 17:40:18 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:18 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:18 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:40:19 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.6 deep-scrub starts
Sep 30 17:40:19 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.6 deep-scrub ok
Sep 30 17:40:19 compute-1 ceph-mon[75484]: 6.17 deep-scrub starts
Sep 30 17:40:19 compute-1 ceph-mon[75484]: 6.17 deep-scrub ok
Sep 30 17:40:19 compute-1 ceph-mon[75484]: 6.1 deep-scrub starts
Sep 30 17:40:19 compute-1 ceph-mon[75484]: 6.1 deep-scrub ok
Sep 30 17:40:19 compute-1 ceph-mon[75484]: pgmap v34: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 114 KiB/s rd, 4.3 KiB/s wr, 220 op/s
Sep 30 17:40:19 compute-1 ceph-mon[75484]: Deploying daemon haproxy.nfs.cephfs.compute-0.jcdnha on compute-0
Sep 30 17:40:19 compute-1 ceph-mon[75484]: 6.6 deep-scrub starts
Sep 30 17:40:19 compute-1 ceph-mon[75484]: 6.6 deep-scrub ok
Sep 30 17:40:20 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Sep 30 17:40:20 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Sep 30 17:40:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:20 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8000fb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:20 compute-1 ceph-mon[75484]: 7.11 scrub starts
Sep 30 17:40:20 compute-1 ceph-mon[75484]: 7.11 scrub ok
Sep 30 17:40:20 compute-1 ceph-mon[75484]: 7.6 scrub starts
Sep 30 17:40:20 compute-1 ceph-mon[75484]: 7.6 scrub ok
Sep 30 17:40:21 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Sep 30 17:40:21 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Sep 30 17:40:21 compute-1 ceph-mon[75484]: 7.16 scrub starts
Sep 30 17:40:21 compute-1 ceph-mon[75484]: 7.16 scrub ok
Sep 30 17:40:21 compute-1 ceph-mon[75484]: pgmap v35: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 104 KiB/s rd, 3.8 KiB/s wr, 198 op/s
Sep 30 17:40:21 compute-1 ceph-mon[75484]: 7.17 scrub starts
Sep 30 17:40:21 compute-1 ceph-mon[75484]: 7.17 scrub ok
Sep 30 17:40:21 compute-1 ceph-mon[75484]: 6.4 scrub starts
Sep 30 17:40:21 compute-1 ceph-mon[75484]: 6.4 scrub ok
Sep 30 17:40:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:22 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8000fb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:22 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.0 scrub starts
Sep 30 17:40:22 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.0 scrub ok
Sep 30 17:40:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:22 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:23 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Sep 30 17:40:23 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Sep 30 17:40:23 compute-1 ceph-mon[75484]: 7.15 scrub starts
Sep 30 17:40:23 compute-1 ceph-mon[75484]: 7.15 scrub ok
Sep 30 17:40:23 compute-1 ceph-mon[75484]: 6.0 scrub starts
Sep 30 17:40:23 compute-1 ceph-mon[75484]: 6.0 scrub ok
Sep 30 17:40:23 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:23 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:23 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:23 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:23 compute-1 ceph-mon[75484]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Sep 30 17:40:23 compute-1 ceph-mon[75484]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Sep 30 17:40:23 compute-1 ceph-mon[75484]: pgmap v36: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 91 KiB/s rd, 3.3 KiB/s wr, 175 op/s
Sep 30 17:40:23 compute-1 ceph-mon[75484]: Deploying daemon keepalived.nfs.cephfs.compute-0.miadhc on compute-0
Sep 30 17:40:23 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:40:24 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.2 deep-scrub starts
Sep 30 17:40:24 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.2 deep-scrub ok
Sep 30 17:40:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:24 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69d0001230 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:24 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b8000d00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:24 compute-1 ceph-mon[75484]: 6.15 scrub starts
Sep 30 17:40:24 compute-1 ceph-mon[75484]: 6.15 scrub ok
Sep 30 17:40:24 compute-1 ceph-mon[75484]: 7.3 scrub starts
Sep 30 17:40:24 compute-1 ceph-mon[75484]: 7.3 scrub ok
Sep 30 17:40:24 compute-1 ceph-mon[75484]: 6.a scrub starts
Sep 30 17:40:24 compute-1 ceph-mon[75484]: 6.a scrub ok
Sep 30 17:40:24 compute-1 ceph-mon[75484]: 7.2 deep-scrub starts
Sep 30 17:40:24 compute-1 ceph-mon[75484]: 7.2 deep-scrub ok
Sep 30 17:40:25 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Sep 30 17:40:25 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Sep 30 17:40:26 compute-1 ceph-mon[75484]: pgmap v37: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 91 KiB/s rd, 3.2 KiB/s wr, 174 op/s
Sep 30 17:40:26 compute-1 ceph-mon[75484]: 6.8 scrub starts
Sep 30 17:40:26 compute-1 ceph-mon[75484]: 6.8 scrub ok
Sep 30 17:40:26 compute-1 ceph-mon[75484]: 7.4 scrub starts
Sep 30 17:40:26 compute-1 ceph-mon[75484]: 7.4 scrub ok
Sep 30 17:40:26 compute-1 ceph-mon[75484]: 6.7 scrub starts
Sep 30 17:40:26 compute-1 ceph-mon[75484]: 6.7 scrub ok
Sep 30 17:40:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:26 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:26 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.e scrub starts
Sep 30 17:40:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:26 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:26 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.e scrub ok
Sep 30 17:40:27 compute-1 ceph-mon[75484]: 7.e scrub starts
Sep 30 17:40:27 compute-1 ceph-mon[75484]: 7.e scrub ok
Sep 30 17:40:27 compute-1 ceph-mon[75484]: 7.5 scrub starts
Sep 30 17:40:27 compute-1 ceph-mon[75484]: 7.5 scrub ok
Sep 30 17:40:27 compute-1 sudo[86483]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:40:27 compute-1 sudo[86483]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:40:27 compute-1 sudo[86483]: pam_unix(sudo:session): session closed for user root
Sep 30 17:40:27 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.f scrub starts
Sep 30 17:40:27 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.f scrub ok
Sep 30 17:40:27 compute-1 sudo[86508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/keepalived:2.2.4 --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:40:27 compute-1 sudo[86508]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:40:28 compute-1 ceph-mon[75484]: pgmap v38: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:40:28 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:28 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:28 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:28 compute-1 ceph-mon[75484]: 7.f scrub starts
Sep 30 17:40:28 compute-1 ceph-mon[75484]: 7.f scrub ok
Sep 30 17:40:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:28 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69d0001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:28 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b8001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:28 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.f scrub starts
Sep 30 17:40:28 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.f scrub ok
Sep 30 17:40:29 compute-1 ceph-mon[75484]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Sep 30 17:40:29 compute-1 ceph-mon[75484]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Sep 30 17:40:29 compute-1 ceph-mon[75484]: Deploying daemon keepalived.nfs.cephfs.compute-1.zmigik on compute-1
Sep 30 17:40:29 compute-1 ceph-mon[75484]: 6.5 scrub starts
Sep 30 17:40:29 compute-1 ceph-mon[75484]: 6.5 scrub ok
Sep 30 17:40:29 compute-1 ceph-mon[75484]: 6.f scrub starts
Sep 30 17:40:29 compute-1 ceph-mon[75484]: 6.f scrub ok
Sep 30 17:40:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:40:29 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Sep 30 17:40:29 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Sep 30 17:40:30 compute-1 ceph-mon[75484]: pgmap v39: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:40:30 compute-1 ceph-mon[75484]: 7.0 scrub starts
Sep 30 17:40:30 compute-1 ceph-mon[75484]: 7.0 scrub ok
Sep 30 17:40:30 compute-1 ceph-mon[75484]: 7.8 scrub starts
Sep 30 17:40:30 compute-1 ceph-mon[75484]: 7.8 scrub ok
Sep 30 17:40:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:30 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:30 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:30 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Sep 30 17:40:30 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Sep 30 17:40:31 compute-1 podman[86575]: 2025-09-30 17:40:31.037015053 +0000 UTC m=+3.024269166 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Sep 30 17:40:31 compute-1 podman[86575]: 2025-09-30 17:40:31.058735853 +0000 UTC m=+3.045989926 container create 907c1b2bd8aca577dd02a99d9d6ed5b4ff8ddcb811a64feed08e2856d7595a2e (image=quay.io/ceph/keepalived:2.2.4, name=kind_hofstadter, io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, distribution-scope=public, release=1793, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2023-02-22T09:23:20, vcs-type=git, com.redhat.component=keepalived-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived)
Sep 30 17:40:31 compute-1 ceph-mon[75484]: 7.7 scrub starts
Sep 30 17:40:31 compute-1 ceph-mon[75484]: 7.7 scrub ok
Sep 30 17:40:31 compute-1 ceph-mon[75484]: 6.9 scrub starts
Sep 30 17:40:31 compute-1 ceph-mon[75484]: 6.9 scrub ok
Sep 30 17:40:31 compute-1 systemd[1]: Started libpod-conmon-907c1b2bd8aca577dd02a99d9d6ed5b4ff8ddcb811a64feed08e2856d7595a2e.scope.
Sep 30 17:40:31 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:40:31 compute-1 podman[86575]: 2025-09-30 17:40:31.17055999 +0000 UTC m=+3.157814113 container init 907c1b2bd8aca577dd02a99d9d6ed5b4ff8ddcb811a64feed08e2856d7595a2e (image=quay.io/ceph/keepalived:2.2.4, name=kind_hofstadter, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, name=keepalived, description=keepalived for Ceph, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, distribution-scope=public, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 17:40:31 compute-1 podman[86575]: 2025-09-30 17:40:31.183389858 +0000 UTC m=+3.170643961 container start 907c1b2bd8aca577dd02a99d9d6ed5b4ff8ddcb811a64feed08e2856d7595a2e (image=quay.io/ceph/keepalived:2.2.4, name=kind_hofstadter, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, name=keepalived, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, io.openshift.expose-services=, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, distribution-scope=public, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Sep 30 17:40:31 compute-1 podman[86575]: 2025-09-30 17:40:31.187096609 +0000 UTC m=+3.174350752 container attach 907c1b2bd8aca577dd02a99d9d6ed5b4ff8ddcb811a64feed08e2856d7595a2e (image=quay.io/ceph/keepalived:2.2.4, name=kind_hofstadter, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, release=1793, io.openshift.tags=Ceph keepalived, vcs-type=git, name=keepalived, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=keepalived for Ceph, com.redhat.component=keepalived-container, version=2.2.4, distribution-scope=public)
Sep 30 17:40:31 compute-1 kind_hofstadter[86672]: 0 0
Sep 30 17:40:31 compute-1 systemd[1]: libpod-907c1b2bd8aca577dd02a99d9d6ed5b4ff8ddcb811a64feed08e2856d7595a2e.scope: Deactivated successfully.
Sep 30 17:40:31 compute-1 podman[86575]: 2025-09-30 17:40:31.193516183 +0000 UTC m=+3.180770286 container died 907c1b2bd8aca577dd02a99d9d6ed5b4ff8ddcb811a64feed08e2856d7595a2e (image=quay.io/ceph/keepalived:2.2.4, name=kind_hofstadter, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=keepalived-container, version=2.2.4, release=1793, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Sep 30 17:40:31 compute-1 systemd[1]: var-lib-containers-storage-overlay-278d4fd96241368bb71336147cdab2685e8d3f809385e0413ce93b10309156c0-merged.mount: Deactivated successfully.
Sep 30 17:40:31 compute-1 podman[86575]: 2025-09-30 17:40:31.244505898 +0000 UTC m=+3.231760011 container remove 907c1b2bd8aca577dd02a99d9d6ed5b4ff8ddcb811a64feed08e2856d7595a2e (image=quay.io/ceph/keepalived:2.2.4, name=kind_hofstadter, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, architecture=x86_64, build-date=2023-02-22T09:23:20, name=keepalived, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=)
Sep 30 17:40:31 compute-1 systemd[1]: libpod-conmon-907c1b2bd8aca577dd02a99d9d6ed5b4ff8ddcb811a64feed08e2856d7595a2e.scope: Deactivated successfully.
Sep 30 17:40:31 compute-1 systemd[1]: Reloading.
Sep 30 17:40:31 compute-1 systemd-rc-local-generator[86718]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:40:31 compute-1 systemd-sysv-generator[86723]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:40:31 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Sep 30 17:40:31 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Sep 30 17:40:31 compute-1 systemd[1]: Reloading.
Sep 30 17:40:31 compute-1 systemd-rc-local-generator[86761]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:40:31 compute-1 systemd-sysv-generator[86764]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:40:32 compute-1 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-1.zmigik for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:40:32 compute-1 ceph-mon[75484]: pgmap v40: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:40:32 compute-1 ceph-mon[75484]: 6.2 scrub starts
Sep 30 17:40:32 compute-1 ceph-mon[75484]: 6.2 scrub ok
Sep 30 17:40:32 compute-1 ceph-mon[75484]: 7.9 scrub starts
Sep 30 17:40:32 compute-1 ceph-mon[75484]: 7.9 scrub ok
Sep 30 17:40:32 compute-1 podman[86816]: 2025-09-30 17:40:32.384797113 +0000 UTC m=+0.062308043 container create 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, version=2.2.4, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, release=1793, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph.)
Sep 30 17:40:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:32 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69d0001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0689de1d33035c36bbe321367dc7124dbc07d9cb78b1ea7ef508211510cc04c2/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:32 compute-1 podman[86816]: 2025-09-30 17:40:32.365343454 +0000 UTC m=+0.042854384 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Sep 30 17:40:32 compute-1 podman[86816]: 2025-09-30 17:40:32.475945568 +0000 UTC m=+0.153456548 container init 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph.)
Sep 30 17:40:32 compute-1 podman[86816]: 2025-09-30 17:40:32.487518482 +0000 UTC m=+0.165029422 container start 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, version=2.2.4, architecture=x86_64, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph.)
Sep 30 17:40:32 compute-1 bash[86816]: 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b
Sep 30 17:40:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:32 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b8001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:32 compute-1 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-1.zmigik for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:40:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:32 2025: Starting Keepalived v2.2.4 (08/21,2021)
Sep 30 17:40:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:32 2025: Running on Linux 5.14.0-617.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Sep 15 21:46:13 UTC 2025 (built for Linux 5.14.0)
Sep 30 17:40:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:32 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Sep 30 17:40:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:32 2025: Configuration file /etc/keepalived/keepalived.conf
Sep 30 17:40:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:32 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Sep 30 17:40:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:32 2025: Starting VRRP child process, pid=4
Sep 30 17:40:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:32 2025: Startup complete
Sep 30 17:40:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:32 2025: (VI_0) Entering BACKUP STATE (init)
Sep 30 17:40:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:32 2025: VRRP_Script(check_backend) succeeded
Sep 30 17:40:32 compute-1 sudo[86508]: pam_unix(sudo:session): session closed for user root
Sep 30 17:40:32 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.b scrub starts
Sep 30 17:40:32 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.b scrub ok
Sep 30 17:40:33 compute-1 ceph-mon[75484]: 7.1 deep-scrub starts
Sep 30 17:40:33 compute-1 ceph-mon[75484]: 7.1 deep-scrub ok
Sep 30 17:40:33 compute-1 ceph-mon[75484]: pgmap v41: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:40:33 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:33 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:33 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:33 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:33 compute-1 ceph-mon[75484]: 6.b scrub starts
Sep 30 17:40:33 compute-1 ceph-mon[75484]: Deploying daemon alertmanager.compute-0 on compute-0
Sep 30 17:40:33 compute-1 ceph-mon[75484]: 6.b scrub ok
Sep 30 17:40:33 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:33 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.b scrub starts
Sep 30 17:40:33 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.b scrub ok
Sep 30 17:40:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:40:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:34 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:34 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:34 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Sep 30 17:40:34 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Sep 30 17:40:34 compute-1 ceph-mon[75484]: 6.3 deep-scrub starts
Sep 30 17:40:34 compute-1 ceph-mon[75484]: 6.3 deep-scrub ok
Sep 30 17:40:34 compute-1 ceph-mon[75484]: 7.b scrub starts
Sep 30 17:40:34 compute-1 ceph-mon[75484]: 7.b scrub ok
Sep 30 17:40:35 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.a scrub starts
Sep 30 17:40:35 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.a scrub ok
Sep 30 17:40:35 compute-1 ceph-mon[75484]: 7.d scrub starts
Sep 30 17:40:35 compute-1 ceph-mon[75484]: 7.d scrub ok
Sep 30 17:40:35 compute-1 ceph-mon[75484]: pgmap v42: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:40:35 compute-1 ceph-mon[75484]: 7.14 scrub starts
Sep 30 17:40:35 compute-1 ceph-mon[75484]: 7.14 scrub ok
Sep 30 17:40:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:36 2025: (VI_0) Entering MASTER STATE
Sep 30 17:40:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:36 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Sep 30 17:40:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:36 2025: (VI_0) Entering BACKUP STATE
Sep 30 17:40:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:36 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69d0001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:36 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b8001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:36 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.14 scrub starts
Sep 30 17:40:36 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.14 scrub ok
Sep 30 17:40:36 compute-1 ceph-mon[75484]: 6.d scrub starts
Sep 30 17:40:36 compute-1 ceph-mon[75484]: 6.d scrub ok
Sep 30 17:40:36 compute-1 ceph-mon[75484]: 7.a scrub starts
Sep 30 17:40:36 compute-1 ceph-mon[75484]: 7.a scrub ok
Sep 30 17:40:36 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:36 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:36 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:36 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:36 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:36 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:36 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Sep 30 17:40:36 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:37 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.16 scrub starts
Sep 30 17:40:37 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.16 scrub ok
Sep 30 17:40:37 compute-1 ceph-mon[75484]: 7.c scrub starts
Sep 30 17:40:37 compute-1 ceph-mon[75484]: 7.c scrub ok
Sep 30 17:40:37 compute-1 ceph-mon[75484]: Regenerating cephadm self-signed grafana TLS certificates
Sep 30 17:40:37 compute-1 ceph-mon[75484]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Sep 30 17:40:37 compute-1 ceph-mon[75484]: Deploying daemon grafana.compute-0 on compute-0
Sep 30 17:40:37 compute-1 ceph-mon[75484]: pgmap v43: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:40:37 compute-1 ceph-mon[75484]: 6.14 scrub starts
Sep 30 17:40:37 compute-1 ceph-mon[75484]: 6.14 scrub ok
Sep 30 17:40:37 compute-1 ceph-mon[75484]: 6.e scrub starts
Sep 30 17:40:37 compute-1 ceph-mon[75484]: 6.e scrub ok
Sep 30 17:40:37 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:38 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:38 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:38 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.10 scrub starts
Sep 30 17:40:38 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.10 scrub ok
Sep 30 17:40:38 compute-1 ceph-mon[75484]: 6.16 scrub starts
Sep 30 17:40:38 compute-1 ceph-mon[75484]: 6.16 scrub ok
Sep 30 17:40:38 compute-1 ceph-mon[75484]: 6.19 scrub starts
Sep 30 17:40:38 compute-1 ceph-mon[75484]: 6.19 scrub ok
Sep 30 17:40:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:40:39 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.11 scrub starts
Sep 30 17:40:39 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.11 scrub ok
Sep 30 17:40:40 compute-1 ceph-mon[75484]: pgmap v44: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:40:40 compute-1 ceph-mon[75484]: 7.10 scrub starts
Sep 30 17:40:40 compute-1 ceph-mon[75484]: 7.10 scrub ok
Sep 30 17:40:40 compute-1 ceph-mon[75484]: 7.19 scrub starts
Sep 30 17:40:40 compute-1 ceph-mon[75484]: 7.19 scrub ok
Sep 30 17:40:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:40 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b8001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:40 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b8001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:40 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.10 scrub starts
Sep 30 17:40:40 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.10 scrub ok
Sep 30 17:40:41 compute-1 ceph-mon[75484]: 6.11 scrub starts
Sep 30 17:40:41 compute-1 ceph-mon[75484]: 6.11 scrub ok
Sep 30 17:40:41 compute-1 ceph-mon[75484]: 7.1a scrub starts
Sep 30 17:40:41 compute-1 ceph-mon[75484]: 7.1a scrub ok
Sep 30 17:40:41 compute-1 ceph-mon[75484]: 6.10 scrub starts
Sep 30 17:40:41 compute-1 ceph-mon[75484]: 6.10 scrub ok
Sep 30 17:40:41 compute-1 ceph-mon[75484]: 6.1a scrub starts
Sep 30 17:40:41 compute-1 ceph-mon[75484]: 6.1a scrub ok
Sep 30 17:40:41 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Sep 30 17:40:41 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Sep 30 17:40:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:42 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:42 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:42 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.13 scrub starts
Sep 30 17:40:42 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.13 scrub ok
Sep 30 17:40:43 compute-1 ceph-mon[75484]: pgmap v45: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:40:43 compute-1 ceph-mon[75484]: 6.13 scrub starts
Sep 30 17:40:43 compute-1 ceph-mon[75484]: 6.13 scrub ok
Sep 30 17:40:43 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Sep 30 17:40:43 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Sep 30 17:40:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e55 e55: 2 total, 2 up, 2 in
Sep 30 17:40:44 compute-1 ceph-mon[75484]: pgmap v46: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:40:44 compute-1 ceph-mon[75484]: 7.13 scrub starts
Sep 30 17:40:44 compute-1 ceph-mon[75484]: 7.13 scrub ok
Sep 30 17:40:44 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Sep 30 17:40:44 compute-1 ceph-mon[75484]: 7.1d scrub starts
Sep 30 17:40:44 compute-1 ceph-mon[75484]: 7.1d scrub ok
Sep 30 17:40:44 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Sep 30 17:40:44 compute-1 ceph-mon[75484]: osdmap e55: 2 total, 2 up, 2 in
Sep 30 17:40:44 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Sep 30 17:40:44 compute-1 sshd-session[86845]: Received disconnect from 194.107.115.65 port 16058:11: Bye Bye [preauth]
Sep 30 17:40:44 compute-1 sshd-session[86845]: Disconnected from authenticating user root 194.107.115.65 port 16058 [preauth]
Sep 30 17:40:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e55 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:40:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:44 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b8001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:44 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc0013a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:44 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Sep 30 17:40:44 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Sep 30 17:40:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e56 e56: 2 total, 2 up, 2 in
Sep 30 17:40:44 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 56 pg[8.0( v 54'26 (0'0,54'26] local-lis/les=43/44 n=4 ec=43/43 lis/c=43/43 les/c/f=44/44/0 sis=56 pruub=14.393365860s) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 lcod 54'25 mlcod 54'25 active pruub 156.568206787s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:44 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 56 pg[8.0( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=43/43 lis/c=43/43 les/c/f=44/44/0 sis=56 pruub=14.393365860s) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 lcod 54'25 mlcod 0'0 unknown pruub 156.568206787s@ mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1).collection(8.0_head 0x556f33b81b00) operator()   moving buffer(0x556f3390eb68 space 0x556f338e3a10 0x0~1000 clean)
Sep 30 17:40:45 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:40:45 compute-1 ceph-mon[75484]: 6.1d scrub starts
Sep 30 17:40:45 compute-1 ceph-mon[75484]: 6.1d scrub ok
Sep 30 17:40:45 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Sep 30 17:40:45 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:40:45 compute-1 ceph-mon[75484]: osdmap e56: 2 total, 2 up, 2 in
Sep 30 17:40:45 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Sep 30 17:40:45 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e57 e57: 2 total, 2 up, 2 in
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.10( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.14( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.16( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.17( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.15( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.11( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.2( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.3( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.f( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.8( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.9( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.a( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.e( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.d( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.c( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.b( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1( v 54'26 (0'0,54'26] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.7( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.6( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.5( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.4( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1b( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1a( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.19( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.18( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1e( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1f( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1d( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1c( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.13( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.12( v 54'26 lc 0'0 (0'0,54'26] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'25 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.10( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.17( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.16( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.11( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.14( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.15( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.2( v 54'26 (0'0,54'26] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.3( v 54'26 (0'0,54'26] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.f( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:45 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.8( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.9( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.e( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.a( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.d( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.c( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.b( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.0( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=43/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 54'25 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.5( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.6( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.7( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.4( v 54'26 (0'0,54'26] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1b( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1( v 54'26 (0'0,54'26] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.19( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1a( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.18( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.13( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1d( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1c( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1e( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.1f( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 57 pg[8.12( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [1] r=0 lpr=56 pi=[43,56)/1 crt=54'26 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:46 compute-1 ceph-mon[75484]: pgmap v48: 198 pgs: 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 204 B/s rd, 0 op/s
Sep 30 17:40:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Sep 30 17:40:46 compute-1 ceph-mon[75484]: osdmap e57: 2 total, 2 up, 2 in
Sep 30 17:40:46 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Sep 30 17:40:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:46 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69d00034e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:46 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:46 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.10 scrub starts
Sep 30 17:40:46 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.10 scrub ok
Sep 30 17:40:47 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e58 e58: 2 total, 2 up, 2 in
Sep 30 17:40:47 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 58 pg[9.0( v 47'12 (0'0,47'12] local-lis/les=46/47 n=6 ec=46/46 lis/c=46/46 les/c/f=47/47/0 sis=58 pruub=11.631673813s) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 47'11 mlcod 47'11 active pruub 155.989654541s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:47 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 58 pg[9.0( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=46/46 lis/c=46/46 les/c/f=47/47/0 sis=58 pruub=11.631673813s) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 47'11 mlcod 0'0 unknown pruub 155.989654541s@ mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:47 compute-1 ceph-mon[75484]: Deploying daemon haproxy.rgw.default.compute-0.gretil on compute-0
Sep 30 17:40:47 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:40:47 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:40:47 compute-1 ceph-mon[75484]: 8.10 scrub starts
Sep 30 17:40:47 compute-1 ceph-mon[75484]: 8.10 scrub ok
Sep 30 17:40:47 compute-1 sudo[86855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:40:47 compute-1 sudo[86855]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:40:47 compute-1 sudo[86855]: pam_unix(sudo:session): session closed for user root
Sep 30 17:40:47 compute-1 sudo[86880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/haproxy:2.3 --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:40:47 compute-1 sudo[86880]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:40:47 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.16 scrub starts
Sep 30 17:40:47 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.16 scrub ok
Sep 30 17:40:48 compute-1 podman[86944]: 2025-09-30 17:40:48.164782289 +0000 UTC m=+0.066614950 container create 66aa2b2b5e8720fee074927636bf77ef9df327a84e2f99706d316f24ba70603f (image=quay.io/ceph/haproxy:2.3, name=jovial_colden)
Sep 30 17:40:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e59 e59: 2 total, 2 up, 2 in
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.15( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.14( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.16( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.17( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.10( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.11( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.3( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=1 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.2( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=1 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.9( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.e( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.8( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.b( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.f( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.c( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.d( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.a( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1( v 47'12 (0'0,47'12] local-lis/les=46/47 n=1 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.6( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=1 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.4( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=1 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.5( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=1 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1a( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1b( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.18( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.7( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.19( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1e( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1f( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1c( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1d( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.12( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.15( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.13( v 47'12 lc 0'0 (0'0,47'12] local-lis/les=46/47 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.14( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-mon[75484]: pgmap v51: 229 pgs: 31 unknown, 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:40:48 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Sep 30 17:40:48 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:40:48 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:40:48 compute-1 ceph-mon[75484]: osdmap e58: 2 total, 2 up, 2 in
Sep 30 17:40:48 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Sep 30 17:40:48 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:48 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:48 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:48 compute-1 ceph-mon[75484]: Deploying daemon haproxy.rgw.default.compute-1.adkopy on compute-1
Sep 30 17:40:48 compute-1 ceph-mon[75484]: 8.16 scrub starts
Sep 30 17:40:48 compute-1 ceph-mon[75484]: 8.16 scrub ok
Sep 30 17:40:48 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.16( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.17( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.10( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.2( v 47'12 (0'0,47'12] local-lis/les=58/59 n=1 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.3( v 47'12 (0'0,47'12] local-lis/les=58/59 n=1 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.9( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.11( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.8( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.b( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.f( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.e( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.c( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.0( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=46/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 47'11 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1( v 47'12 (0'0,47'12] local-lis/les=58/59 n=1 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.d( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.a( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.6( v 47'12 (0'0,47'12] local-lis/les=58/59 n=1 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 systemd[1]: Started libpod-conmon-66aa2b2b5e8720fee074927636bf77ef9df327a84e2f99706d316f24ba70603f.scope.
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1a( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.5( v 47'12 (0'0,47'12] local-lis/les=58/59 n=1 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1b( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.18( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.7( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1e( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.4( v 47'12 (0'0,47'12] local-lis/les=58/59 n=1 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1f( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1c( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.19( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.12( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.1d( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 59 pg[9.13( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=46/46 les/c/f=47/47/0 sis=58) [1] r=0 lpr=58 pi=[46,58)/1 crt=47'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:48 compute-1 podman[86944]: 2025-09-30 17:40:48.138257989 +0000 UTC m=+0.040090680 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Sep 30 17:40:48 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:40:48 compute-1 podman[86944]: 2025-09-30 17:40:48.290129813 +0000 UTC m=+0.191962524 container init 66aa2b2b5e8720fee074927636bf77ef9df327a84e2f99706d316f24ba70603f (image=quay.io/ceph/haproxy:2.3, name=jovial_colden)
Sep 30 17:40:48 compute-1 podman[86944]: 2025-09-30 17:40:48.302996613 +0000 UTC m=+0.204829274 container start 66aa2b2b5e8720fee074927636bf77ef9df327a84e2f99706d316f24ba70603f (image=quay.io/ceph/haproxy:2.3, name=jovial_colden)
Sep 30 17:40:48 compute-1 podman[86944]: 2025-09-30 17:40:48.307018002 +0000 UTC m=+0.208850673 container attach 66aa2b2b5e8720fee074927636bf77ef9df327a84e2f99706d316f24ba70603f (image=quay.io/ceph/haproxy:2.3, name=jovial_colden)
Sep 30 17:40:48 compute-1 jovial_colden[86961]: 0 0
Sep 30 17:40:48 compute-1 podman[86944]: 2025-09-30 17:40:48.314061643 +0000 UTC m=+0.215894334 container died 66aa2b2b5e8720fee074927636bf77ef9df327a84e2f99706d316f24ba70603f (image=quay.io/ceph/haproxy:2.3, name=jovial_colden)
Sep 30 17:40:48 compute-1 systemd[1]: libpod-66aa2b2b5e8720fee074927636bf77ef9df327a84e2f99706d316f24ba70603f.scope: Deactivated successfully.
Sep 30 17:40:48 compute-1 systemd[1]: var-lib-containers-storage-overlay-57a9de9960dd5477da3213aa64d538f4bb8d86da97079b0f3940ce0ab763f0a8-merged.mount: Deactivated successfully.
Sep 30 17:40:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:40:48 compute-1 podman[86944]: 2025-09-30 17:40:48.367046982 +0000 UTC m=+0.268879643 container remove 66aa2b2b5e8720fee074927636bf77ef9df327a84e2f99706d316f24ba70603f (image=quay.io/ceph/haproxy:2.3, name=jovial_colden)
Sep 30 17:40:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.003000081s ======
Sep 30 17:40:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:40:48.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000081s
Sep 30 17:40:48 compute-1 systemd[1]: libpod-conmon-66aa2b2b5e8720fee074927636bf77ef9df327a84e2f99706d316f24ba70603f.scope: Deactivated successfully.
Sep 30 17:40:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:48 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b8003880 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:48 compute-1 systemd[1]: Reloading.
Sep 30 17:40:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:48 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc001eb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:48 compute-1 systemd-rc-local-generator[87005]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:40:48 compute-1 systemd-sysv-generator[87011]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:40:48 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Sep 30 17:40:48 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Sep 30 17:40:48 compute-1 systemd[1]: Reloading.
Sep 30 17:40:48 compute-1 systemd-sysv-generator[87052]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:40:48 compute-1 systemd-rc-local-generator[87049]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:40:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e59 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:40:49 compute-1 systemd[1]: Starting Ceph haproxy.rgw.default.compute-1.adkopy for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:40:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e60 e60: 2 total, 2 up, 2 in
Sep 30 17:40:49 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 60 pg[11.0( v 51'32 (0'0,51'32] local-lis/les=50/51 n=8 ec=50/50 lis/c=50/50 les/c/f=51/51/0 sis=60 pruub=13.660915375s) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 51'31 mlcod 51'31 active pruub 160.095306396s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:49 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 60 pg[11.0( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=50/50 lis/c=50/50 les/c/f=51/51/0 sis=60 pruub=13.660915375s) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 51'31 mlcod 0'0 unknown pruub 160.095306396s@ mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:49 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Sep 30 17:40:49 compute-1 ceph-mon[75484]: osdmap e59: 2 total, 2 up, 2 in
Sep 30 17:40:49 compute-1 ceph-mon[75484]: pgmap v54: 291 pgs: 93 unknown, 198 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:40:49 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:40:49 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Sep 30 17:40:49 compute-1 ceph-mon[75484]: 8.15 scrub starts
Sep 30 17:40:49 compute-1 ceph-mon[75484]: 8.15 scrub ok
Sep 30 17:40:49 compute-1 podman[87107]: 2025-09-30 17:40:49.540693593 +0000 UTC m=+0.078561324 container create afe5d2dbef44bab1a100330591f8ad5a5985fcc4c5446229feab1e470a0f5162 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-rgw-default-compute-1-adkopy)
Sep 30 17:40:49 compute-1 podman[87107]: 2025-09-30 17:40:49.509387353 +0000 UTC m=+0.047255124 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Sep 30 17:40:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0141ee041ed9855d4e93d017c9c65e570c912ec4a55697719098a6e9138215b7/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:49 compute-1 podman[87107]: 2025-09-30 17:40:49.637346068 +0000 UTC m=+0.175213859 container init afe5d2dbef44bab1a100330591f8ad5a5985fcc4c5446229feab1e470a0f5162 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-rgw-default-compute-1-adkopy)
Sep 30 17:40:49 compute-1 podman[87107]: 2025-09-30 17:40:49.64773487 +0000 UTC m=+0.185602601 container start afe5d2dbef44bab1a100330591f8ad5a5985fcc4c5446229feab1e470a0f5162 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-rgw-default-compute-1-adkopy)
Sep 30 17:40:49 compute-1 bash[87107]: afe5d2dbef44bab1a100330591f8ad5a5985fcc4c5446229feab1e470a0f5162
Sep 30 17:40:49 compute-1 systemd[1]: Started Ceph haproxy.rgw.default.compute-1.adkopy for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:40:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-rgw-default-compute-1-adkopy[87122]: [NOTICE] 272/174049 (2) : New worker #1 (4) forked
Sep 30 17:40:49 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.14 scrub starts
Sep 30 17:40:49 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.14 scrub ok
Sep 30 17:40:49 compute-1 sudo[86880]: pam_unix(sudo:session): session closed for user root
Sep 30 17:40:50 compute-1 ceph-mon[75484]: 10.15 scrub starts
Sep 30 17:40:50 compute-1 ceph-mon[75484]: 10.15 scrub ok
Sep 30 17:40:50 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:40:50 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Sep 30 17:40:50 compute-1 ceph-mon[75484]: osdmap e60: 2 total, 2 up, 2 in
Sep 30 17:40:50 compute-1 ceph-mon[75484]: 8.14 scrub starts
Sep 30 17:40:50 compute-1 ceph-mon[75484]: 8.14 scrub ok
Sep 30 17:40:50 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:50 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:50 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:50 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:50 compute-1 ceph-mon[75484]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Sep 30 17:40:50 compute-1 ceph-mon[75484]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Sep 30 17:40:50 compute-1 ceph-mon[75484]: Deploying daemon keepalived.rgw.default.compute-0.fjegxm on compute-0
Sep 30 17:40:50 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e61 e61: 2 total, 2 up, 2 in
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.17( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.16( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.15( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.14( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.13( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.12( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1( v 51'32 (0'0,51'32] local-lis/les=50/51 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.c( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.b( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.a( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.9( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.d( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.e( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.f( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.8( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.2( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.3( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.4( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.5( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.6( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.7( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.18( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.19( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1a( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1b( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1c( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1d( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1e( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1f( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.10( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.11( v 51'32 lc 0'0 (0'0,51'32] local-lis/les=50/51 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.17( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.16( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.15( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.12( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.13( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.14( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.0( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=50/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 51'31 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.c( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.b( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.a( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.9( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.d( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.e( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.f( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.8( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.2( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.3( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.4( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.7( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.5( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.18( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.19( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1a( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1c( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1b( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1d( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1e( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.1f( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.10( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.11( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 61 pg[11.6( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=50/50 les/c/f=51/51/0 sis=60) [1] r=0 lpr=60 pi=[50,60)/1 crt=51'32 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:40:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:40:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:40:50.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:40:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:50 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69d0003e00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:50 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:40:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:40:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:40:50.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:40:50 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.17 scrub starts
Sep 30 17:40:50 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.17 scrub ok
Sep 30 17:40:51 compute-1 ceph-mon[75484]: 10.16 scrub starts
Sep 30 17:40:51 compute-1 ceph-mon[75484]: 10.16 scrub ok
Sep 30 17:40:51 compute-1 ceph-mon[75484]: osdmap e61: 2 total, 2 up, 2 in
Sep 30 17:40:51 compute-1 ceph-mon[75484]: pgmap v57: 353 pgs: 62 unknown, 291 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:40:51 compute-1 ceph-mon[75484]: 8.17 scrub starts
Sep 30 17:40:51 compute-1 ceph-mon[75484]: 8.17 scrub ok
Sep 30 17:40:51 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Sep 30 17:40:51 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Sep 30 17:40:51 compute-1 sudo[87138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:40:51 compute-1 sudo[87138]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:40:51 compute-1 sudo[87138]: pam_unix(sudo:session): session closed for user root
Sep 30 17:40:51 compute-1 sudo[87163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/keepalived:2.2.4 --timeout 895 _orch deploy --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:40:51 compute-1 sudo[87163]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:40:52 compute-1 podman[87230]: 2025-09-30 17:40:52.347242747 +0000 UTC m=+0.067300119 container create d8de96772cfb8bb633de8fb4cd3b5413052734562b0c60200881e57b66c0e01b (image=quay.io/ceph/keepalived:2.2.4, name=stoic_tu, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, io.buildah.version=1.28.2, architecture=x86_64, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, com.redhat.component=keepalived-container, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, distribution-scope=public, vcs-type=git, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph.)
Sep 30 17:40:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:40:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:40:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:40:52.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:40:52 compute-1 systemd[1]: Started libpod-conmon-d8de96772cfb8bb633de8fb4cd3b5413052734562b0c60200881e57b66c0e01b.scope.
Sep 30 17:40:52 compute-1 podman[87230]: 2025-09-30 17:40:52.313786748 +0000 UTC m=+0.033844210 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Sep 30 17:40:52 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:40:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:52 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:52 compute-1 podman[87230]: 2025-09-30 17:40:52.473240068 +0000 UTC m=+0.193297510 container init d8de96772cfb8bb633de8fb4cd3b5413052734562b0c60200881e57b66c0e01b (image=quay.io/ceph/keepalived:2.2.4, name=stoic_tu, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, release=1793, vcs-type=git, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, description=keepalived for Ceph, vendor=Red Hat, Inc., name=keepalived, distribution-scope=public, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 17:40:52 compute-1 podman[87230]: 2025-09-30 17:40:52.486684283 +0000 UTC m=+0.206741635 container start d8de96772cfb8bb633de8fb4cd3b5413052734562b0c60200881e57b66c0e01b (image=quay.io/ceph/keepalived:2.2.4, name=stoic_tu, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, name=keepalived, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, release=1793, io.openshift.expose-services=)
Sep 30 17:40:52 compute-1 podman[87230]: 2025-09-30 17:40:52.490477866 +0000 UTC m=+0.210535318 container attach d8de96772cfb8bb633de8fb4cd3b5413052734562b0c60200881e57b66c0e01b (image=quay.io/ceph/keepalived:2.2.4, name=stoic_tu, architecture=x86_64, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, release=1793, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, vendor=Red Hat, Inc.)
Sep 30 17:40:52 compute-1 stoic_tu[87246]: 0 0
Sep 30 17:40:52 compute-1 systemd[1]: libpod-d8de96772cfb8bb633de8fb4cd3b5413052734562b0c60200881e57b66c0e01b.scope: Deactivated successfully.
Sep 30 17:40:52 compute-1 podman[87230]: 2025-09-30 17:40:52.497892078 +0000 UTC m=+0.217949450 container died d8de96772cfb8bb633de8fb4cd3b5413052734562b0c60200881e57b66c0e01b (image=quay.io/ceph/keepalived:2.2.4, name=stoic_tu, name=keepalived, vcs-type=git, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Sep 30 17:40:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:52 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:52 compute-1 systemd[1]: var-lib-containers-storage-overlay-332c2ad517ee69ec51f1287b7490a6bf5fdf4871eb97afb2a19b858ef0ed375b-merged.mount: Deactivated successfully.
Sep 30 17:40:52 compute-1 podman[87230]: 2025-09-30 17:40:52.551030881 +0000 UTC m=+0.271088283 container remove d8de96772cfb8bb633de8fb4cd3b5413052734562b0c60200881e57b66c0e01b (image=quay.io/ceph/keepalived:2.2.4, name=stoic_tu, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, name=keepalived, vendor=Red Hat, Inc., io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, distribution-scope=public, release=1793, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, architecture=x86_64)
Sep 30 17:40:52 compute-1 systemd[1]: libpod-conmon-d8de96772cfb8bb633de8fb4cd3b5413052734562b0c60200881e57b66c0e01b.scope: Deactivated successfully.
Sep 30 17:40:52 compute-1 ceph-mon[75484]: 10.14 scrub starts
Sep 30 17:40:52 compute-1 ceph-mon[75484]: 10.14 scrub ok
Sep 30 17:40:52 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:52 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:52 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:52 compute-1 ceph-mon[75484]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Sep 30 17:40:52 compute-1 ceph-mon[75484]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Sep 30 17:40:52 compute-1 ceph-mon[75484]: Deploying daemon keepalived.rgw.default.compute-1.wuqpyu on compute-1
Sep 30 17:40:52 compute-1 ceph-mon[75484]: 8.11 scrub starts
Sep 30 17:40:52 compute-1 ceph-mon[75484]: 8.11 scrub ok
Sep 30 17:40:52 compute-1 ceph-mon[75484]: 10.0 scrub starts
Sep 30 17:40:52 compute-1 systemd[1]: Reloading.
Sep 30 17:40:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:40:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:40:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:40:52.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:40:52 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.2 deep-scrub starts
Sep 30 17:40:52 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.2 deep-scrub ok
Sep 30 17:40:52 compute-1 systemd-sysv-generator[87298]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:40:52 compute-1 systemd-rc-local-generator[87295]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:40:53 compute-1 systemd[1]: Reloading.
Sep 30 17:40:53 compute-1 systemd-rc-local-generator[87335]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:40:53 compute-1 systemd-sysv-generator[87339]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:40:53 compute-1 systemd[1]: Starting Ceph keepalived.rgw.default.compute-1.wuqpyu for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:40:53 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.f scrub starts
Sep 30 17:40:53 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.f scrub ok
Sep 30 17:40:53 compute-1 podman[87393]: 2025-09-30 17:40:53.749131715 +0000 UTC m=+0.086079338 container create daa9b8ab4677447003f17c6e0368dd48c49c534f8de0cb4a8867500eb0f6c7db (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, distribution-scope=public, architecture=x86_64, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, io.buildah.version=1.28.2, vcs-type=git, build-date=2023-02-22T09:23:20, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph)
Sep 30 17:40:53 compute-1 podman[87393]: 2025-09-30 17:40:53.712455629 +0000 UTC m=+0.049403332 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Sep 30 17:40:53 compute-1 ceph-mon[75484]: 10.0 scrub ok
Sep 30 17:40:53 compute-1 ceph-mon[75484]: pgmap v58: 353 pgs: 62 unknown, 291 active+clean; 456 KiB data, 58 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:40:53 compute-1 ceph-mon[75484]: 8.2 deep-scrub starts
Sep 30 17:40:53 compute-1 ceph-mon[75484]: 8.2 deep-scrub ok
Sep 30 17:40:53 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:53 compute-1 ceph-mon[75484]: 10.2 scrub starts
Sep 30 17:40:53 compute-1 ceph-mon[75484]: 10.2 scrub ok
Sep 30 17:40:53 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c209a46fd01a980e20c610378c3e8bd19a84c9c1d46d85217562bb72c77c252a/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:40:53 compute-1 podman[87393]: 2025-09-30 17:40:53.838892203 +0000 UTC m=+0.175839886 container init daa9b8ab4677447003f17c6e0368dd48c49c534f8de0cb4a8867500eb0f6c7db (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, vcs-type=git, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, name=keepalived)
Sep 30 17:40:53 compute-1 podman[87393]: 2025-09-30 17:40:53.849068449 +0000 UTC m=+0.186016082 container start daa9b8ab4677447003f17c6e0368dd48c49c534f8de0cb4a8867500eb0f6c7db (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, version=2.2.4, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, release=1793, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 17:40:53 compute-1 bash[87393]: daa9b8ab4677447003f17c6e0368dd48c49c534f8de0cb4a8867500eb0f6c7db
Sep 30 17:40:53 compute-1 systemd[1]: Started Ceph keepalived.rgw.default.compute-1.wuqpyu for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:40:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:53 2025: Starting Keepalived v2.2.4 (08/21,2021)
Sep 30 17:40:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:53 2025: Running on Linux 5.14.0-617.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Sep 15 21:46:13 UTC 2025 (built for Linux 5.14.0)
Sep 30 17:40:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:53 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Sep 30 17:40:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:53 2025: Configuration file /etc/keepalived/keepalived.conf
Sep 30 17:40:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:53 2025: Failed to bind to process monitoring socket - errno 98 - Address already in use
Sep 30 17:40:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:53 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Sep 30 17:40:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:53 2025: Starting VRRP child process, pid=4
Sep 30 17:40:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:53 2025: Startup complete
Sep 30 17:40:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:53 2025: (VI_0) Entering BACKUP STATE (init)
Sep 30 17:40:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:53 2025: VRRP_Script(check_backend) succeeded
Sep 30 17:40:53 compute-1 sudo[87163]: pam_unix(sudo:session): session closed for user root
Sep 30 17:40:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:40:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:40:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:40:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:40:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:40:54.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:40:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:54 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc001eb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:54 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:40:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:40:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:40:54.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:40:54 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Sep 30 17:40:54 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Sep 30 17:40:54 compute-1 ceph-mon[75484]: 8.f scrub starts
Sep 30 17:40:54 compute-1 ceph-mon[75484]: 8.f scrub ok
Sep 30 17:40:54 compute-1 ceph-mon[75484]: 10.13 scrub starts
Sep 30 17:40:54 compute-1 ceph-mon[75484]: 10.13 scrub ok
Sep 30 17:40:54 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:54 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:54 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:54 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:54 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:40:54 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:40:54 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:40:54 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Sep 30 17:40:54 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:40:55 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e62 e62: 2 total, 2 up, 2 in
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.18( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.19( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.1c( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.1d( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.3( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.8( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.9( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.a( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.e( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.c( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.b( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.7( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.6( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.13( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.12( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[12.10( empty local-lis/les=0/0 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.17( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.267456055s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.511276245s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.17( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.267425537s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.511276245s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.15( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.170869827s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.414718628s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.15( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.170820236s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.414718628s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.16( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.267211914s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.511306763s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.16( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.267194748s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.511306763s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.15( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.957024574s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.201187134s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.15( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.957000732s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.201187134s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.14( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.956802368s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.201156616s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.14( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.956737518s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.201156616s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.17( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.174682617s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.419204712s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.17( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.174633026s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.419204712s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.14( v 61'40 (0'0,61'40] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.272684097s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=61'40 lcod 61'39 mlcod 61'39 active pruub 163.517303467s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.14( v 61'40 (0'0,61'40] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.272648811s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=61'40 lcod 61'39 mlcod 0'0 unknown NOTIFY pruub 163.517303467s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.16( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.174464226s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.419189453s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.16( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.174448967s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.419189453s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.17( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.956315041s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.201110840s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.13( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.272380829s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.517181396s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.13( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.272362709s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.517181396s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.17( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.956292152s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.201110840s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.10( v 59'28 (0'0,59'28] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.950414658s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 59'27 mlcod 59'27 active pruub 167.195297241s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.11( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.174563408s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.419464111s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.10( v 59'28 (0'0,59'28] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.950377464s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 59'27 mlcod 0'0 unknown NOTIFY pruub 167.195297241s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.11( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.174537659s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.419464111s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.11( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.956004143s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.201126099s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.10( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.174125671s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.419281006s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.10( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.174109459s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.419281006s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.11( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.955951691s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.201126099s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.12( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.272027969s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.517242432s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.12( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.272006989s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.517242432s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.1( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.272028923s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.517318726s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.1( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.272008896s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.517318726s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.2( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.955692291s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.201278687s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.2( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.955668449s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.201278687s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.3( v 47'12 (0'0,47'12] local-lis/les=58/59 n=1 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.173747063s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.419387817s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.3( v 47'12 (0'0,47'12] local-lis/les=58/59 n=1 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.173716545s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.419387817s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.e( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.173871040s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.419677734s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.e( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.173843384s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.419677734s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.8( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.955423355s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.201400757s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.9( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.955393791s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.201522827s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.8( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.955400467s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.201400757s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.9( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.955368042s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.201522827s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.f( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.173369408s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.419601440s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.f( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.173336029s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.419601440s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.e( v 61'40 (0'0,61'40] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.271255493s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=61'40 lcod 61'39 mlcod 61'39 active pruub 163.517791748s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.e( v 61'40 (0'0,61'40] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.271213531s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=61'40 lcod 61'39 mlcod 0'0 unknown NOTIFY pruub 163.517791748s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.d( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.954937935s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.201599121s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.f( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.271162033s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.517852783s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.f( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.271144867s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.517852783s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.d( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.173389435s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.420120239s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.d( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.173366547s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.420120239s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.d( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.954898834s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.201599121s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.3( v 61'40 (0'0,61'40] local-lis/les=60/61 n=1 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.270752907s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=61'40 lcod 61'39 mlcod 61'39 active pruub 163.517913818s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.3( v 61'40 (0'0,61'40] local-lis/les=60/61 n=1 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.270717621s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=61'40 lcod 61'39 mlcod 0'0 unknown NOTIFY pruub 163.517913818s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.4( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.270715714s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.517959595s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.4( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.270695686s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.517959595s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.7( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.178113937s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.425567627s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.a( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.172677040s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.420150757s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.7( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.178092957s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.425567627s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.6( v 47'12 (0'0,47'12] local-lis/les=58/59 n=1 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.172682762s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.420181274s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.6( v 47'12 (0'0,47'12] local-lis/les=58/59 n=1 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.172653198s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.420181274s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.5( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.270402908s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.518035889s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.5( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.954004288s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.201675415s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.5( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.270370483s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.518035889s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.5( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.953983307s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.201675415s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.7( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.270258904s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.518020630s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.7( v 51'32 (0'0,51'32] local-lis/les=60/61 n=1 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.270243645s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.518020630s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.a( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.172647476s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.420150757s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.4( v 54'26 (0'0,54'26] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.953965187s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.201858521s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.4( v 54'26 (0'0,54'26] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.953943253s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.201858521s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.1b( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.953859329s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.201843262s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.1b( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.953824043s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.201843262s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.1a( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.270043373s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.518173218s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.1a( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.270026207s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.518173218s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.18( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.177357674s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.425521851s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.19( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.953663826s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.201843262s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.18( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.177333832s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.425521851s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.19( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.953636169s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.201843262s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.1b( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.269912720s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.518249512s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.1b( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.269897461s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.518249512s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.18( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.957265854s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.205673218s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.18( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.957239151s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.205673218s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.1c( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.269719124s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.518173218s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.1c( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.269701004s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.518173218s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.1e( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.269575119s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.518325806s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.1e( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.269553185s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.518325806s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.1c( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.956781387s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.205673218s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.1c( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.956757545s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.205673218s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.12( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.177180290s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.426193237s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.12( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.177149773s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.426193237s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.13( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.177098274s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 active pruub 161.426239014s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.12( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.956576347s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 active pruub 167.205764771s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[8.12( v 54'26 (0'0,54'26] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=62 pruub=14.956560135s) [0] r=-1 lpr=62 pi=[56,62)/1 crt=54'26 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 167.205764771s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[9.13( v 47'12 (0'0,47'12] local-lis/les=58/59 n=0 ec=58/46 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=9.177038193s) [0] r=-1 lpr=62 pi=[58,62)/1 crt=47'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 161.426239014s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.1d( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.268619537s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 active pruub 163.518264771s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 62 pg[11.1d( v 51'32 (0'0,51'32] local-lis/les=60/61 n=0 ec=60/50 lis/c=60/60 les/c/f=61/61/0 sis=62 pruub=11.268586159s) [0] r=-1 lpr=62 pi=[60,62)/1 crt=51'32 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 163.518264771s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:40:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:40:55 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.14 scrub starts
Sep 30 17:40:55 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.14 scrub ok
Sep 30 17:40:55 compute-1 ceph-mon[75484]: Deploying daemon prometheus.compute-0 on compute-0
Sep 30 17:40:55 compute-1 ceph-mon[75484]: pgmap v59: 353 pgs: 353 active+clean; 456 KiB data, 76 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:40:55 compute-1 ceph-mon[75484]: 8.9 scrub starts
Sep 30 17:40:55 compute-1 ceph-mon[75484]: 8.9 scrub ok
Sep 30 17:40:55 compute-1 ceph-mon[75484]: 10.e scrub starts
Sep 30 17:40:55 compute-1 ceph-mon[75484]: 10.e scrub ok
Sep 30 17:40:55 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:40:55 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:40:55 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:40:55 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Sep 30 17:40:55 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:40:55 compute-1 ceph-mon[75484]: osdmap e62: 2 total, 2 up, 2 in
Sep 30 17:40:56 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e63 e63: 2 total, 2 up, 2 in
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.10( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.13( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.12( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.6( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.7( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.c( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.e( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.a( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.9( v 61'1 lc 0'0 (0'0,61'1] local-lis/les=62/63 n=1 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=61'1 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.8( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.3( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.b( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.1d( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.1c( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.19( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 63 pg[12.18( empty local-lis/les=62/63 n=0 ec=60/52 lis/c=60/60 les/c/f=61/61/0 sis=62) [1] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:40:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:40:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:40:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:40:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:40:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:40:56.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:40:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:56 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69d0003e00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:56 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0001b40 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:40:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:40:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:40:56.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:40:56 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.15 scrub starts
Sep 30 17:40:56 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.15 scrub ok
Sep 30 17:40:56 compute-1 ceph-mon[75484]: 9.14 scrub starts
Sep 30 17:40:56 compute-1 ceph-mon[75484]: 9.14 scrub ok
Sep 30 17:40:56 compute-1 ceph-mon[75484]: 10.17 scrub starts
Sep 30 17:40:56 compute-1 ceph-mon[75484]: 10.17 scrub ok
Sep 30 17:40:56 compute-1 ceph-mon[75484]: osdmap e63: 2 total, 2 up, 2 in
Sep 30 17:40:56 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Sep 30 17:40:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e64 e64: 2 total, 2 up, 2 in
Sep 30 17:40:57 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 64 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=64) [1] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:57 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 64 pg[10.1a( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=64) [1] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:57 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 64 pg[10.6( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=64) [1] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:57 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 64 pg[10.2( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=64) [1] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:57 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 64 pg[10.e( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=64) [1] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:57 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 64 pg[10.a( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=64) [1] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:57 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 64 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=64) [1] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:57 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 64 pg[10.16( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=64) [1] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:40:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:40:57 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Sep 30 17:40:57 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Sep 30 17:40:57 compute-1 ceph-mon[75484]: pgmap v62: 353 pgs: 353 active+clean; 456 KiB data, 76 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:40:57 compute-1 ceph-mon[75484]: 11.15 scrub starts
Sep 30 17:40:57 compute-1 ceph-mon[75484]: 11.15 scrub ok
Sep 30 17:40:57 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Sep 30 17:40:57 compute-1 ceph-mon[75484]: osdmap e64: 2 total, 2 up, 2 in
Sep 30 17:40:57 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e65 e65: 2 total, 2 up, 2 in
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.16( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.16( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.a( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.a( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.e( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.e( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.2( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.2( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.6( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.6( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.1a( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.1a( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 65 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=65) [1]/[0] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:40:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:40:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:40:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:40:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:40:58.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:40:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:58 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc001eb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:40:58 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:40:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:40:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:40:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:40:58.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:40:58 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.2 scrub starts
Sep 30 17:40:58 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.2 scrub ok
Sep 30 17:40:58 compute-1 ceph-mon[75484]: 8.3 scrub starts
Sep 30 17:40:58 compute-1 ceph-mon[75484]: 8.3 scrub ok
Sep 30 17:40:58 compute-1 ceph-mon[75484]: 9.e scrub starts
Sep 30 17:40:58 compute-1 ceph-mon[75484]: 9.e scrub ok
Sep 30 17:40:58 compute-1 ceph-mon[75484]: osdmap e65: 2 total, 2 up, 2 in
Sep 30 17:40:58 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Sep 30 17:40:58 compute-1 ceph-mon[75484]: 9.2 scrub starts
Sep 30 17:40:58 compute-1 ceph-mon[75484]: 9.2 scrub ok
Sep 30 17:40:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:40:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e66 e66: 2 total, 2 up, 2 in
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 66 pg[10.1f( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=66) [1] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 66 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=66) [1] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 66 pg[10.7( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=66) [1] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 66 pg[10.3( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=66) [1] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 66 pg[10.f( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=66) [1] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 66 pg[10.b( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=66) [1] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 66 pg[10.13( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=66) [1] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 66 pg[10.17( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=66) [1] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:40:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e67 e67: 2 total, 2 up, 2 in
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.17( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.17( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.13( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.13( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.b( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.b( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.f( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.f( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.3( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.3( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.7( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.7( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.1f( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:40:59 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 67 pg[10.1f( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=67) [1]/[0] r=-1 lpr=67 pi=[58,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:40:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:40:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:40:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:40:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:40:59 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.0 scrub starts
Sep 30 17:40:59 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.0 scrub ok
Sep 30 17:40:59 compute-1 ceph-mon[75484]: pgmap v65: 353 pgs: 353 active+clean; 456 KiB data, 76 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:40:59 compute-1 ceph-mon[75484]: 9.6 deep-scrub starts
Sep 30 17:40:59 compute-1 ceph-mon[75484]: 9.6 deep-scrub ok
Sep 30 17:40:59 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Sep 30 17:40:59 compute-1 ceph-mon[75484]: osdmap e66: 2 total, 2 up, 2 in
Sep 30 17:40:59 compute-1 ceph-mon[75484]: osdmap e67: 2 total, 2 up, 2 in
Sep 30 17:40:59 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:59 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:59 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' 
Sep 30 17:40:59 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Sep 30 17:40:59 compute-1 ceph-mon[75484]: 11.0 scrub starts
Sep 30 17:40:59 compute-1 ceph-mon[75484]: 11.0 scrub ok
Sep 30 17:41:00 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e68 e68: 2 total, 2 up, 2 in
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.16( v 54'774 (0'0,54'774] local-lis/les=0/0 n=4 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.16( v 54'774 (0'0,54'774] local-lis/les=0/0 n=4 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.12( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.12( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.a( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.a( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.e( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.e( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.2( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.2( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.6( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.6( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.1a( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.1a( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.1e( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:00 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 68 pg[10.1e( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:00.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:00 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69d0003e00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:00 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0001b40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:00 compute-1 ceph-mgr[75792]: mgr handle_mgr_map respawning because set of enabled modules changed!
Sep 30 17:41:00 compute-1 ceph-mgr[75792]: mgr respawn  e: '/usr/bin/ceph-mgr'
Sep 30 17:41:00 compute-1 ceph-mgr[75792]: mgr respawn  0: '/usr/bin/ceph-mgr'
Sep 30 17:41:00 compute-1 ceph-mgr[75792]: mgr respawn  1: '-n'
Sep 30 17:41:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:00.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:00 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.c scrub starts
Sep 30 17:41:00 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.c scrub ok
Sep 30 17:41:00 compute-1 sshd-session[83039]: Connection closed by 192.168.122.100 port 51802
Sep 30 17:41:00 compute-1 sshd-session[83019]: pam_unix(sshd:session): session closed for user ceph-admin
Sep 30 17:41:00 compute-1 systemd[1]: session-35.scope: Deactivated successfully.
Sep 30 17:41:00 compute-1 systemd[1]: session-35.scope: Consumed 26.475s CPU time.
Sep 30 17:41:00 compute-1 systemd-logind[789]: Session 35 logged out. Waiting for processes to exit.
Sep 30 17:41:00 compute-1 systemd-logind[789]: Removed session 35.
Sep 30 17:41:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: ignoring --setuser ceph since I am not root
Sep 30 17:41:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: ignoring --setgroup ceph since I am not root
Sep 30 17:41:00 compute-1 ceph-mgr[75792]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Sep 30 17:41:00 compute-1 ceph-mgr[75792]: pidfile_write: ignore empty --pid-file
Sep 30 17:41:00 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'alerts'
Sep 30 17:41:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:00.880+0000 7fbed4d39140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Sep 30 17:41:00 compute-1 ceph-mgr[75792]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Sep 30 17:41:00 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'balancer'
Sep 30 17:41:00 compute-1 ceph-mon[75484]: 8.1b scrub starts
Sep 30 17:41:00 compute-1 ceph-mon[75484]: 8.1b scrub ok
Sep 30 17:41:00 compute-1 ceph-mon[75484]: osdmap e68: 2 total, 2 up, 2 in
Sep 30 17:41:00 compute-1 ceph-mon[75484]: from='mgr.14372 192.168.122.100:0/240735010' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Sep 30 17:41:00 compute-1 ceph-mon[75484]: mgrmap e25: compute-0.efvthf(active, since 78s), standbys: compute-1.glbusf
Sep 30 17:41:00 compute-1 ceph-mon[75484]: 11.c scrub starts
Sep 30 17:41:00 compute-1 ceph-mon[75484]: 11.c scrub ok
Sep 30 17:41:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:00.958+0000 7fbed4d39140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Sep 30 17:41:00 compute-1 ceph-mgr[75792]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Sep 30 17:41:00 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'cephadm'
Sep 30 17:41:01 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e69 e69: 2 total, 2 up, 2 in
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.f( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.13( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.f( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.b( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.13( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.3( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.b( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.3( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.17( v 54'774 (0'0,54'774] local-lis/les=0/0 n=3 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.17( v 54'774 (0'0,54'774] local-lis/les=0/0 n=3 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.7( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.7( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.1b( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.1b( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.1f( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.1f( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.e( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.2( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.a( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.16( v 54'774 (0'0,54'774] local-lis/les=68/69 n=4 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.6( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.12( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.1a( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:01 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 69 pg[10.1e( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=65/58 les/c/f=66/59/0 sis=68) [1] r=0 lpr=68 pi=[58,68)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:01 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.b scrub starts
Sep 30 17:41:01 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.b scrub ok
Sep 30 17:41:01 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'crash'
Sep 30 17:41:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:01.781+0000 7fbed4d39140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Sep 30 17:41:01 compute-1 ceph-mgr[75792]: mgr[py] Module crash has missing NOTIFY_TYPES member
Sep 30 17:41:01 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'dashboard'
Sep 30 17:41:01 compute-1 ceph-mon[75484]: 8.18 deep-scrub starts
Sep 30 17:41:01 compute-1 ceph-mon[75484]: 8.18 deep-scrub ok
Sep 30 17:41:01 compute-1 ceph-mon[75484]: osdmap e69: 2 total, 2 up, 2 in
Sep 30 17:41:01 compute-1 ceph-mon[75484]: 11.b scrub starts
Sep 30 17:41:01 compute-1 ceph-mon[75484]: 11.b scrub ok
Sep 30 17:41:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e70 e70: 2 total, 2 up, 2 in
Sep 30 17:41:02 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 70 pg[10.1f( v 54'774 (0'0,54'774] local-lis/les=69/70 n=5 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:02 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 70 pg[10.1b( v 54'774 (0'0,54'774] local-lis/les=69/70 n=5 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:02 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 70 pg[10.3( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:02 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 70 pg[10.7( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:02 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 70 pg[10.f( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:02 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 70 pg[10.b( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:02 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 70 pg[10.13( v 54'774 (0'0,54'774] local-lis/les=69/70 n=5 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:02 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 70 pg[10.17( v 54'774 (0'0,54'774] local-lis/les=69/70 n=3 ec=58/48 lis/c=67/58 les/c/f=68/59/0 sis=69) [1] r=0 lpr=69 pi=[58,69)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:02 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'devicehealth'
Sep 30 17:41:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:02.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:02.419+0000 7fbed4d39140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Sep 30 17:41:02 compute-1 ceph-mgr[75792]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Sep 30 17:41:02 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'diskprediction_local'
Sep 30 17:41:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:02 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc003340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:02 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc003340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Sep 30 17:41:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Sep 30 17:41:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]:   from numpy import show_config as show_numpy_config
Sep 30 17:41:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:02.591+0000 7fbed4d39140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Sep 30 17:41:02 compute-1 ceph-mgr[75792]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Sep 30 17:41:02 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'influx'
Sep 30 17:41:02 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Sep 30 17:41:02 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Sep 30 17:41:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:02.657+0000 7fbed4d39140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Sep 30 17:41:02 compute-1 ceph-mgr[75792]: mgr[py] Module influx has missing NOTIFY_TYPES member
Sep 30 17:41:02 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'insights'
Sep 30 17:41:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:41:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:02.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:41:02 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'iostat'
Sep 30 17:41:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:02.782+0000 7fbed4d39140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Sep 30 17:41:02 compute-1 ceph-mgr[75792]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Sep 30 17:41:02 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'k8sevents'
Sep 30 17:41:02 compute-1 ceph-mon[75484]: 10.1 scrub starts
Sep 30 17:41:02 compute-1 ceph-mon[75484]: 10.1 scrub ok
Sep 30 17:41:02 compute-1 ceph-mon[75484]: osdmap e70: 2 total, 2 up, 2 in
Sep 30 17:41:02 compute-1 ceph-mon[75484]: 9.9 scrub starts
Sep 30 17:41:02 compute-1 ceph-mon[75484]: 9.9 scrub ok
Sep 30 17:41:02 compute-1 ceph-mon[75484]: 12.11 scrub starts
Sep 30 17:41:02 compute-1 ceph-mon[75484]: 12.11 scrub ok
Sep 30 17:41:03 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'localpool'
Sep 30 17:41:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:03 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'mds_autoscaler'
Sep 30 17:41:03 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'mirroring'
Sep 30 17:41:03 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'nfs'
Sep 30 17:41:03 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.a scrub starts
Sep 30 17:41:03 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.a scrub ok
Sep 30 17:41:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:03.702+0000 7fbed4d39140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Sep 30 17:41:03 compute-1 ceph-mgr[75792]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Sep 30 17:41:03 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'orchestrator'
Sep 30 17:41:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:03.906+0000 7fbed4d39140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Sep 30 17:41:03 compute-1 ceph-mgr[75792]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Sep 30 17:41:03 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'osd_perf_query'
Sep 30 17:41:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:03.989+0000 7fbed4d39140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Sep 30 17:41:03 compute-1 ceph-mgr[75792]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Sep 30 17:41:03 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'osd_support'
Sep 30 17:41:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:04.057+0000 7fbed4d39140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Sep 30 17:41:04 compute-1 ceph-mgr[75792]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Sep 30 17:41:04 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'pg_autoscaler'
Sep 30 17:41:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e70 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:41:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:04.138+0000 7fbed4d39140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Sep 30 17:41:04 compute-1 ceph-mgr[75792]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Sep 30 17:41:04 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'progress'
Sep 30 17:41:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:04.211+0000 7fbed4d39140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Sep 30 17:41:04 compute-1 ceph-mgr[75792]: mgr[py] Module progress has missing NOTIFY_TYPES member
Sep 30 17:41:04 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'prometheus'
Sep 30 17:41:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:04.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:04 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:04 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0001b40 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:04.559+0000 7fbed4d39140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Sep 30 17:41:04 compute-1 ceph-mgr[75792]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Sep 30 17:41:04 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rbd_support'
Sep 30 17:41:04 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.8 deep-scrub starts
Sep 30 17:41:04 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.8 deep-scrub ok
Sep 30 17:41:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:04.653+0000 7fbed4d39140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Sep 30 17:41:04 compute-1 ceph-mgr[75792]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Sep 30 17:41:04 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'restful'
Sep 30 17:41:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:41:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:04.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:41:04 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rgw'
Sep 30 17:41:04 compute-1 ceph-mon[75484]: 11.a scrub starts
Sep 30 17:41:04 compute-1 ceph-mon[75484]: 11.a scrub ok
Sep 30 17:41:04 compute-1 ceph-mon[75484]: 12.15 deep-scrub starts
Sep 30 17:41:04 compute-1 ceph-mon[75484]: 12.15 deep-scrub ok
Sep 30 17:41:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:05.145+0000 7fbed4d39140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Sep 30 17:41:05 compute-1 ceph-mgr[75792]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Sep 30 17:41:05 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'rook'
Sep 30 17:41:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:05 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.9 scrub starts
Sep 30 17:41:05 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.9 scrub ok
Sep 30 17:41:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:05.763+0000 7fbed4d39140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Sep 30 17:41:05 compute-1 ceph-mgr[75792]: mgr[py] Module rook has missing NOTIFY_TYPES member
Sep 30 17:41:05 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'selftest'
Sep 30 17:41:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:05.834+0000 7fbed4d39140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Sep 30 17:41:05 compute-1 ceph-mgr[75792]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Sep 30 17:41:05 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'snap_schedule'
Sep 30 17:41:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:05.912+0000 7fbed4d39140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Sep 30 17:41:05 compute-1 ceph-mgr[75792]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Sep 30 17:41:05 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'stats'
Sep 30 17:41:05 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'status'
Sep 30 17:41:06 compute-1 ceph-mon[75484]: 9.8 deep-scrub starts
Sep 30 17:41:06 compute-1 ceph-mon[75484]: 9.8 deep-scrub ok
Sep 30 17:41:06 compute-1 ceph-mon[75484]: 12.4 scrub starts
Sep 30 17:41:06 compute-1 ceph-mon[75484]: 12.4 scrub ok
Sep 30 17:41:06 compute-1 ceph-mon[75484]: 11.9 scrub starts
Sep 30 17:41:06 compute-1 ceph-mon[75484]: 11.9 scrub ok
Sep 30 17:41:06 compute-1 ceph-mon[75484]: 12.f scrub starts
Sep 30 17:41:06 compute-1 ceph-mon[75484]: 12.f scrub ok
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:06.064+0000 7fbed4d39140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: mgr[py] Module status has missing NOTIFY_TYPES member
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'telegraf'
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:06.138+0000 7fbed4d39140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'telemetry'
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:06.296+0000 7fbed4d39140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'test_orchestrator'
Sep 30 17:41:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:06.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:06 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc003340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:06 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc003340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:06.534+0000 7fbed4d39140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'volumes'
Sep 30 17:41:06 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.a scrub starts
Sep 30 17:41:06 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.a scrub ok
Sep 30 17:41:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:41:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:06.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:06.804+0000 7fbed4d39140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: mgr[py] Loading python module 'zabbix'
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 2025-09-30T17:41:06.875+0000 7fbed4d39140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: mgr load Constructed class from module: dashboard
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: ms_deliver_dispatch: unhandled message 0x560815936340 mon_map magic: 0 from mon.1 v2:192.168.122.101:3300/0
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: mgr load Constructed class from module: prometheus
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: [dashboard INFO root] Configured CherryPy, starting engine...
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: [dashboard INFO root] Starting engine...
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: [prometheus INFO root] server_addr: :: server_port: 9283
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: [prometheus INFO root] Starting engine...
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: [30/Sep/2025:17:41:06] ENGINE Bus STARTING
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: [prometheus INFO cherrypy.error] [30/Sep/2025:17:41:06] ENGINE Bus STARTING
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: CherryPy Checker:
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: The Application mounted at '' has an empty config.
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: 
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: [dashboard INFO root] Engine started...
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: [30/Sep/2025:17:41:06] ENGINE Serving on http://:::9283
Sep 30 17:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mgr-compute-1-glbusf[75788]: [30/Sep/2025:17:41:06] ENGINE Bus STARTED
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: [prometheus INFO cherrypy.error] [30/Sep/2025:17:41:06] ENGINE Serving on http://:::9283
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: [prometheus INFO cherrypy.error] [30/Sep/2025:17:41:06] ENGINE Bus STARTED
Sep 30 17:41:06 compute-1 ceph-mgr[75792]: [prometheus INFO root] Engine started.
Sep 30 17:41:07 compute-1 ceph-mon[75484]: 8.a scrub starts
Sep 30 17:41:07 compute-1 ceph-mon[75484]: 8.a scrub ok
Sep 30 17:41:07 compute-1 ceph-mon[75484]: 12.d scrub starts
Sep 30 17:41:07 compute-1 ceph-mon[75484]: 12.d scrub ok
Sep 30 17:41:07 compute-1 ceph-mon[75484]: Standby manager daemon compute-1.glbusf restarted
Sep 30 17:41:07 compute-1 ceph-mon[75484]: Standby manager daemon compute-1.glbusf started
Sep 30 17:41:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e71 e71: 2 total, 2 up, 2 in
Sep 30 17:41:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:07 compute-1 sshd-session[87484]: Accepted publickey for ceph-admin from 192.168.122.100 port 35794 ssh2: RSA SHA256:VErvvXRx5E6TZRj2L+dQwgZehzW+L2wAETKKYOgEi0M
Sep 30 17:41:07 compute-1 systemd-logind[789]: New session 37 of user ceph-admin.
Sep 30 17:41:07 compute-1 systemd[1]: Started Session 37 of User ceph-admin.
Sep 30 17:41:07 compute-1 sshd-session[87484]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Sep 30 17:41:07 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.b scrub starts
Sep 30 17:41:07 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.b scrub ok
Sep 30 17:41:07 compute-1 sudo[87488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:41:07 compute-1 sudo[87488]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:07 compute-1 sudo[87488]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:07 compute-1 sudo[87513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 17:41:07 compute-1 sudo[87513]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:08 compute-1 ceph-mon[75484]: mgrmap e26: compute-0.efvthf(active, since 84s), standbys: compute-1.glbusf
Sep 30 17:41:08 compute-1 ceph-mon[75484]: Active manager daemon compute-0.efvthf restarted
Sep 30 17:41:08 compute-1 ceph-mon[75484]: Activating manager daemon compute-0.efvthf
Sep 30 17:41:08 compute-1 ceph-mon[75484]: osdmap e71: 2 total, 2 up, 2 in
Sep 30 17:41:08 compute-1 ceph-mon[75484]: mgrmap e27: compute-0.efvthf(active, starting, since 0.030113s), standbys: compute-1.glbusf
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.vrwlru"}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.wibdub"}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mgr metadata", "who": "compute-0.efvthf", "id": "compute-0.efvthf"}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mgr metadata", "who": "compute-1.glbusf", "id": "compute-1.glbusf"}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mds metadata"}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd metadata"}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "mon metadata"}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: Manager daemon compute-0.efvthf is now available
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.efvthf/mirror_snapshot_schedule"}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.efvthf/trash_purge_schedule"}]: dispatch
Sep 30 17:41:08 compute-1 ceph-mon[75484]: 9.b scrub starts
Sep 30 17:41:08 compute-1 ceph-mon[75484]: 9.b scrub ok
Sep 30 17:41:08 compute-1 ceph-mon[75484]: 12.5 scrub starts
Sep 30 17:41:08 compute-1 ceph-mon[75484]: 12.5 scrub ok
Sep 30 17:41:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:08.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:08 compute-1 podman[87611]: 2025-09-30 17:41:08.463696397 +0000 UTC m=+0.077647450 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 17:41:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:08 compute-1 podman[87611]: 2025-09-30 17:41:08.56214654 +0000 UTC m=+0.176097543 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Sep 30 17:41:08 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.d scrub starts
Sep 30 17:41:08 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.d scrub ok
Sep 30 17:41:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:41:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:08.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:41:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e71 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:41:09 compute-1 ceph-mon[75484]: mgrmap e28: compute-0.efvthf(active, since 1.04459s), standbys: compute-1.glbusf
Sep 30 17:41:09 compute-1 ceph-mon[75484]: [30/Sep/2025:17:41:08] ENGINE Bus STARTING
Sep 30 17:41:09 compute-1 ceph-mon[75484]: [30/Sep/2025:17:41:08] ENGINE Serving on https://192.168.122.100:7150
Sep 30 17:41:09 compute-1 ceph-mon[75484]: [30/Sep/2025:17:41:08] ENGINE Client ('192.168.122.100', 33390) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Sep 30 17:41:09 compute-1 ceph-mon[75484]: [30/Sep/2025:17:41:08] ENGINE Serving on http://192.168.122.100:8765
Sep 30 17:41:09 compute-1 ceph-mon[75484]: [30/Sep/2025:17:41:08] ENGINE Bus STARTED
Sep 30 17:41:09 compute-1 ceph-mon[75484]: 11.d scrub starts
Sep 30 17:41:09 compute-1 ceph-mon[75484]: 11.d scrub ok
Sep 30 17:41:09 compute-1 ceph-mon[75484]: 12.2 scrub starts
Sep 30 17:41:09 compute-1 ceph-mon[75484]: 12.2 scrub ok
Sep 30 17:41:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:09 compute-1 podman[87732]: 2025-09-30 17:41:09.202597342 +0000 UTC m=+0.079886250 container exec 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 17:41:09 compute-1 podman[87732]: 2025-09-30 17:41:09.212482421 +0000 UTC m=+0.089771329 container exec_died 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 17:41:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e72 e72: 2 total, 2 up, 2 in
Sep 30 17:41:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 72 pg[10.1c( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=72) [1] r=0 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 72 pg[10.4( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=72) [1] r=0 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 72 pg[10.c( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=72) [1] r=0 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 72 pg[10.14( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=72) [1] r=0 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e73 e73: 2 total, 2 up, 2 in
Sep 30 17:41:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 73 pg[10.14( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=73) [1]/[0] r=-1 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 73 pg[10.14( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=73) [1]/[0] r=-1 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 73 pg[10.c( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=73) [1]/[0] r=-1 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 73 pg[10.c( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=73) [1]/[0] r=-1 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 73 pg[10.4( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=73) [1]/[0] r=-1 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 73 pg[10.4( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=73) [1]/[0] r=-1 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 73 pg[10.1c( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=73) [1]/[0] r=-1 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 73 pg[10.1c( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=73) [1]/[0] r=-1 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:09 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.e scrub starts
Sep 30 17:41:09 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.e scrub ok
Sep 30 17:41:09 compute-1 podman[87824]: 2025-09-30 17:41:09.672539164 +0000 UTC m=+0.085105342 container exec a2c50f357f6fb33c1e67fb0f3db8a2d1059e8e7cde2694b4c0415d513f83b70b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 17:41:09 compute-1 podman[87824]: 2025-09-30 17:41:09.681808976 +0000 UTC m=+0.094375104 container exec_died a2c50f357f6fb33c1e67fb0f3db8a2d1059e8e7cde2694b4c0415d513f83b70b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:41:10 compute-1 podman[87892]: 2025-09-30 17:41:10.001476357 +0000 UTC m=+0.084660260 container exec 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 17:41:10 compute-1 podman[87892]: 2025-09-30 17:41:10.018135059 +0000 UTC m=+0.101318902 container exec_died 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 17:41:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:10 compute-1 ceph-mon[75484]: pgmap v4: 353 pgs: 353 active+clean; 458 KiB data, 98 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:41:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Sep 30 17:41:10 compute-1 ceph-mon[75484]: mgrmap e29: compute-0.efvthf(active, since 2s), standbys: compute-1.glbusf
Sep 30 17:41:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Sep 30 17:41:10 compute-1 ceph-mon[75484]: osdmap e72: 2 total, 2 up, 2 in
Sep 30 17:41:10 compute-1 ceph-mon[75484]: osdmap e73: 2 total, 2 up, 2 in
Sep 30 17:41:10 compute-1 ceph-mon[75484]: 8.e scrub starts
Sep 30 17:41:10 compute-1 ceph-mon[75484]: 8.e scrub ok
Sep 30 17:41:10 compute-1 ceph-mon[75484]: 12.0 scrub starts
Sep 30 17:41:10 compute-1 ceph-mon[75484]: 12.0 scrub ok
Sep 30 17:41:10 compute-1 podman[87958]: 2025-09-30 17:41:10.361821482 +0000 UTC m=+0.080930318 container exec 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, io.buildah.version=1.28.2, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=keepalived for Ceph, release=1793)
Sep 30 17:41:10 compute-1 podman[87958]: 2025-09-30 17:41:10.385135015 +0000 UTC m=+0.104243861 container exec_died 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, com.redhat.component=keepalived-container, release=1793, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, architecture=x86_64, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=2.2.4)
Sep 30 17:41:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:10.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:10 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc003340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:10 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc003340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e74 e74: 2 total, 2 up, 2 in
Sep 30 17:41:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:41:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:10.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:41:10 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.c scrub starts
Sep 30 17:41:10 compute-1 sudo[87513]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:10 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.c scrub ok
Sep 30 17:41:10 compute-1 sudo[88029]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:41:10 compute-1 sudo[88029]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:10 compute-1 sudo[88029]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:10 compute-1 sudo[88054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:41:10 compute-1 sudo[88054]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:11 compute-1 sudo[88054]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:11 compute-1 sudo[88110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:41:11 compute-1 sudo[88110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:11 compute-1 sudo[88110]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:11 compute-1 ceph-mon[75484]: osdmap e74: 2 total, 2 up, 2 in
Sep 30 17:41:11 compute-1 ceph-mon[75484]: 9.c scrub starts
Sep 30 17:41:11 compute-1 ceph-mon[75484]: 9.c scrub ok
Sep 30 17:41:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Sep 30 17:41:11 compute-1 sudo[88135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Sep 30 17:41:11 compute-1 sudo[88135]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:11 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.c scrub starts
Sep 30 17:41:11 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.c scrub ok
Sep 30 17:41:11 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e75 e75: 2 total, 2 up, 2 in
Sep 30 17:41:11 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 75 pg[10.14( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=73/58 les/c/f=74/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:11 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 75 pg[10.14( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=73/58 les/c/f=74/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:11 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 75 pg[10.c( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=73/58 les/c/f=74/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:11 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 75 pg[10.c( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=73/58 les/c/f=74/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:11 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 75 pg[10.4( v 74'784 (0'0,74'784] local-lis/les=0/0 n=6 ec=58/48 lis/c=73/58 les/c/f=74/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 luod=0'0 crt=61'780 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:11 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 75 pg[10.4( v 74'784 (0'0,74'784] local-lis/les=0/0 n=6 ec=58/48 lis/c=73/58 les/c/f=74/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 crt=61'780 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:11 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 75 pg[10.1c( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=73/58 les/c/f=74/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:11 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 75 pg[10.1c( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=73/58 les/c/f=74/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:11 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 75 pg[10.15( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:11 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 75 pg[10.d( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:11 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 75 pg[10.5( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:11 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 75 pg[10.1d( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:12 compute-1 sudo[88135]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:12 compute-1 sudo[88179]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Sep 30 17:41:12 compute-1 sudo[88179]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:12 compute-1 sudo[88179]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:12 compute-1 sudo[88204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph
Sep 30 17:41:12 compute-1 sudo[88204]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:12 compute-1 sudo[88204]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:12 compute-1 sudo[88229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:41:12 compute-1 sudo[88229]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:12 compute-1 sudo[88229]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:12 compute-1 sudo[88254]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:41:12 compute-1 sudo[88254]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:12 compute-1 sudo[88254]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:41:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:12.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:41:12 compute-1 sudo[88279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:41:12 compute-1 sudo[88279]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:12 compute-1 sudo[88279]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:12 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:12 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:12 compute-1 sudo[88328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:41:12 compute-1 sudo[88328]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:12 compute-1 sudo[88328]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:12 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Sep 30 17:41:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:41:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:12.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:41:12 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Sep 30 17:41:12 compute-1 sudo[88353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new
Sep 30 17:41:12 compute-1 sudo[88353]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:12 compute-1 sudo[88353]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:12 compute-1 ceph-mon[75484]: pgmap v8: 353 pgs: 353 active+clean; 458 KiB data, 98 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:41:12 compute-1 ceph-mon[75484]: 8.c scrub starts
Sep 30 17:41:12 compute-1 ceph-mon[75484]: 8.c scrub ok
Sep 30 17:41:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Sep 30 17:41:12 compute-1 ceph-mon[75484]: osdmap e75: 2 total, 2 up, 2 in
Sep 30 17:41:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Sep 30 17:41:12 compute-1 ceph-mon[75484]: mgrmap e30: compute-0.efvthf(active, since 4s), standbys: compute-1.glbusf
Sep 30 17:41:12 compute-1 ceph-mon[75484]: 10.9 scrub starts
Sep 30 17:41:12 compute-1 ceph-mon[75484]: 10.9 scrub ok
Sep 30 17:41:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Sep 30 17:41:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:41:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:41:12 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e76 e76: 2 total, 2 up, 2 in
Sep 30 17:41:12 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 76 pg[10.15( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=76) [1]/[0] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:12 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 76 pg[10.15( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=76) [1]/[0] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:12 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 76 pg[10.d( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=76) [1]/[0] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:12 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 76 pg[10.d( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=76) [1]/[0] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:12 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 76 pg[10.5( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=76) [1]/[0] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:12 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 76 pg[10.5( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=76) [1]/[0] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:12 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 76 pg[10.1d( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=76) [1]/[0] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:12 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 76 pg[10.1d( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=76) [1]/[0] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:12 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 76 pg[10.14( v 54'774 (0'0,54'774] local-lis/les=75/76 n=5 ec=58/48 lis/c=73/58 les/c/f=74/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:12 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 76 pg[10.4( v 74'784 (0'0,74'784] local-lis/les=75/76 n=6 ec=58/48 lis/c=73/58 les/c/f=74/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 crt=74'784 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:12 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 76 pg[10.c( v 54'774 (0'0,54'774] local-lis/les=75/76 n=6 ec=58/48 lis/c=73/58 les/c/f=74/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:12 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 76 pg[10.1c( v 54'774 (0'0,54'774] local-lis/les=75/76 n=5 ec=58/48 lis/c=73/58 les/c/f=74/59/0 sis=75) [1] r=0 lpr=75 pi=[58,75)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:12 compute-1 sudo[88378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Sep 30 17:41:12 compute-1 sudo[88378]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:12 compute-1 sudo[88378]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:12 compute-1 sudo[88403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:41:12 compute-1 sudo[88403]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:12 compute-1 sudo[88403]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:12 compute-1 sudo[88428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:41:12 compute-1 sudo[88428]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:13 compute-1 sudo[88428]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:13 compute-1 sudo[88453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:41:13 compute-1 sudo[88453]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:13 compute-1 sudo[88453]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:13 compute-1 sudo[88478]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:41:13 compute-1 sudo[88478]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:13 compute-1 sudo[88478]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:13 compute-1 sudo[88503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:41:13 compute-1 sudo[88503]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:13 compute-1 sudo[88503]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:13 compute-1 sudo[88551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:41:13 compute-1 sudo[88551]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:13 compute-1 sudo[88551]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:13 compute-1 sudo[88576]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new
Sep 30 17:41:13 compute-1 sudo[88576]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:13 compute-1 sudo[88576]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:13 compute-1 sudo[88601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf.new /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:41:13 compute-1 sudo[88601]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:13 compute-1 sudo[88601]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:13 compute-1 sudo[88626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Sep 30 17:41:13 compute-1 sudo[88626]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:13 compute-1 sudo[88626]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:13 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.b scrub starts
Sep 30 17:41:13 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.b scrub ok
Sep 30 17:41:13 compute-1 sudo[88651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph
Sep 30 17:41:13 compute-1 sudo[88651]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:13 compute-1 sudo[88651]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:13 compute-1 ceph-mon[75484]: Updating compute-0:/etc/ceph/ceph.conf
Sep 30 17:41:13 compute-1 ceph-mon[75484]: Updating compute-1:/etc/ceph/ceph.conf
Sep 30 17:41:13 compute-1 ceph-mon[75484]: Updating compute-0:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:41:13 compute-1 ceph-mon[75484]: 11.8 scrub starts
Sep 30 17:41:13 compute-1 ceph-mon[75484]: 11.8 scrub ok
Sep 30 17:41:13 compute-1 ceph-mon[75484]: osdmap e76: 2 total, 2 up, 2 in
Sep 30 17:41:13 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Sep 30 17:41:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e77 e77: 2 total, 2 up, 2 in
Sep 30 17:41:13 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 77 pg[10.16( v 54'774 (0'0,54'774] local-lis/les=68/69 n=4 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=11.387311935s) [0] r=-1 lpr=77 pi=[68,77)/1 crt=54'774 mlcod 0'0 active pruub 182.395050049s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:13 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 77 pg[10.16( v 54'774 (0'0,54'774] local-lis/les=68/69 n=4 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=11.387266159s) [0] r=-1 lpr=77 pi=[68,77)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 182.395050049s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:13 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 77 pg[10.e( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=11.383290291s) [0] r=-1 lpr=77 pi=[68,77)/1 crt=54'774 mlcod 0'0 active pruub 182.392745972s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:13 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 77 pg[10.e( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=11.383265495s) [0] r=-1 lpr=77 pi=[68,77)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 182.392745972s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:13 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 77 pg[10.6( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=11.384777069s) [0] r=-1 lpr=77 pi=[68,77)/1 crt=54'774 mlcod 0'0 active pruub 182.395141602s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:13 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 77 pg[10.6( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=11.384737015s) [0] r=-1 lpr=77 pi=[68,77)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 182.395141602s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:13 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 77 pg[10.1e( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=11.384192467s) [0] r=-1 lpr=77 pi=[68,77)/1 crt=54'774 mlcod 0'0 active pruub 182.395416260s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:13 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 77 pg[10.1e( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=11.383933067s) [0] r=-1 lpr=77 pi=[68,77)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 182.395416260s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:13 compute-1 sudo[88676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:41:13 compute-1 sudo[88676]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:13 compute-1 sudo[88676]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:13 compute-1 sudo[88701]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:41:13 compute-1 sudo[88701]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:13 compute-1 sudo[88701]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:14 compute-1 sudo[88727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:41:14 compute-1 sudo[88727]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:14 compute-1 sudo[88727]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:41:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:14 compute-1 sudo[88775]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:41:14 compute-1 sudo[88775]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:14 compute-1 sudo[88775]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:14 compute-1 sudo[88800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new
Sep 30 17:41:14 compute-1 sudo[88800]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:14 compute-1 sudo[88800]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:14 compute-1 sudo[88825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Sep 30 17:41:14 compute-1 sudo[88825]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:14 compute-1 sudo[88825]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:14.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:14 compute-1 sudo[88850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:41:14 compute-1 sudo[88850]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:14 compute-1 sudo[88850]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:14 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:14 compute-1 sudo[88875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config
Sep 30 17:41:14 compute-1 sudo[88875]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:14 compute-1 sudo[88875]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:14 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:14 compute-1 sudo[88902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:41:14 compute-1 sudo[88902]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:14 compute-1 sudo[88902]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:14 compute-1 sudo[88928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b
Sep 30 17:41:14 compute-1 sudo[88928]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:14 compute-1 sudo[88928]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e78 e78: 2 total, 2 up, 2 in
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.6( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=78) [0]/[1] r=0 lpr=78 pi=[68,78)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.1e( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=78) [0]/[1] r=0 lpr=78 pi=[68,78)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.1e( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=78) [0]/[1] r=0 lpr=78 pi=[68,78)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.6( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=78) [0]/[1] r=0 lpr=78 pi=[68,78)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.1d( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=76/58 les/c/f=77/59/0 sis=78) [1] r=0 lpr=78 pi=[58,78)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.1d( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=76/58 les/c/f=77/59/0 sis=78) [1] r=0 lpr=78 pi=[58,78)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.5( v 77'780 (0'0,77'780] local-lis/les=0/0 n=6 ec=58/48 lis/c=76/58 les/c/f=77/59/0 sis=78) [1] r=0 lpr=78 pi=[58,78)/1 luod=0'0 crt=59'776 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.5( v 77'780 (0'0,77'780] local-lis/les=0/0 n=6 ec=58/48 lis/c=76/58 les/c/f=77/59/0 sis=78) [1] r=0 lpr=78 pi=[58,78)/1 crt=59'776 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.e( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=78) [0]/[1] r=0 lpr=78 pi=[68,78)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.e( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=78) [0]/[1] r=0 lpr=78 pi=[68,78)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.d( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=76/58 les/c/f=77/59/0 sis=78) [1] r=0 lpr=78 pi=[58,78)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.d( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=76/58 les/c/f=77/59/0 sis=78) [1] r=0 lpr=78 pi=[58,78)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.15( v 54'774 (0'0,54'774] local-lis/les=0/0 n=4 ec=58/48 lis/c=76/58 les/c/f=77/59/0 sis=78) [1] r=0 lpr=78 pi=[58,78)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.15( v 54'774 (0'0,54'774] local-lis/les=0/0 n=4 ec=58/48 lis/c=76/58 les/c/f=77/59/0 sis=78) [1] r=0 lpr=78 pi=[58,78)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.16( v 54'774 (0'0,54'774] local-lis/les=68/69 n=4 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=78) [0]/[1] r=0 lpr=78 pi=[68,78)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:14 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 78 pg[10.16( v 54'774 (0'0,54'774] local-lis/les=68/69 n=4 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=78) [0]/[1] r=0 lpr=78 pi=[68,78)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:14 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.2 scrub starts
Sep 30 17:41:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:14.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:14 compute-1 sudo[88953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:41:14 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.2 scrub ok
Sep 30 17:41:14 compute-1 sudo[88953]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:14 compute-1 sudo[88953]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:14 compute-1 ceph-mon[75484]: Updating compute-1:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.conf
Sep 30 17:41:14 compute-1 ceph-mon[75484]: 10.8 scrub starts
Sep 30 17:41:14 compute-1 ceph-mon[75484]: 10.8 scrub ok
Sep 30 17:41:14 compute-1 ceph-mon[75484]: pgmap v11: 353 pgs: 353 active+clean; 458 KiB data, 98 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:41:14 compute-1 ceph-mon[75484]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Sep 30 17:41:14 compute-1 ceph-mon[75484]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Sep 30 17:41:14 compute-1 ceph-mon[75484]: Updating compute-0:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring
Sep 30 17:41:14 compute-1 ceph-mon[75484]: 8.b scrub starts
Sep 30 17:41:14 compute-1 ceph-mon[75484]: 8.b scrub ok
Sep 30 17:41:14 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Sep 30 17:41:14 compute-1 ceph-mon[75484]: osdmap e77: 2 total, 2 up, 2 in
Sep 30 17:41:14 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:14 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:14 compute-1 ceph-mon[75484]: Updating compute-1:/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring
Sep 30 17:41:14 compute-1 ceph-mon[75484]: osdmap e78: 2 total, 2 up, 2 in
Sep 30 17:41:14 compute-1 sudo[89001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:41:14 compute-1 sudo[89001]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:14 compute-1 sudo[89001]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:14 compute-1 sudo[89026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new
Sep 30 17:41:14 compute-1 sudo[89026]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:14 compute-1 sudo[89026]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:14 compute-1 sudo[89051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-63d32c6a-fa18-54ed-8711-9a3915cc367b/var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring.new /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/config/ceph.client.admin.keyring
Sep 30 17:41:14 compute-1 sudo[89051]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:14 compute-1 sudo[89051]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:15 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.1 scrub starts
Sep 30 17:41:15 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e79 e79: 2 total, 2 up, 2 in
Sep 30 17:41:15 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.1 scrub ok
Sep 30 17:41:15 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 79 pg[10.15( v 54'774 (0'0,54'774] local-lis/les=78/79 n=4 ec=58/48 lis/c=76/58 les/c/f=77/59/0 sis=78) [1] r=0 lpr=78 pi=[58,78)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:15 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 79 pg[10.d( v 54'774 (0'0,54'774] local-lis/les=78/79 n=6 ec=58/48 lis/c=76/58 les/c/f=77/59/0 sis=78) [1] r=0 lpr=78 pi=[58,78)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:15 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 79 pg[10.5( v 77'780 (0'0,77'780] local-lis/les=78/79 n=6 ec=58/48 lis/c=76/58 les/c/f=77/59/0 sis=78) [1] r=0 lpr=78 pi=[58,78)/1 crt=77'780 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:15 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 79 pg[10.1d( v 54'774 (0'0,54'774] local-lis/les=78/79 n=5 ec=58/48 lis/c=76/58 les/c/f=77/59/0 sis=78) [1] r=0 lpr=78 pi=[58,78)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:15 compute-1 ceph-mon[75484]: 11.2 scrub starts
Sep 30 17:41:15 compute-1 ceph-mon[75484]: 11.2 scrub ok
Sep 30 17:41:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:41:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:41:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:41:15 compute-1 ceph-mon[75484]: pgmap v14: 353 pgs: 4 remapped+peering, 349 active+clean; 458 KiB data, 98 MiB used, 40 GiB / 40 GiB avail; 116 B/s, 6 objects/s recovering
Sep 30 17:41:15 compute-1 ceph-mon[75484]: osdmap e79: 2 total, 2 up, 2 in
Sep 30 17:41:15 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 79 pg[10.6( v 54'774 (0'0,54'774] local-lis/les=78/79 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=78) [0]/[1] async=[0] r=0 lpr=78 pi=[68,78)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=6}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:15 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 79 pg[10.16( v 54'774 (0'0,54'774] local-lis/les=78/79 n=4 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=78) [0]/[1] async=[0] r=0 lpr=78 pi=[68,78)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:15 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 79 pg[10.e( v 54'774 (0'0,54'774] local-lis/les=78/79 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=78) [0]/[1] async=[0] r=0 lpr=78 pi=[68,78)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:15 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 79 pg[10.1e( v 54'774 (0'0,54'774] local-lis/les=78/79 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=78) [0]/[1] async=[0] r=0 lpr=78 pi=[68,78)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:16 compute-1 sshd-session[89077]: Accepted publickey for zuul from 192.168.122.30 port 50026 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:41:16 compute-1 systemd-logind[789]: New session 38 of user zuul.
Sep 30 17:41:16 compute-1 systemd[1]: Started Session 38 of User zuul.
Sep 30 17:41:16 compute-1 sshd-session[89077]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:41:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:16.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:16 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:16 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:16 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.0 scrub starts
Sep 30 17:41:16 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.0 scrub ok
Sep 30 17:41:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:16.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:16 compute-1 ceph-mon[75484]: 8.1 scrub starts
Sep 30 17:41:16 compute-1 ceph-mon[75484]: 8.1 scrub ok
Sep 30 17:41:16 compute-1 ceph-mon[75484]: 12.1 scrub starts
Sep 30 17:41:16 compute-1 ceph-mon[75484]: 12.1 scrub ok
Sep 30 17:41:17 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e80 e80: 2 total, 2 up, 2 in
Sep 30 17:41:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 80 pg[10.16( v 54'774 (0'0,54'774] local-lis/les=78/79 n=4 ec=58/48 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.847857475s) [0] async=[0] r=-1 lpr=80 pi=[68,80)/1 crt=54'774 mlcod 54'774 active pruub 189.100540161s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 80 pg[10.e( v 54'774 (0'0,54'774] local-lis/les=78/79 n=5 ec=58/48 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.847846031s) [0] async=[0] r=-1 lpr=80 pi=[68,80)/1 crt=54'774 mlcod 54'774 active pruub 189.100585938s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 80 pg[10.16( v 54'774 (0'0,54'774] local-lis/les=78/79 n=4 ec=58/48 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.847784996s) [0] r=-1 lpr=80 pi=[68,80)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 189.100540161s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 80 pg[10.e( v 54'774 (0'0,54'774] local-lis/les=78/79 n=5 ec=58/48 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.847379684s) [0] r=-1 lpr=80 pi=[68,80)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 189.100585938s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 80 pg[10.1e( v 54'774 (0'0,54'774] local-lis/les=78/79 n=5 ec=58/48 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.846900940s) [0] async=[0] r=-1 lpr=80 pi=[68,80)/1 crt=54'774 mlcod 54'774 active pruub 189.100753784s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 80 pg[10.1e( v 54'774 (0'0,54'774] local-lis/les=78/79 n=5 ec=58/48 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.846854210s) [0] r=-1 lpr=80 pi=[68,80)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 189.100753784s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 80 pg[10.6( v 54'774 (0'0,54'774] local-lis/les=78/79 n=6 ec=58/48 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.842604637s) [0] async=[0] r=-1 lpr=80 pi=[68,80)/1 crt=54'774 mlcod 54'774 active pruub 189.096817017s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:17 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 80 pg[10.6( v 54'774 (0'0,54'774] local-lis/les=78/79 n=6 ec=58/48 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.842554092s) [0] r=-1 lpr=80 pi=[68,80)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 189.096817017s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:17 compute-1 python3.9[89231]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:41:17 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.0 scrub starts
Sep 30 17:41:17 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.0 scrub ok
Sep 30 17:41:17 compute-1 ceph-mon[75484]: 9.0 scrub starts
Sep 30 17:41:17 compute-1 ceph-mon[75484]: 9.0 scrub ok
Sep 30 17:41:17 compute-1 ceph-mon[75484]: 12.1e scrub starts
Sep 30 17:41:17 compute-1 ceph-mon[75484]: 12.1e scrub ok
Sep 30 17:41:17 compute-1 ceph-mon[75484]: osdmap e80: 2 total, 2 up, 2 in
Sep 30 17:41:17 compute-1 ceph-mon[75484]: pgmap v17: 353 pgs: 4 remapped+peering, 349 active+clean; 458 KiB data, 98 MiB used, 40 GiB / 40 GiB avail; 116 B/s, 6 objects/s recovering
Sep 30 17:41:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e81 e81: 2 total, 2 up, 2 in
Sep 30 17:41:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:18.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:18 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc003340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:18 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc001370 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:18 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1 scrub starts
Sep 30 17:41:18 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1 scrub ok
Sep 30 17:41:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:18.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:18 compute-1 ceph-mon[75484]: 8.0 scrub starts
Sep 30 17:41:18 compute-1 ceph-mon[75484]: 8.0 scrub ok
Sep 30 17:41:18 compute-1 ceph-mon[75484]: osdmap e81: 2 total, 2 up, 2 in
Sep 30 17:41:18 compute-1 sudo[89445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppytctawaxmkrldbwbmtqdogbksvzaex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254078.4199417-45-71713038210692/AnsiballZ_command.py'
Sep 30 17:41:18 compute-1 sudo[89445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:41:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:41:19 compute-1 python3.9[89447]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:41:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:19 compute-1 sudo[89456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:41:19 compute-1 sudo[89456]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:19 compute-1 sudo[89456]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:19 compute-1 sudo[89481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:41:19 compute-1 sudo[89481]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:19 compute-1 sudo[89481]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:19 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.7 scrub starts
Sep 30 17:41:19 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.7 scrub ok
Sep 30 17:41:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e82 e82: 2 total, 2 up, 2 in
Sep 30 17:41:19 compute-1 ceph-mon[75484]: 12.1f deep-scrub starts
Sep 30 17:41:19 compute-1 ceph-mon[75484]: 12.1f deep-scrub ok
Sep 30 17:41:19 compute-1 ceph-mon[75484]: 9.1 scrub starts
Sep 30 17:41:19 compute-1 ceph-mon[75484]: 9.1 scrub ok
Sep 30 17:41:19 compute-1 ceph-mon[75484]: pgmap v19: 353 pgs: 353 active+clean; 458 KiB data, 98 MiB used, 40 GiB / 40 GiB avail; 180 B/s, 9 objects/s recovering
Sep 30 17:41:19 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Sep 30 17:41:19 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:19 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:19 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:19 compute-1 ceph-mon[75484]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Sep 30 17:41:19 compute-1 ceph-mon[75484]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Sep 30 17:41:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:20.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:20 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:20 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:20 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Sep 30 17:41:20 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Sep 30 17:41:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:20.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:20 compute-1 ceph-mon[75484]: 12.1a deep-scrub starts
Sep 30 17:41:20 compute-1 ceph-mon[75484]: 12.1a deep-scrub ok
Sep 30 17:41:20 compute-1 ceph-mon[75484]: 8.7 scrub starts
Sep 30 17:41:20 compute-1 ceph-mon[75484]: 8.7 scrub ok
Sep 30 17:41:20 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Sep 30 17:41:20 compute-1 ceph-mon[75484]: osdmap e82: 2 total, 2 up, 2 in
Sep 30 17:41:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:21 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.6 scrub starts
Sep 30 17:41:21 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.6 scrub ok
Sep 30 17:41:21 compute-1 ceph-mon[75484]: 12.1b scrub starts
Sep 30 17:41:21 compute-1 ceph-mon[75484]: 12.1b scrub ok
Sep 30 17:41:21 compute-1 ceph-mon[75484]: 8.6 scrub starts
Sep 30 17:41:21 compute-1 ceph-mon[75484]: 8.6 scrub ok
Sep 30 17:41:21 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:21 compute-1 ceph-mon[75484]: pgmap v21: 353 pgs: 353 active+clean; 458 KiB data, 98 MiB used, 40 GiB / 40 GiB avail; 147 B/s, 8 objects/s recovering
Sep 30 17:41:21 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:21 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Sep 30 17:41:21 compute-1 ceph-mon[75484]: Reconfiguring grafana.compute-0 (dependencies changed)...
Sep 30 17:41:21 compute-1 ceph-mon[75484]: Reconfiguring daemon grafana.compute-0 on compute-0
Sep 30 17:41:21 compute-1 ceph-mon[75484]: 11.6 scrub starts
Sep 30 17:41:21 compute-1 ceph-mon[75484]: 11.6 scrub ok
Sep 30 17:41:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:22 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e83 e83: 2 total, 2 up, 2 in
Sep 30 17:41:22 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 82 pg[10.17( v 54'774 (0'0,54'774] local-lis/les=69/70 n=3 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=82 pruub=11.976106644s) [0] r=-1 lpr=82 pi=[69,82)/1 crt=54'774 mlcod 0'0 active pruub 191.419540405s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:22 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 83 pg[10.17( v 54'774 (0'0,54'774] local-lis/les=69/70 n=3 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=82 pruub=11.975496292s) [0] r=-1 lpr=82 pi=[69,82)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 191.419540405s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:22 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 82 pg[10.f( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=82 pruub=11.974752426s) [0] r=-1 lpr=82 pi=[69,82)/1 crt=54'774 mlcod 0'0 active pruub 191.419433594s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:22 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 83 pg[10.f( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=82 pruub=11.974693298s) [0] r=-1 lpr=82 pi=[69,82)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 191.419433594s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:22 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 82 pg[10.7( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=82 pruub=11.974191666s) [0] r=-1 lpr=82 pi=[69,82)/1 crt=54'774 mlcod 0'0 active pruub 191.419433594s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:22 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 83 pg[10.7( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=82 pruub=11.974163055s) [0] r=-1 lpr=82 pi=[69,82)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 191.419433594s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:22 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 82 pg[10.1f( v 54'774 (0'0,54'774] local-lis/les=69/70 n=5 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=82 pruub=11.969627380s) [0] r=-1 lpr=82 pi=[69,82)/1 crt=54'774 mlcod 0'0 active pruub 191.415863037s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:22 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 83 pg[10.1f( v 54'774 (0'0,54'774] local-lis/les=69/70 n=5 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=82 pruub=11.969425201s) [0] r=-1 lpr=82 pi=[69,82)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 191.415863037s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:22 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 83 pg[10.8( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=83) [1] r=0 lpr=83 pi=[58,83)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:22 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 83 pg[10.18( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=83) [1] r=0 lpr=83 pi=[58,83)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:41:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:22.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:41:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:22 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc003340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:22 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc002390 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:22 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.4 scrub starts
Sep 30 17:41:22 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.4 scrub ok
Sep 30 17:41:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:22.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:22 compute-1 ceph-mon[75484]: 12.16 scrub starts
Sep 30 17:41:22 compute-1 ceph-mon[75484]: 12.16 scrub ok
Sep 30 17:41:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Sep 30 17:41:22 compute-1 ceph-mon[75484]: osdmap e83: 2 total, 2 up, 2 in
Sep 30 17:41:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:41:22 compute-1 ceph-mon[75484]: 9.4 scrub starts
Sep 30 17:41:22 compute-1 ceph-mon[75484]: 9.4 scrub ok
Sep 30 17:41:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e84 e84: 2 total, 2 up, 2 in
Sep 30 17:41:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 84 pg[10.18( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=84) [1]/[0] r=-1 lpr=84 pi=[58,84)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 84 pg[10.7( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=84) [0]/[1] r=0 lpr=84 pi=[69,84)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 84 pg[10.1f( v 54'774 (0'0,54'774] local-lis/les=69/70 n=5 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=84) [0]/[1] r=0 lpr=84 pi=[69,84)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 84 pg[10.1f( v 54'774 (0'0,54'774] local-lis/les=69/70 n=5 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=84) [0]/[1] r=0 lpr=84 pi=[69,84)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 84 pg[10.7( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=84) [0]/[1] r=0 lpr=84 pi=[69,84)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 84 pg[10.18( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=84) [1]/[0] r=-1 lpr=84 pi=[58,84)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 84 pg[10.8( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=84) [1]/[0] r=-1 lpr=84 pi=[58,84)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 84 pg[10.8( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=84) [1]/[0] r=-1 lpr=84 pi=[58,84)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 84 pg[10.f( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=84) [0]/[1] r=0 lpr=84 pi=[69,84)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 84 pg[10.17( v 54'774 (0'0,54'774] local-lis/les=69/70 n=3 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=84) [0]/[1] r=0 lpr=84 pi=[69,84)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 84 pg[10.17( v 54'774 (0'0,54'774] local-lis/les=69/70 n=3 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=84) [0]/[1] r=0 lpr=84 pi=[69,84)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 84 pg[10.f( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=84) [0]/[1] r=0 lpr=84 pi=[69,84)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:23 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.5 scrub starts
Sep 30 17:41:23 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.5 scrub ok
Sep 30 17:41:23 compute-1 ceph-mon[75484]: 12.17 scrub starts
Sep 30 17:41:23 compute-1 ceph-mon[75484]: 12.17 scrub ok
Sep 30 17:41:23 compute-1 ceph-mon[75484]: pgmap v23: 353 pgs: 353 active+clean; 458 KiB data, 98 MiB used, 40 GiB / 40 GiB avail; 135 B/s, 7 objects/s recovering
Sep 30 17:41:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Sep 30 17:41:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Sep 30 17:41:23 compute-1 ceph-mon[75484]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Sep 30 17:41:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Sep 30 17:41:23 compute-1 ceph-mon[75484]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Sep 30 17:41:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Sep 30 17:41:23 compute-1 ceph-mon[75484]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Sep 30 17:41:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Sep 30 17:41:23 compute-1 ceph-mon[75484]: osdmap e84: 2 total, 2 up, 2 in
Sep 30 17:41:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:23 compute-1 ceph-mon[75484]: 9.5 scrub starts
Sep 30 17:41:23 compute-1 ceph-mon[75484]: 9.5 scrub ok
Sep 30 17:41:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:41:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e85 e85: 2 total, 2 up, 2 in
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 85 pg[10.f( v 54'774 (0'0,54'774] local-lis/les=84/85 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=84) [0]/[1] async=[0] r=0 lpr=84 pi=[69,84)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 85 pg[10.7( v 54'774 (0'0,54'774] local-lis/les=84/85 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=84) [0]/[1] async=[0] r=0 lpr=84 pi=[69,84)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 85 pg[10.1f( v 54'774 (0'0,54'774] local-lis/les=84/85 n=5 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=84) [0]/[1] async=[0] r=0 lpr=84 pi=[69,84)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 85 pg[10.17( v 54'774 (0'0,54'774] local-lis/les=84/85 n=3 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=84) [0]/[1] async=[0] r=0 lpr=84 pi=[69,84)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:24.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:24 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c80020a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:24 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:24 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.18 scrub starts
Sep 30 17:41:24 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.18 scrub ok
Sep 30 17:41:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e86 e86: 2 total, 2 up, 2 in
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 86 pg[10.17( v 54'774 (0'0,54'774] local-lis/les=84/85 n=3 ec=58/48 lis/c=84/69 les/c/f=85/70/0 sis=86 pruub=15.645328522s) [0] async=[0] r=-1 lpr=86 pi=[69,86)/1 crt=54'774 mlcod 54'774 active pruub 197.501907349s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 86 pg[10.17( v 54'774 (0'0,54'774] local-lis/les=84/85 n=3 ec=58/48 lis/c=84/69 les/c/f=85/70/0 sis=86 pruub=15.645258904s) [0] r=-1 lpr=86 pi=[69,86)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 197.501907349s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 86 pg[10.8( v 54'774 (0'0,54'774] local-lis/les=0/0 n=7 ec=58/48 lis/c=84/58 les/c/f=85/59/0 sis=86) [1] r=0 lpr=86 pi=[58,86)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 86 pg[10.8( v 54'774 (0'0,54'774] local-lis/les=0/0 n=7 ec=58/48 lis/c=84/58 les/c/f=85/59/0 sis=86) [1] r=0 lpr=86 pi=[58,86)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 86 pg[10.f( v 54'774 (0'0,54'774] local-lis/les=84/85 n=6 ec=58/48 lis/c=84/69 les/c/f=85/70/0 sis=86 pruub=15.640567780s) [0] async=[0] r=-1 lpr=86 pi=[69,86)/1 crt=54'774 mlcod 54'774 active pruub 197.498428345s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 86 pg[10.f( v 54'774 (0'0,54'774] local-lis/les=84/85 n=6 ec=58/48 lis/c=84/69 les/c/f=85/70/0 sis=86 pruub=15.640506744s) [0] r=-1 lpr=86 pi=[69,86)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 197.498428345s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 86 pg[10.7( v 54'774 (0'0,54'774] local-lis/les=84/85 n=6 ec=58/48 lis/c=84/69 les/c/f=85/70/0 sis=86 pruub=15.639783859s) [0] async=[0] r=-1 lpr=86 pi=[69,86)/1 crt=54'774 mlcod 54'774 active pruub 197.498443604s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 86 pg[10.7( v 54'774 (0'0,54'774] local-lis/les=84/85 n=6 ec=58/48 lis/c=84/69 les/c/f=85/70/0 sis=86 pruub=15.639715195s) [0] r=-1 lpr=86 pi=[69,86)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 197.498443604s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 86 pg[10.18( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=84/58 les/c/f=85/59/0 sis=86) [1] r=0 lpr=86 pi=[58,86)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 86 pg[10.18( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=84/58 les/c/f=85/59/0 sis=86) [1] r=0 lpr=86 pi=[58,86)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 86 pg[10.1f( v 54'774 (0'0,54'774] local-lis/les=84/85 n=5 ec=58/48 lis/c=84/69 les/c/f=85/70/0 sis=86 pruub=15.640734673s) [0] async=[0] r=-1 lpr=86 pi=[69,86)/1 crt=54'774 mlcod 54'774 active pruub 197.500274658s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 86 pg[10.1f( v 54'774 (0'0,54'774] local-lis/les=84/85 n=5 ec=58/48 lis/c=84/69 les/c/f=85/70/0 sis=86 pruub=15.640669823s) [0] r=-1 lpr=86 pi=[69,86)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 197.500274658s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:24.665158) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254084665322, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7755, "num_deletes": 254, "total_data_size": 19902910, "memory_usage": 20824272, "flush_reason": "Manual Compaction"}
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Sep 30 17:41:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:24.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254084730035, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 12193373, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 7760, "table_properties": {"data_size": 12164722, "index_size": 18488, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9093, "raw_key_size": 88267, "raw_average_key_size": 24, "raw_value_size": 12094289, "raw_average_value_size": 3329, "num_data_blocks": 816, "num_entries": 3633, "num_filter_entries": 3633, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 1759253879, "file_creation_time": 1759254084, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 64943 microseconds, and 30042 cpu microseconds.
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:24.730105) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 12193373 bytes OK
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:24.730157) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:24.732129) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:24.732194) EVENT_LOG_v1 {"time_micros": 1759254084732180, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:24.732256) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 19863635, prev total WAL file size 19863635, number of live WAL files 2.
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:24.737095) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323531' seq:0, type:0; will stop at (end)
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(11MB) 8(1648B)]
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254084737247, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 12195021, "oldest_snapshot_seqno": -1}
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3383 keys, 12189948 bytes, temperature: kUnknown
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254084803700, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 12189948, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12161899, "index_size": 18467, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8517, "raw_key_size": 84856, "raw_average_key_size": 25, "raw_value_size": 12094569, "raw_average_value_size": 3575, "num_data_blocks": 816, "num_entries": 3383, "num_filter_entries": 3383, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759254084, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:24.804014) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 12189948 bytes
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:24.805296) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 183.3 rd, 183.2 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(11.6, 0.0 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3638, records dropped: 255 output_compression: NoCompression
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:24.805324) EVENT_LOG_v1 {"time_micros": 1759254084805309, "job": 4, "event": "compaction_finished", "compaction_time_micros": 66543, "compaction_time_cpu_micros": 30005, "output_level": 6, "num_output_files": 1, "total_output_size": 12189948, "num_input_records": 3638, "num_output_records": 3383, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254084807972, "job": 4, "event": "table_file_deletion", "file_number": 14}
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254084808032, "job": 4, "event": "table_file_deletion", "file_number": 8}
Sep 30 17:41:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:24.736896) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:41:24 compute-1 ceph-mon[75484]: 12.14 scrub starts
Sep 30 17:41:24 compute-1 ceph-mon[75484]: 12.14 scrub ok
Sep 30 17:41:24 compute-1 ceph-mon[75484]: 11.17 scrub starts
Sep 30 17:41:24 compute-1 ceph-mon[75484]: 11.17 scrub ok
Sep 30 17:41:24 compute-1 ceph-mon[75484]: osdmap e85: 2 total, 2 up, 2 in
Sep 30 17:41:24 compute-1 ceph-mon[75484]: 11.18 scrub starts
Sep 30 17:41:24 compute-1 ceph-mon[75484]: 11.18 scrub ok
Sep 30 17:41:24 compute-1 ceph-mon[75484]: osdmap e86: 2 total, 2 up, 2 in
Sep 30 17:41:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:25 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1a scrub starts
Sep 30 17:41:25 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1a scrub ok
Sep 30 17:41:25 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e87 e87: 2 total, 2 up, 2 in
Sep 30 17:41:25 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 87 pg[10.18( v 54'774 (0'0,54'774] local-lis/les=86/87 n=5 ec=58/48 lis/c=84/58 les/c/f=85/59/0 sis=86) [1] r=0 lpr=86 pi=[58,86)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:25 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 87 pg[10.8( v 54'774 (0'0,54'774] local-lis/les=86/87 n=7 ec=58/48 lis/c=84/58 les/c/f=85/59/0 sis=86) [1] r=0 lpr=86 pi=[58,86)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:25 compute-1 sudo[89445]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:26 compute-1 sshd-session[89080]: Connection closed by 192.168.122.30 port 50026
Sep 30 17:41:26 compute-1 sshd-session[89077]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Sep 30 17:41:26 compute-1 systemd[1]: session-38.scope: Deactivated successfully.
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.362429) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254086363150, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 307, "num_deletes": 251, "total_data_size": 214855, "memory_usage": 222200, "flush_reason": "Manual Compaction"}
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Sep 30 17:41:26 compute-1 systemd[1]: session-38.scope: Consumed 8.472s CPU time.
Sep 30 17:41:26 compute-1 systemd-logind[789]: Session 38 logged out. Waiting for processes to exit.
Sep 30 17:41:26 compute-1 systemd-logind[789]: Removed session 38.
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254086367646, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 142254, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7765, "largest_seqno": 8067, "table_properties": {"data_size": 140170, "index_size": 248, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5019, "raw_average_key_size": 17, "raw_value_size": 136083, "raw_average_value_size": 477, "num_data_blocks": 10, "num_entries": 285, "num_filter_entries": 285, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759254084, "oldest_key_time": 1759254084, "file_creation_time": 1759254086, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 5327 microseconds, and 1412 cpu microseconds.
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.367774) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 142254 bytes OK
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.367818) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.375007) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.375024) EVENT_LOG_v1 {"time_micros": 1759254086375019, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.375044) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 212611, prev total WAL file size 212900, number of live WAL files 2.
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.376517) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(138KB)], [15(11MB)]
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254086376556, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12332202, "oldest_snapshot_seqno": -1}
Sep 30 17:41:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:26.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3152 keys, 11095022 bytes, temperature: kUnknown
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254086437230, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 11095022, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11069348, "index_size": 16668, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7941, "raw_key_size": 81133, "raw_average_key_size": 25, "raw_value_size": 11006725, "raw_average_value_size": 3491, "num_data_blocks": 730, "num_entries": 3152, "num_filter_entries": 3152, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759254086, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.437705) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 11095022 bytes
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.442763) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 202.6 rd, 182.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 11.6 +0.0 blob) out(10.6 +0.0 blob), read-write-amplify(164.7) write-amplify(78.0) OK, records in: 3668, records dropped: 516 output_compression: NoCompression
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.442814) EVENT_LOG_v1 {"time_micros": 1759254086442794, "job": 6, "event": "compaction_finished", "compaction_time_micros": 60858, "compaction_time_cpu_micros": 22822, "output_level": 6, "num_output_files": 1, "total_output_size": 11095022, "num_input_records": 3668, "num_output_records": 3152, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254086443222, "job": 6, "event": "table_file_deletion", "file_number": 17}
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254086445807, "job": 6, "event": "table_file_deletion", "file_number": 15}
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.376429) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.445870) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.445876) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.445878) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.445879) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:41:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:41:26.445881) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:41:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:26 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:26 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:26 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Sep 30 17:41:26 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Sep 30 17:41:26 compute-1 ceph-mon[75484]: 9.15 scrub starts
Sep 30 17:41:26 compute-1 ceph-mon[75484]: 9.15 scrub ok
Sep 30 17:41:26 compute-1 ceph-mon[75484]: pgmap v27: 353 pgs: 6 unknown, 347 active+clean; 457 KiB data, 98 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:41:26 compute-1 ceph-mon[75484]: 9.1a scrub starts
Sep 30 17:41:26 compute-1 ceph-mon[75484]: 9.1a scrub ok
Sep 30 17:41:26 compute-1 ceph-mon[75484]: osdmap e87: 2 total, 2 up, 2 in
Sep 30 17:41:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:41:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:41:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:41:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:41:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:41:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:26.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:27 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1b deep-scrub starts
Sep 30 17:41:27 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1b deep-scrub ok
Sep 30 17:41:27 compute-1 ceph-mon[75484]: 11.16 scrub starts
Sep 30 17:41:27 compute-1 ceph-mon[75484]: 11.16 scrub ok
Sep 30 17:41:27 compute-1 ceph-mon[75484]: 11.19 scrub starts
Sep 30 17:41:27 compute-1 ceph-mon[75484]: 11.19 scrub ok
Sep 30 17:41:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:41:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:28.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:41:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:28 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc002390 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:28 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8003980 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:28 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.1a deep-scrub starts
Sep 30 17:41:28 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.1a deep-scrub ok
Sep 30 17:41:28 compute-1 ceph-mon[75484]: 9.16 scrub starts
Sep 30 17:41:28 compute-1 ceph-mon[75484]: 9.16 scrub ok
Sep 30 17:41:28 compute-1 ceph-mon[75484]: pgmap v29: 353 pgs: 6 unknown, 347 active+clean; 457 KiB data, 98 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:41:28 compute-1 ceph-mon[75484]: 9.1b deep-scrub starts
Sep 30 17:41:28 compute-1 ceph-mon[75484]: 9.1b deep-scrub ok
Sep 30 17:41:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:41:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:28.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:41:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:41:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:29 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.19 scrub starts
Sep 30 17:41:29 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.19 scrub ok
Sep 30 17:41:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e88 e88: 2 total, 2 up, 2 in
Sep 30 17:41:29 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 88 pg[10.a( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=88 pruub=11.457120895s) [0] r=-1 lpr=88 pi=[68,88)/1 crt=54'774 mlcod 0'0 active pruub 198.395477295s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:29 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 88 pg[10.a( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=88 pruub=11.456860542s) [0] r=-1 lpr=88 pi=[68,88)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 198.395477295s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:29 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 88 pg[10.1a( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=88 pruub=11.456238747s) [0] r=-1 lpr=88 pi=[68,88)/1 crt=54'774 mlcod 0'0 active pruub 198.395660400s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:29 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 88 pg[10.1a( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=88 pruub=11.456183434s) [0] r=-1 lpr=88 pi=[68,88)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 198.395660400s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:29 compute-1 ceph-mon[75484]: 11.13 scrub starts
Sep 30 17:41:29 compute-1 ceph-mon[75484]: 11.13 scrub ok
Sep 30 17:41:29 compute-1 ceph-mon[75484]: 8.1a deep-scrub starts
Sep 30 17:41:29 compute-1 ceph-mon[75484]: 8.1a deep-scrub ok
Sep 30 17:41:29 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Sep 30 17:41:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:30.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:30 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b8001230 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:30 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:30 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1e scrub starts
Sep 30 17:41:30 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1e scrub ok
Sep 30 17:41:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:41:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:30.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:41:30 compute-1 ceph-mon[75484]: 9.17 scrub starts
Sep 30 17:41:30 compute-1 ceph-mon[75484]: 9.17 scrub ok
Sep 30 17:41:30 compute-1 ceph-mon[75484]: pgmap v30: 353 pgs: 353 active+clean; 457 KiB data, 98 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 520 B/s wr, 44 op/s; 223 B/s, 8 objects/s recovering
Sep 30 17:41:30 compute-1 ceph-mon[75484]: 9.19 scrub starts
Sep 30 17:41:30 compute-1 ceph-mon[75484]: 9.19 scrub ok
Sep 30 17:41:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Sep 30 17:41:30 compute-1 ceph-mon[75484]: osdmap e88: 2 total, 2 up, 2 in
Sep 30 17:41:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:41:30 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e89 e89: 2 total, 2 up, 2 in
Sep 30 17:41:30 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 89 pg[10.a( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=89) [0]/[1] r=0 lpr=89 pi=[68,89)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:30 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 89 pg[10.a( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=89) [0]/[1] r=0 lpr=89 pi=[68,89)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:30 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 89 pg[10.1a( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=89) [0]/[1] r=0 lpr=89 pi=[68,89)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:30 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 89 pg[10.1a( v 54'774 (0'0,54'774] local-lis/les=68/69 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=89) [0]/[1] r=0 lpr=89 pi=[68,89)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:30 compute-1 sudo[89570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:41:30 compute-1 sudo[89570]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:30 compute-1 sudo[89570]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:31 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.1f deep-scrub starts
Sep 30 17:41:31 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.1f deep-scrub ok
Sep 30 17:41:31 compute-1 ceph-mon[75484]: 11.12 deep-scrub starts
Sep 30 17:41:31 compute-1 ceph-mon[75484]: 11.12 deep-scrub ok
Sep 30 17:41:31 compute-1 ceph-mon[75484]: 9.1e scrub starts
Sep 30 17:41:31 compute-1 ceph-mon[75484]: 9.1e scrub ok
Sep 30 17:41:31 compute-1 ceph-mon[75484]: osdmap e89: 2 total, 2 up, 2 in
Sep 30 17:41:31 compute-1 ceph-mon[75484]: pgmap v33: 353 pgs: 353 active+clean; 457 KiB data, 98 MiB used, 40 GiB / 40 GiB avail; 22 KiB/s rd, 511 B/s wr, 43 op/s; 219 B/s, 8 objects/s recovering
Sep 30 17:41:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Sep 30 17:41:31 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e90 e90: 2 total, 2 up, 2 in
Sep 30 17:41:31 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 90 pg[10.b( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=90 pruub=10.328371048s) [0] r=-1 lpr=90 pi=[69,90)/1 crt=54'774 mlcod 0'0 active pruub 199.419738770s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:31 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 90 pg[10.b( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=90 pruub=10.328314781s) [0] r=-1 lpr=90 pi=[69,90)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 199.419738770s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:31 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 90 pg[10.1b( v 54'774 (0'0,54'774] local-lis/les=69/70 n=5 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=90 pruub=10.323743820s) [0] r=-1 lpr=90 pi=[69,90)/1 crt=54'774 mlcod 0'0 active pruub 199.416152954s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:31 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 90 pg[10.1b( v 54'774 (0'0,54'774] local-lis/les=69/70 n=5 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=90 pruub=10.323708534s) [0] r=-1 lpr=90 pi=[69,90)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 199.416152954s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:31 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 90 pg[10.a( v 54'774 (0'0,54'774] local-lis/les=89/90 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=89) [0]/[1] async=[0] r=0 lpr=89 pi=[68,89)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=9}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:31 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 90 pg[10.1a( v 54'774 (0'0,54'774] local-lis/les=89/90 n=5 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=89) [0]/[1] async=[0] r=0 lpr=89 pi=[68,89)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:41:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:32.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:41:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:32 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc002390 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:32 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8003980 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:32 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1f scrub starts
Sep 30 17:41:32 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1f scrub ok
Sep 30 17:41:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:32.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:32 compute-1 ceph-mon[75484]: 9.10 scrub starts
Sep 30 17:41:32 compute-1 ceph-mon[75484]: 9.10 scrub ok
Sep 30 17:41:32 compute-1 ceph-mon[75484]: 8.1f deep-scrub starts
Sep 30 17:41:32 compute-1 ceph-mon[75484]: 8.1f deep-scrub ok
Sep 30 17:41:32 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Sep 30 17:41:32 compute-1 ceph-mon[75484]: osdmap e90: 2 total, 2 up, 2 in
Sep 30 17:41:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e91 e91: 2 total, 2 up, 2 in
Sep 30 17:41:32 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 91 pg[10.b( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=91) [0]/[1] r=0 lpr=91 pi=[69,91)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:32 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 91 pg[10.1a( v 54'774 (0'0,54'774] local-lis/les=89/90 n=5 ec=58/48 lis/c=89/68 les/c/f=90/69/0 sis=91 pruub=15.044114113s) [0] async=[0] r=-1 lpr=91 pi=[68,91)/1 crt=54'774 mlcod 54'774 active pruub 205.102233887s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:32 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 91 pg[10.b( v 54'774 (0'0,54'774] local-lis/les=69/70 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=91) [0]/[1] r=0 lpr=91 pi=[69,91)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:32 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 91 pg[10.1a( v 54'774 (0'0,54'774] local-lis/les=89/90 n=5 ec=58/48 lis/c=89/68 les/c/f=90/69/0 sis=91 pruub=15.044024467s) [0] r=-1 lpr=91 pi=[68,91)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 205.102233887s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:32 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 91 pg[10.1b( v 54'774 (0'0,54'774] local-lis/les=69/70 n=5 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=91) [0]/[1] r=0 lpr=91 pi=[69,91)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:32 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 91 pg[10.1b( v 54'774 (0'0,54'774] local-lis/les=69/70 n=5 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=91) [0]/[1] r=0 lpr=91 pi=[69,91)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:32 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 91 pg[10.a( v 54'774 (0'0,54'774] local-lis/les=89/90 n=6 ec=58/48 lis/c=89/68 les/c/f=90/69/0 sis=91 pruub=15.041057587s) [0] async=[0] r=-1 lpr=91 pi=[68,91)/1 crt=54'774 mlcod 54'774 active pruub 205.101013184s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:32 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 91 pg[10.a( v 54'774 (0'0,54'774] local-lis/les=89/90 n=6 ec=58/48 lis/c=89/68 les/c/f=90/69/0 sis=91 pruub=15.040909767s) [0] r=-1 lpr=91 pi=[68,91)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 205.101013184s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:33 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.1e scrub starts
Sep 30 17:41:33 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.1e scrub ok
Sep 30 17:41:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e92 e92: 2 total, 2 up, 2 in
Sep 30 17:41:34 compute-1 ceph-mon[75484]: 9.1f scrub starts
Sep 30 17:41:34 compute-1 ceph-mon[75484]: 9.1f scrub ok
Sep 30 17:41:34 compute-1 ceph-mon[75484]: osdmap e91: 2 total, 2 up, 2 in
Sep 30 17:41:34 compute-1 ceph-mon[75484]: pgmap v36: 353 pgs: 353 active+clean; 457 KiB data, 98 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:41:34 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Sep 30 17:41:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 92 pg[10.c( v 54'774 (0'0,54'774] local-lis/les=75/76 n=6 ec=58/48 lis/c=75/75 les/c/f=76/76/0 sis=92 pruub=10.548857689s) [0] r=-1 lpr=92 pi=[75,92)/1 crt=54'774 mlcod 0'0 active pruub 201.989730835s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 92 pg[10.c( v 54'774 (0'0,54'774] local-lis/les=75/76 n=6 ec=58/48 lis/c=75/75 les/c/f=76/76/0 sis=92 pruub=10.548791885s) [0] r=-1 lpr=92 pi=[75,92)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 201.989730835s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 92 pg[10.1c( v 54'774 (0'0,54'774] local-lis/les=75/76 n=5 ec=58/48 lis/c=75/75 les/c/f=76/76/0 sis=92 pruub=10.549419403s) [0] r=-1 lpr=92 pi=[75,92)/1 crt=54'774 mlcod 0'0 active pruub 201.992111206s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 92 pg[10.1c( v 54'774 (0'0,54'774] local-lis/les=75/76 n=5 ec=58/48 lis/c=75/75 les/c/f=76/76/0 sis=92 pruub=10.549170494s) [0] r=-1 lpr=92 pi=[75,92)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 201.992111206s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 92 pg[10.b( v 54'774 (0'0,54'774] local-lis/les=91/92 n=6 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=91) [0]/[1] async=[0] r=0 lpr=91 pi=[69,91)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 92 pg[10.1b( v 54'774 (0'0,54'774] local-lis/les=91/92 n=5 ec=58/48 lis/c=69/69 les/c/f=70/70/0 sis=91) [0]/[1] async=[0] r=0 lpr=91 pi=[69,91)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:34.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:34 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8003980 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:34 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:34 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1c scrub starts
Sep 30 17:41:34 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1c scrub ok
Sep 30 17:41:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e93 e93: 2 total, 2 up, 2 in
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 93 pg[10.c( v 54'774 (0'0,54'774] local-lis/les=75/76 n=6 ec=58/48 lis/c=75/75 les/c/f=76/76/0 sis=93) [0]/[1] r=0 lpr=93 pi=[75,93)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 93 pg[10.c( v 54'774 (0'0,54'774] local-lis/les=75/76 n=6 ec=58/48 lis/c=75/75 les/c/f=76/76/0 sis=93) [0]/[1] r=0 lpr=93 pi=[75,93)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 93 pg[10.b( v 54'774 (0'0,54'774] local-lis/les=91/92 n=6 ec=58/48 lis/c=91/69 les/c/f=92/70/0 sis=93 pruub=15.572223663s) [0] async=[0] r=-1 lpr=93 pi=[69,93)/1 crt=54'774 mlcod 54'774 active pruub 207.447738647s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 93 pg[10.b( v 54'774 (0'0,54'774] local-lis/les=91/92 n=6 ec=58/48 lis/c=91/69 les/c/f=92/70/0 sis=93 pruub=15.572134018s) [0] r=-1 lpr=93 pi=[69,93)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 207.447738647s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 93 pg[10.1b( v 54'774 (0'0,54'774] local-lis/les=91/92 n=5 ec=58/48 lis/c=91/69 les/c/f=92/70/0 sis=93 pruub=15.604566574s) [0] async=[0] r=-1 lpr=93 pi=[69,93)/1 crt=54'774 mlcod 54'774 active pruub 207.480300903s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 93 pg[10.1b( v 54'774 (0'0,54'774] local-lis/les=91/92 n=5 ec=58/48 lis/c=91/69 les/c/f=92/70/0 sis=93 pruub=15.604520798s) [0] r=-1 lpr=93 pi=[69,93)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 207.480300903s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 93 pg[10.1c( v 54'774 (0'0,54'774] local-lis/les=75/76 n=5 ec=58/48 lis/c=75/75 les/c/f=76/76/0 sis=93) [0]/[1] r=0 lpr=93 pi=[75,93)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:34 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 93 pg[10.1c( v 54'774 (0'0,54'774] local-lis/les=75/76 n=5 ec=58/48 lis/c=75/75 les/c/f=76/76/0 sis=93) [0]/[1] r=0 lpr=93 pi=[75,93)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:34.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:35 compute-1 ceph-mon[75484]: 8.1e scrub starts
Sep 30 17:41:35 compute-1 ceph-mon[75484]: 8.1e scrub ok
Sep 30 17:41:35 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Sep 30 17:41:35 compute-1 ceph-mon[75484]: osdmap e92: 2 total, 2 up, 2 in
Sep 30 17:41:35 compute-1 ceph-mon[75484]: 9.1c scrub starts
Sep 30 17:41:35 compute-1 ceph-mon[75484]: 9.1c scrub ok
Sep 30 17:41:35 compute-1 ceph-mon[75484]: osdmap e93: 2 total, 2 up, 2 in
Sep 30 17:41:35 compute-1 ceph-mon[75484]: 10.f scrub starts
Sep 30 17:41:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:35 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.1d deep-scrub starts
Sep 30 17:41:35 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.1d deep-scrub ok
Sep 30 17:41:35 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e94 e94: 2 total, 2 up, 2 in
Sep 30 17:41:35 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 94 pg[10.1c( v 54'774 (0'0,54'774] local-lis/les=93/94 n=5 ec=58/48 lis/c=75/75 les/c/f=76/76/0 sis=93) [0]/[1] async=[0] r=0 lpr=93 pi=[75,93)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:35 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 94 pg[10.c( v 54'774 (0'0,54'774] local-lis/les=93/94 n=6 ec=58/48 lis/c=75/75 les/c/f=76/76/0 sis=93) [0]/[1] async=[0] r=0 lpr=93 pi=[75,93)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:36.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:36 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b80020d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:36 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003150 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:36 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Sep 30 17:41:36 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Sep 30 17:41:36 compute-1 ceph-mon[75484]: 10.f scrub ok
Sep 30 17:41:36 compute-1 ceph-mon[75484]: pgmap v39: 353 pgs: 2 remapped+peering, 2 peering, 349 active+clean; 457 KiB data, 98 MiB used, 40 GiB / 40 GiB avail; 54 B/s, 3 objects/s recovering
Sep 30 17:41:36 compute-1 ceph-mon[75484]: 8.1d deep-scrub starts
Sep 30 17:41:36 compute-1 ceph-mon[75484]: 8.1d deep-scrub ok
Sep 30 17:41:36 compute-1 ceph-mon[75484]: osdmap e94: 2 total, 2 up, 2 in
Sep 30 17:41:36 compute-1 ceph-mon[75484]: 10.6 scrub starts
Sep 30 17:41:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e95 e95: 2 total, 2 up, 2 in
Sep 30 17:41:36 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 95 pg[10.c( v 54'774 (0'0,54'774] local-lis/les=93/94 n=6 ec=58/48 lis/c=93/75 les/c/f=94/76/0 sis=95 pruub=14.990369797s) [0] async=[0] r=-1 lpr=95 pi=[75,95)/1 crt=54'774 mlcod 54'774 active pruub 208.903488159s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:36 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 95 pg[10.c( v 54'774 (0'0,54'774] local-lis/les=93/94 n=6 ec=58/48 lis/c=93/75 les/c/f=94/76/0 sis=95 pruub=14.990260124s) [0] r=-1 lpr=95 pi=[75,95)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 208.903488159s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:36 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 95 pg[10.1c( v 54'774 (0'0,54'774] local-lis/les=93/94 n=5 ec=58/48 lis/c=93/75 les/c/f=94/76/0 sis=95 pruub=14.981908798s) [0] async=[0] r=-1 lpr=95 pi=[75,95)/1 crt=54'774 mlcod 54'774 active pruub 208.897552490s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:36 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 95 pg[10.1c( v 54'774 (0'0,54'774] local-lis/les=93/94 n=5 ec=58/48 lis/c=93/75 les/c/f=94/76/0 sis=95 pruub=14.981801033s) [0] r=-1 lpr=95 pi=[75,95)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 208.897552490s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:41:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:36.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:41:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:37 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.1f deep-scrub starts
Sep 30 17:41:37 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.1f deep-scrub ok
Sep 30 17:41:37 compute-1 ceph-mon[75484]: 10.6 scrub ok
Sep 30 17:41:37 compute-1 ceph-mon[75484]: 9.1d scrub starts
Sep 30 17:41:37 compute-1 ceph-mon[75484]: 9.1d scrub ok
Sep 30 17:41:37 compute-1 ceph-mon[75484]: osdmap e95: 2 total, 2 up, 2 in
Sep 30 17:41:37 compute-1 ceph-mon[75484]: 10.7 scrub starts
Sep 30 17:41:37 compute-1 ceph-mon[75484]: 10.7 scrub ok
Sep 30 17:41:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:41:37 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e96 e96: 2 total, 2 up, 2 in
Sep 30 17:41:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:41:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:38.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:41:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:38 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8003980 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:38 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8003980 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:38 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.10 scrub starts
Sep 30 17:41:38 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.10 scrub ok
Sep 30 17:41:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:38.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:38 compute-1 ceph-mon[75484]: pgmap v42: 353 pgs: 2 remapped+peering, 2 peering, 349 active+clean; 457 KiB data, 98 MiB used, 40 GiB / 40 GiB avail; 54 B/s, 3 objects/s recovering
Sep 30 17:41:38 compute-1 ceph-mon[75484]: 11.1f deep-scrub starts
Sep 30 17:41:38 compute-1 ceph-mon[75484]: 11.1f deep-scrub ok
Sep 30 17:41:38 compute-1 ceph-mon[75484]: osdmap e96: 2 total, 2 up, 2 in
Sep 30 17:41:38 compute-1 ceph-mon[75484]: 9.3 scrub starts
Sep 30 17:41:38 compute-1 ceph-mon[75484]: 9.3 scrub ok
Sep 30 17:41:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:41:39 compute-1 sudo[89604]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:41:39 compute-1 sudo[89604]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:39 compute-1 sudo[89604]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:39 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.13 deep-scrub starts
Sep 30 17:41:39 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 8.13 deep-scrub ok
Sep 30 17:41:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e97 e97: 2 total, 2 up, 2 in
Sep 30 17:41:39 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 97 pg[10.d( v 54'774 (0'0,54'774] local-lis/les=78/79 n=6 ec=58/48 lis/c=78/78 les/c/f=79/79/0 sis=97 pruub=15.864856720s) [0] r=-1 lpr=97 pi=[78,97)/1 crt=54'774 mlcod 0'0 active pruub 212.871124268s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:39 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 97 pg[10.d( v 54'774 (0'0,54'774] local-lis/les=78/79 n=6 ec=58/48 lis/c=78/78 les/c/f=79/79/0 sis=97 pruub=15.864344597s) [0] r=-1 lpr=97 pi=[78,97)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 212.871124268s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:39 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 97 pg[10.1d( v 54'774 (0'0,54'774] local-lis/les=78/79 n=5 ec=58/48 lis/c=78/78 les/c/f=79/79/0 sis=97 pruub=15.867403030s) [0] r=-1 lpr=97 pi=[78,97)/1 crt=54'774 mlcod 0'0 active pruub 212.875000000s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:39 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 97 pg[10.1d( v 54'774 (0'0,54'774] local-lis/les=78/79 n=5 ec=58/48 lis/c=78/78 les/c/f=79/79/0 sis=97 pruub=15.867344856s) [0] r=-1 lpr=97 pi=[78,97)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 212.875000000s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:39 compute-1 ceph-mon[75484]: 11.10 scrub starts
Sep 30 17:41:39 compute-1 ceph-mon[75484]: 11.10 scrub ok
Sep 30 17:41:39 compute-1 ceph-mon[75484]: 11.f scrub starts
Sep 30 17:41:39 compute-1 ceph-mon[75484]: 11.f scrub ok
Sep 30 17:41:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Sep 30 17:41:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:41:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:40.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:41:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:40 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b80020d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:40 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003150 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:40 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.11 scrub starts
Sep 30 17:41:40 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 11.11 scrub ok
Sep 30 17:41:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:40.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:40 compute-1 ceph-mon[75484]: pgmap v44: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:41:40 compute-1 ceph-mon[75484]: 8.13 deep-scrub starts
Sep 30 17:41:40 compute-1 ceph-mon[75484]: 8.13 deep-scrub ok
Sep 30 17:41:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Sep 30 17:41:40 compute-1 ceph-mon[75484]: osdmap e97: 2 total, 2 up, 2 in
Sep 30 17:41:40 compute-1 ceph-mon[75484]: 8.8 scrub starts
Sep 30 17:41:40 compute-1 ceph-mon[75484]: 8.8 scrub ok
Sep 30 17:41:41 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e98 e98: 2 total, 2 up, 2 in
Sep 30 17:41:41 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 98 pg[10.d( v 54'774 (0'0,54'774] local-lis/les=78/79 n=6 ec=58/48 lis/c=78/78 les/c/f=79/79/0 sis=98) [0]/[1] r=0 lpr=98 pi=[78,98)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:41 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 98 pg[10.d( v 54'774 (0'0,54'774] local-lis/les=78/79 n=6 ec=58/48 lis/c=78/78 les/c/f=79/79/0 sis=98) [0]/[1] r=0 lpr=98 pi=[78,98)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:41 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 98 pg[10.1d( v 54'774 (0'0,54'774] local-lis/les=78/79 n=5 ec=58/48 lis/c=78/78 les/c/f=79/79/0 sis=98) [0]/[1] r=0 lpr=98 pi=[78,98)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:41 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 98 pg[10.1d( v 54'774 (0'0,54'774] local-lis/les=78/79 n=5 ec=58/48 lis/c=78/78 les/c/f=79/79/0 sis=98) [0]/[1] r=0 lpr=98 pi=[78,98)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:41 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.10 scrub starts
Sep 30 17:41:41 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.10 scrub ok
Sep 30 17:41:41 compute-1 ceph-mon[75484]: 11.11 scrub starts
Sep 30 17:41:41 compute-1 ceph-mon[75484]: 11.11 scrub ok
Sep 30 17:41:41 compute-1 ceph-mon[75484]: 9.a scrub starts
Sep 30 17:41:41 compute-1 ceph-mon[75484]: 9.a scrub ok
Sep 30 17:41:41 compute-1 ceph-mon[75484]: osdmap e98: 2 total, 2 up, 2 in
Sep 30 17:41:41 compute-1 ceph-mon[75484]: pgmap v47: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:41:41 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Sep 30 17:41:42 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e99 e99: 2 total, 2 up, 2 in
Sep 30 17:41:42 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 99 pg[10.1d( v 54'774 (0'0,54'774] local-lis/les=98/99 n=5 ec=58/48 lis/c=78/78 les/c/f=79/79/0 sis=98) [0]/[1] async=[0] r=0 lpr=98 pi=[78,98)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:42 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 99 pg[10.d( v 54'774 (0'0,54'774] local-lis/les=98/99 n=6 ec=58/48 lis/c=78/78 les/c/f=79/79/0 sis=98) [0]/[1] async=[0] r=0 lpr=98 pi=[78,98)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:42 compute-1 sshd-session[89632]: Accepted publickey for zuul from 192.168.122.30 port 49974 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:41:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:42 compute-1 systemd-logind[789]: New session 39 of user zuul.
Sep 30 17:41:42 compute-1 systemd[1]: Started Session 39 of User zuul.
Sep 30 17:41:42 compute-1 sshd-session[89632]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:41:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:41:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:42.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:41:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:42 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8003980 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:42 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8003980 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:42.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:42 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.13 scrub starts
Sep 30 17:41:42 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.13 scrub ok
Sep 30 17:41:42 compute-1 ceph-mon[75484]: 12.10 scrub starts
Sep 30 17:41:42 compute-1 ceph-mon[75484]: 12.10 scrub ok
Sep 30 17:41:42 compute-1 ceph-mon[75484]: 9.d scrub starts
Sep 30 17:41:42 compute-1 ceph-mon[75484]: 9.d scrub ok
Sep 30 17:41:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Sep 30 17:41:42 compute-1 ceph-mon[75484]: osdmap e99: 2 total, 2 up, 2 in
Sep 30 17:41:42 compute-1 ceph-mon[75484]: 12.13 scrub starts
Sep 30 17:41:42 compute-1 ceph-mon[75484]: 12.13 scrub ok
Sep 30 17:41:43 compute-1 python3.9[89786]: ansible-ansible.legacy.ping Invoked with data=pong
Sep 30 17:41:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e100 e100: 2 total, 2 up, 2 in
Sep 30 17:41:43 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 100 pg[10.d( v 54'774 (0'0,54'774] local-lis/les=98/99 n=6 ec=58/48 lis/c=98/78 les/c/f=99/79/0 sis=100 pruub=14.984440804s) [0] async=[0] r=-1 lpr=100 pi=[78,100)/1 crt=54'774 mlcod 54'774 active pruub 215.260513306s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:43 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 100 pg[10.d( v 54'774 (0'0,54'774] local-lis/les=98/99 n=6 ec=58/48 lis/c=98/78 les/c/f=99/79/0 sis=100 pruub=14.984351158s) [0] r=-1 lpr=100 pi=[78,100)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 215.260513306s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:43 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 100 pg[10.1d( v 54'774 (0'0,54'774] local-lis/les=98/99 n=5 ec=58/48 lis/c=98/78 les/c/f=99/79/0 sis=100 pruub=14.980460167s) [0] async=[0] r=-1 lpr=100 pi=[78,100)/1 crt=54'774 mlcod 54'774 active pruub 215.257110596s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:43 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 100 pg[10.1d( v 54'774 (0'0,54'774] local-lis/les=98/99 n=5 ec=58/48 lis/c=98/78 les/c/f=99/79/0 sis=100 pruub=14.980401039s) [0] r=-1 lpr=100 pi=[78,100)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 215.257110596s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:43 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.12 scrub starts
Sep 30 17:41:43 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.12 scrub ok
Sep 30 17:41:43 compute-1 ceph-mon[75484]: 11.5 scrub starts
Sep 30 17:41:43 compute-1 ceph-mon[75484]: 11.5 scrub ok
Sep 30 17:41:43 compute-1 ceph-mon[75484]: osdmap e100: 2 total, 2 up, 2 in
Sep 30 17:41:43 compute-1 ceph-mon[75484]: pgmap v50: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:41:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Sep 30 17:41:43 compute-1 ceph-mon[75484]: 12.12 scrub starts
Sep 30 17:41:43 compute-1 ceph-mon[75484]: 12.12 scrub ok
Sep 30 17:41:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e101 e101: 2 total, 2 up, 2 in
Sep 30 17:41:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:44 compute-1 python3.9[89961]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:41:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:41:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:41:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:44.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:41:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:44 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b80020d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:44 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003e60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:44.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:44 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.6 scrub starts
Sep 30 17:41:44 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.6 scrub ok
Sep 30 17:41:44 compute-1 ceph-mon[75484]: 11.4 deep-scrub starts
Sep 30 17:41:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Sep 30 17:41:44 compute-1 ceph-mon[75484]: osdmap e101: 2 total, 2 up, 2 in
Sep 30 17:41:44 compute-1 ceph-mon[75484]: 12.6 scrub starts
Sep 30 17:41:44 compute-1 ceph-mon[75484]: 12.6 scrub ok
Sep 30 17:41:45 compute-1 sudo[90116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebpmixevauiylwkvqacpaodaxaxbqudx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254104.734991-70-239390323955851/AnsiballZ_command.py'
Sep 30 17:41:45 compute-1 sudo[90116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:41:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:45 compute-1 python3.9[90118]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:41:45 compute-1 sudo[90116]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:45 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.e scrub starts
Sep 30 17:41:45 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.e scrub ok
Sep 30 17:41:45 compute-1 ceph-mon[75484]: 11.4 deep-scrub ok
Sep 30 17:41:45 compute-1 ceph-mon[75484]: 11.7 scrub starts
Sep 30 17:41:45 compute-1 ceph-mon[75484]: 11.7 scrub ok
Sep 30 17:41:45 compute-1 ceph-mon[75484]: pgmap v52: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail; 79 B/s, 3 objects/s recovering
Sep 30 17:41:45 compute-1 ceph-mon[75484]: 12.e scrub starts
Sep 30 17:41:45 compute-1 ceph-mon[75484]: 12.e scrub ok
Sep 30 17:41:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:46 compute-1 sudo[90270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyuufektnievnxascufoeaaxoyiunqxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254105.7643313-94-102345163748478/AnsiballZ_stat.py'
Sep 30 17:41:46 compute-1 sudo[90270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:41:46 compute-1 python3.9[90272]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:41:46 compute-1 sudo[90270]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:46.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:46 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8003980 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:46 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8003980 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:41:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:46.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:41:46 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.c scrub starts
Sep 30 17:41:46 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.c scrub ok
Sep 30 17:41:46 compute-1 ceph-mon[75484]: 8.5 scrub starts
Sep 30 17:41:46 compute-1 ceph-mon[75484]: 8.5 scrub ok
Sep 30 17:41:46 compute-1 ceph-mon[75484]: 12.c scrub starts
Sep 30 17:41:46 compute-1 ceph-mon[75484]: 12.c scrub ok
Sep 30 17:41:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:47 compute-1 sudo[90425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhlgvjkinzewhadjkbucobqpjkawetbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254106.7355614-116-5073676904956/AnsiballZ_file.py'
Sep 30 17:41:47 compute-1 sudo[90425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:41:47 compute-1 python3.9[90427]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:41:47 compute-1 sudo[90425]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:47 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.a scrub starts
Sep 30 17:41:47 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.a scrub ok
Sep 30 17:41:47 compute-1 ceph-mon[75484]: 9.7 scrub starts
Sep 30 17:41:47 compute-1 ceph-mon[75484]: 9.7 scrub ok
Sep 30 17:41:47 compute-1 ceph-mon[75484]: pgmap v53: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail; 54 B/s, 2 objects/s recovering
Sep 30 17:41:47 compute-1 ceph-mon[75484]: 12.a scrub starts
Sep 30 17:41:47 compute-1 ceph-mon[75484]: 12.a scrub ok
Sep 30 17:41:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:48 compute-1 python3.9[90578]: ansible-ansible.builtin.service_facts Invoked
Sep 30 17:41:48 compute-1 network[90595]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 17:41:48 compute-1 network[90596]: 'network-scripts' will be removed from distribution in near future.
Sep 30 17:41:48 compute-1 network[90597]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 17:41:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:48.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:48 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b80020d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:48 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003e60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:48.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:48 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.7 scrub starts
Sep 30 17:41:48 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.7 scrub ok
Sep 30 17:41:48 compute-1 ceph-mon[75484]: 11.1 scrub starts
Sep 30 17:41:48 compute-1 ceph-mon[75484]: 11.1 scrub ok
Sep 30 17:41:48 compute-1 ceph-mon[75484]: 12.7 scrub starts
Sep 30 17:41:48 compute-1 ceph-mon[75484]: 12.7 scrub ok
Sep 30 17:41:48 compute-1 ceph-mon[75484]: 11.1b scrub starts
Sep 30 17:41:48 compute-1 ceph-mon[75484]: 11.1b scrub ok
Sep 30 17:41:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:41:49 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.3 scrub starts
Sep 30 17:41:49 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.3 scrub ok
Sep 30 17:41:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e102 e102: 2 total, 2 up, 2 in
Sep 30 17:41:50 compute-1 ceph-mon[75484]: pgmap v54: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail; 46 B/s, 1 objects/s recovering
Sep 30 17:41:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Sep 30 17:41:50 compute-1 ceph-mon[75484]: 12.3 scrub starts
Sep 30 17:41:50 compute-1 ceph-mon[75484]: 12.3 scrub ok
Sep 30 17:41:50 compute-1 ceph-mon[75484]: 8.19 scrub starts
Sep 30 17:41:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:50.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:50 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8003980 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:50 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:41:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:50.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:41:50 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.8 scrub starts
Sep 30 17:41:50 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.8 scrub ok
Sep 30 17:41:51 compute-1 ceph-mon[75484]: 8.19 scrub ok
Sep 30 17:41:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Sep 30 17:41:51 compute-1 ceph-mon[75484]: osdmap e102: 2 total, 2 up, 2 in
Sep 30 17:41:51 compute-1 ceph-mon[75484]: 12.8 scrub starts
Sep 30 17:41:51 compute-1 ceph-mon[75484]: 12.8 scrub ok
Sep 30 17:41:51 compute-1 ceph-mon[75484]: 9.18 scrub starts
Sep 30 17:41:51 compute-1 ceph-mon[75484]: 9.18 scrub ok
Sep 30 17:41:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:51 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.b scrub starts
Sep 30 17:41:51 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.b scrub ok
Sep 30 17:41:52 compute-1 ceph-mon[75484]: pgmap v56: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail; 41 B/s, 1 objects/s recovering
Sep 30 17:41:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Sep 30 17:41:52 compute-1 ceph-mon[75484]: 12.b scrub starts
Sep 30 17:41:52 compute-1 ceph-mon[75484]: 12.b scrub ok
Sep 30 17:41:52 compute-1 ceph-mon[75484]: 11.1a scrub starts
Sep 30 17:41:52 compute-1 ceph-mon[75484]: 11.1a scrub ok
Sep 30 17:41:52 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e103 e103: 2 total, 2 up, 2 in
Sep 30 17:41:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:52.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:52 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b80032a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:52 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b80032a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:41:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:52.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:41:52 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.18 scrub starts
Sep 30 17:41:52 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.18 scrub ok
Sep 30 17:41:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Sep 30 17:41:53 compute-1 ceph-mon[75484]: osdmap e103: 2 total, 2 up, 2 in
Sep 30 17:41:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:41:53 compute-1 ceph-mon[75484]: 12.18 scrub starts
Sep 30 17:41:53 compute-1 ceph-mon[75484]: 11.1d scrub starts
Sep 30 17:41:53 compute-1 ceph-mon[75484]: 11.1d scrub ok
Sep 30 17:41:53 compute-1 python3.9[90865]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:41:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:53 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.1c scrub starts
Sep 30 17:41:53 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.1c scrub ok
Sep 30 17:41:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e104 e104: 2 total, 2 up, 2 in
Sep 30 17:41:54 compute-1 ceph-mon[75484]: 12.18 scrub ok
Sep 30 17:41:54 compute-1 ceph-mon[75484]: pgmap v58: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:41:54 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Sep 30 17:41:54 compute-1 ceph-mon[75484]: 12.1c scrub starts
Sep 30 17:41:54 compute-1 ceph-mon[75484]: 12.1c scrub ok
Sep 30 17:41:54 compute-1 ceph-mon[75484]: 11.1c scrub starts
Sep 30 17:41:54 compute-1 ceph-mon[75484]: 11.1c scrub ok
Sep 30 17:41:54 compute-1 python3.9[91015]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:41:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:41:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:41:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:54.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:41:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:54 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8003980 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:54 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:54.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:54 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.19 scrub starts
Sep 30 17:41:54 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.19 scrub ok
Sep 30 17:41:55 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Sep 30 17:41:55 compute-1 ceph-mon[75484]: osdmap e104: 2 total, 2 up, 2 in
Sep 30 17:41:55 compute-1 ceph-mon[75484]: 12.19 scrub starts
Sep 30 17:41:55 compute-1 ceph-mon[75484]: 12.19 scrub ok
Sep 30 17:41:55 compute-1 ceph-mon[75484]: 8.1c scrub starts
Sep 30 17:41:55 compute-1 ceph-mon[75484]: 8.1c scrub ok
Sep 30 17:41:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 104 pg[10.12( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=104 pruub=9.760772705s) [0] r=-1 lpr=104 pi=[68,104)/1 crt=54'774 mlcod 0'0 active pruub 222.396469116s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:55 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 104 pg[10.12( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=104 pruub=9.760516167s) [0] r=-1 lpr=104 pi=[68,104)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 222.396469116s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:55 compute-1 python3.9[91171]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:41:55 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.1d scrub starts
Sep 30 17:41:55 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.1d scrub ok
Sep 30 17:41:56 compute-1 ceph-mon[75484]: pgmap v60: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:41:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Sep 30 17:41:56 compute-1 ceph-mon[75484]: 12.1d scrub starts
Sep 30 17:41:56 compute-1 ceph-mon[75484]: 11.1e scrub starts
Sep 30 17:41:56 compute-1 ceph-mon[75484]: 12.1d scrub ok
Sep 30 17:41:56 compute-1 ceph-mon[75484]: 11.1e scrub ok
Sep 30 17:41:56 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e105 e105: 2 total, 2 up, 2 in
Sep 30 17:41:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 105 pg[10.12( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=105) [0]/[1] r=0 lpr=105 pi=[68,105)/1 crt=54'774 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:56 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 105 pg[10.12( v 54'774 (0'0,54'774] local-lis/les=68/69 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=105) [0]/[1] r=0 lpr=105 pi=[68,105)/1 crt=54'774 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Sep 30 17:41:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:56.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:56 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b80032a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:56 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003e60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:56 compute-1 sudo[91331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybauikucblzwpmrykttdjinsnsxonngn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254116.2657166-212-69863402284781/AnsiballZ_setup.py'
Sep 30 17:41:56 compute-1 sudo[91331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:41:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:56.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:56 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.9 deep-scrub starts
Sep 30 17:41:56 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 12.9 deep-scrub ok
Sep 30 17:41:56 compute-1 python3.9[91333]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:41:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Sep 30 17:41:57 compute-1 ceph-mon[75484]: osdmap e105: 2 total, 2 up, 2 in
Sep 30 17:41:57 compute-1 ceph-mon[75484]: 9.13 deep-scrub starts
Sep 30 17:41:57 compute-1 ceph-mon[75484]: 9.13 deep-scrub ok
Sep 30 17:41:57 compute-1 ceph-mon[75484]: 12.9 deep-scrub starts
Sep 30 17:41:57 compute-1 ceph-mon[75484]: 12.9 deep-scrub ok
Sep 30 17:41:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e106 e106: 2 total, 2 up, 2 in
Sep 30 17:41:57 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 106 pg[10.12( v 54'774 (0'0,54'774] local-lis/les=105/106 n=6 ec=58/48 lis/c=68/68 les/c/f=69/69/0 sis=105) [0]/[1] async=[0] r=0 lpr=105 pi=[68,105)/1 crt=54'774 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:41:57 compute-1 sudo[91331]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:57 compute-1 sudo[91415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvdmwnkvzeaexixwlswioizzlotgmnff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254116.2657166-212-69863402284781/AnsiballZ_dnf.py'
Sep 30 17:41:57 compute-1 sudo[91415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:41:57 compute-1 python3.9[91417]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:41:57 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Sep 30 17:41:57 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Sep 30 17:41:58 compute-1 ceph-mon[75484]: osdmap e106: 2 total, 2 up, 2 in
Sep 30 17:41:58 compute-1 ceph-mon[75484]: pgmap v63: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:41:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Sep 30 17:41:58 compute-1 ceph-mon[75484]: 8.12 scrub starts
Sep 30 17:41:58 compute-1 ceph-mon[75484]: 8.12 scrub ok
Sep 30 17:41:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e107 e107: 2 total, 2 up, 2 in
Sep 30 17:41:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 107 pg[10.12( v 54'774 (0'0,54'774] local-lis/les=105/106 n=6 ec=58/48 lis/c=105/68 les/c/f=106/69/0 sis=107 pruub=14.990402222s) [0] async=[0] r=-1 lpr=107 pi=[68,107)/1 crt=54'774 mlcod 54'774 active pruub 230.354705811s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:41:58 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 107 pg[10.12( v 54'774 (0'0,54'774] local-lis/les=105/106 n=6 ec=58/48 lis/c=105/68 les/c/f=106/69/0 sis=107 pruub=14.990035057s) [0] r=-1 lpr=107 pi=[68,107)/1 crt=54'774 mlcod 0'0 unknown NOTIFY pruub 230.354705811s@ mbc={}] state<Start>: transitioning to Stray
Sep 30 17:41:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:41:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:41:58.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:41:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:58 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69d0001760 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:41:58 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:41:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:41:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:41:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:41:58.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:41:58 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 10.5 deep-scrub starts
Sep 30 17:41:58 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 10.5 deep-scrub ok
Sep 30 17:41:59 compute-1 ceph-mon[75484]: 10.3 scrub starts
Sep 30 17:41:59 compute-1 ceph-mon[75484]: 10.3 scrub ok
Sep 30 17:41:59 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Sep 30 17:41:59 compute-1 ceph-mon[75484]: osdmap e107: 2 total, 2 up, 2 in
Sep 30 17:41:59 compute-1 ceph-mon[75484]: 9.11 scrub starts
Sep 30 17:41:59 compute-1 ceph-mon[75484]: 9.11 scrub ok
Sep 30 17:41:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e108 e108: 2 total, 2 up, 2 in
Sep 30 17:41:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:41:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:41:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:41:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:41:59 compute-1 sudo[91472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:41:59 compute-1 sudo[91472]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:41:59 compute-1 sudo[91472]: pam_unix(sudo:session): session closed for user root
Sep 30 17:41:59 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 10.4 scrub starts
Sep 30 17:41:59 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 10.4 scrub ok
Sep 30 17:42:00 compute-1 ceph-mon[75484]: 10.5 deep-scrub starts
Sep 30 17:42:00 compute-1 ceph-mon[75484]: 10.5 deep-scrub ok
Sep 30 17:42:00 compute-1 ceph-mon[75484]: pgmap v65: 353 pgs: 1 peering, 352 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:42:00 compute-1 ceph-mon[75484]: osdmap e108: 2 total, 2 up, 2 in
Sep 30 17:42:00 compute-1 ceph-mon[75484]: 9.12 scrub starts
Sep 30 17:42:00 compute-1 ceph-mon[75484]: 9.12 scrub ok
Sep 30 17:42:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:00 compute-1 systemd[83023]: Starting Mark boot as successful...
Sep 30 17:42:00 compute-1 systemd[83023]: Finished Mark boot as successful.
Sep 30 17:42:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:42:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:00.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:42:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:00 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:00 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003e60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:00.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:00 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 10.18 scrub starts
Sep 30 17:42:00 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 10.18 scrub ok
Sep 30 17:42:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:01 compute-1 ceph-mon[75484]: 10.4 scrub starts
Sep 30 17:42:01 compute-1 ceph-mon[75484]: 10.4 scrub ok
Sep 30 17:42:01 compute-1 ceph-mon[75484]: 8.4 scrub starts
Sep 30 17:42:01 compute-1 ceph-mon[75484]: 8.4 scrub ok
Sep 30 17:42:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:02 compute-1 ceph-mon[75484]: 10.18 scrub starts
Sep 30 17:42:02 compute-1 ceph-mon[75484]: 10.18 scrub ok
Sep 30 17:42:02 compute-1 ceph-mon[75484]: pgmap v67: 353 pgs: 1 peering, 352 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:42:02 compute-1 ceph-mon[75484]: 11.14 scrub starts
Sep 30 17:42:02 compute-1 ceph-mon[75484]: 11.14 scrub ok
Sep 30 17:42:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:02.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:02 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69d0001760 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:02 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:02.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:03 compute-1 ceph-mon[75484]: 11.3 scrub starts
Sep 30 17:42:03 compute-1 ceph-mon[75484]: 11.3 scrub ok
Sep 30 17:42:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:42:04 compute-1 ceph-mon[75484]: pgmap v68: 353 pgs: 1 peering, 352 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:42:04 compute-1 ceph-mon[75484]: 8.d scrub starts
Sep 30 17:42:04 compute-1 ceph-mon[75484]: 8.d scrub ok
Sep 30 17:42:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:42:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:04.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:42:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:04 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:04 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003e60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:04.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:05 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e109 e109: 2 total, 2 up, 2 in
Sep 30 17:42:05 compute-1 ceph-mon[75484]: 11.e scrub starts
Sep 30 17:42:05 compute-1 ceph-mon[75484]: 11.e scrub ok
Sep 30 17:42:05 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Sep 30 17:42:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:06 compute-1 ceph-mon[75484]: pgmap v69: 353 pgs: 353 active+clean; 458 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:42:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Sep 30 17:42:06 compute-1 ceph-mon[75484]: osdmap e109: 2 total, 2 up, 2 in
Sep 30 17:42:06 compute-1 ceph-mon[75484]: 9.f scrub starts
Sep 30 17:42:06 compute-1 ceph-mon[75484]: 9.f scrub ok
Sep 30 17:42:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:06.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:06 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003e60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:06 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:06.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e110 e110: 2 total, 2 up, 2 in
Sep 30 17:42:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e110 crush map has features 3314933000854323200, adjusting msgr requires
Sep 30 17:42:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e110 crush map has features 432629239337189376, adjusting msgr requires
Sep 30 17:42:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e110 crush map has features 432629239337189376, adjusting msgr requires
Sep 30 17:42:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e110 crush map has features 432629239337189376, adjusting msgr requires
Sep 30 17:42:07 compute-1 ceph-mon[75484]: 10.c scrub starts
Sep 30 17:42:07 compute-1 ceph-mon[75484]: 10.c scrub ok
Sep 30 17:42:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "10.a", "id": [0, 1]}]: dispatch
Sep 30 17:42:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "10.e", "id": [0, 1]}]: dispatch
Sep 30 17:42:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Sep 30 17:42:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:42:07 compute-1 ceph-osd[78006]: osd.1 110 crush map has features 432629239337189376, adjusting msgr requires for clients
Sep 30 17:42:07 compute-1 ceph-osd[78006]: osd.1 110 crush map has features 432629239337189376 was 288514051259245057, adjusting msgr requires for mons
Sep 30 17:42:07 compute-1 ceph-osd[78006]: osd.1 110 crush map has features 3314933000854323200, adjusting msgr requires for osds
Sep 30 17:42:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 110 pg[10.e( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=80/80 les/c/f=81/81/0 sis=110) [1] r=0 lpr=110 pi=[80,110)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:42:07 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 110 pg[10.a( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=91/91 les/c/f=92/92/0 sis=110) [1] r=0 lpr=110 pi=[91,110)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:42:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e111 e111: 2 total, 2 up, 2 in
Sep 30 17:42:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 111 pg[10.a( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=91/91 les/c/f=92/92/0 sis=111) [1]/[0] r=-1 lpr=111 pi=[91,111)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:42:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 111 pg[10.a( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=91/91 les/c/f=92/92/0 sis=111) [1]/[0] r=-1 lpr=111 pi=[91,111)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:42:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 111 pg[10.e( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=80/80 les/c/f=81/81/0 sis=111) [1]/[0] r=-1 lpr=111 pi=[80,111)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:42:08 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 111 pg[10.e( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=80/80 les/c/f=81/81/0 sis=111) [1]/[0] r=-1 lpr=111 pi=[80,111)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:42:08 compute-1 ceph-mon[75484]: pgmap v71: 353 pgs: 353 active+clean; 458 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:42:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "10.a", "id": [0, 1]}]': finished
Sep 30 17:42:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "10.e", "id": [0, 1]}]': finished
Sep 30 17:42:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Sep 30 17:42:08 compute-1 ceph-mon[75484]: osdmap e110: 2 total, 2 up, 2 in
Sep 30 17:42:08 compute-1 ceph-mon[75484]: 10.d scrub starts
Sep 30 17:42:08 compute-1 ceph-mon[75484]: 10.d scrub ok
Sep 30 17:42:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:42:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:08.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:42:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:08 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003e60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:42:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:08.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:42:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:42:09 compute-1 ceph-mon[75484]: osdmap e111: 2 total, 2 up, 2 in
Sep 30 17:42:09 compute-1 ceph-mon[75484]: 10.b scrub starts
Sep 30 17:42:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e112 e112: 2 total, 2 up, 2 in
Sep 30 17:42:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e113 e113: 2 total, 2 up, 2 in
Sep 30 17:42:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 113 pg[10.a( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=111/91 les/c/f=112/92/0 sis=113) [1] r=0 lpr=113 pi=[91,113)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:42:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 113 pg[10.a( v 54'774 (0'0,54'774] local-lis/les=0/0 n=6 ec=58/48 lis/c=111/91 les/c/f=112/92/0 sis=113) [1] r=0 lpr=113 pi=[91,113)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:42:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 113 pg[10.e( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=111/80 les/c/f=112/81/0 sis=113) [1] r=0 lpr=113 pi=[80,113)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:42:09 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 113 pg[10.e( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=111/80 les/c/f=112/81/0 sis=113) [1] r=0 lpr=113 pi=[80,113)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:42:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:10 compute-1 ceph-mon[75484]: 10.b scrub ok
Sep 30 17:42:10 compute-1 ceph-mon[75484]: pgmap v74: 353 pgs: 2 unknown, 351 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:42:10 compute-1 ceph-mon[75484]: osdmap e112: 2 total, 2 up, 2 in
Sep 30 17:42:10 compute-1 ceph-mon[75484]: osdmap e113: 2 total, 2 up, 2 in
Sep 30 17:42:10 compute-1 ceph-mon[75484]: 10.19 deep-scrub starts
Sep 30 17:42:10 compute-1 ceph-mon[75484]: 10.19 deep-scrub ok
Sep 30 17:42:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:10.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:10 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003e60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:10 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e114 e114: 2 total, 2 up, 2 in
Sep 30 17:42:10 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 114 pg[10.a( v 54'774 (0'0,54'774] local-lis/les=113/114 n=6 ec=58/48 lis/c=111/91 les/c/f=112/92/0 sis=113) [1] r=0 lpr=113 pi=[91,113)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:42:10 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 114 pg[10.e( v 54'774 (0'0,54'774] local-lis/les=113/114 n=5 ec=58/48 lis/c=111/80 les/c/f=112/81/0 sis=113) [1] r=0 lpr=113 pi=[80,113)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:42:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:10.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:11 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 10.a scrub starts
Sep 30 17:42:11 compute-1 ceph-osd[78006]: log_channel(cluster) log [DBG] : 10.a scrub ok
Sep 30 17:42:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:11 compute-1 ceph-mon[75484]: osdmap e114: 2 total, 2 up, 2 in
Sep 30 17:42:11 compute-1 ceph-mon[75484]: 10.1a scrub starts
Sep 30 17:42:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:42:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:12.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:42:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:12 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:12 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003e60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:12 compute-1 ceph-mon[75484]: 10.1a scrub ok
Sep 30 17:42:12 compute-1 ceph-mon[75484]: 10.a scrub starts
Sep 30 17:42:12 compute-1 ceph-mon[75484]: 10.a scrub ok
Sep 30 17:42:12 compute-1 ceph-mon[75484]: pgmap v78: 353 pgs: 2 unknown, 351 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:42:12 compute-1 ceph-mon[75484]: 10.1b deep-scrub starts
Sep 30 17:42:12 compute-1 ceph-mon[75484]: 10.1b deep-scrub ok
Sep 30 17:42:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:42:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:12.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:42:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:13 compute-1 ceph-mon[75484]: 10.1c scrub starts
Sep 30 17:42:13 compute-1 ceph-mon[75484]: 10.1c scrub ok
Sep 30 17:42:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:42:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:14.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:14 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003e60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:14 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:14 compute-1 ceph-mon[75484]: pgmap v79: 353 pgs: 2 unknown, 351 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:42:14 compute-1 ceph-mon[75484]: 10.1d scrub starts
Sep 30 17:42:14 compute-1 ceph-mon[75484]: 10.1d scrub ok
Sep 30 17:42:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:42:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:14.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:42:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:15 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e115 e115: 2 total, 2 up, 2 in
Sep 30 17:42:15 compute-1 ceph-mon[75484]: 10.1e scrub starts
Sep 30 17:42:15 compute-1 ceph-mon[75484]: 10.1e scrub ok
Sep 30 17:42:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Sep 30 17:42:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:42:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:16.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:42:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:16 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc003540 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:16 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc003540 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:16 compute-1 ceph-mon[75484]: pgmap v80: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:42:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Sep 30 17:42:16 compute-1 ceph-mon[75484]: osdmap e115: 2 total, 2 up, 2 in
Sep 30 17:42:16 compute-1 ceph-mon[75484]: 10.1f deep-scrub starts
Sep 30 17:42:16 compute-1 ceph-mon[75484]: 10.1f deep-scrub ok
Sep 30 17:42:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:42:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:16.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:42:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:17 compute-1 sshd-session[91578]: Invalid user mas from 84.51.43.58 port 40163
Sep 30 17:42:17 compute-1 sshd-session[91578]: Received disconnect from 84.51.43.58 port 40163:11: Bye Bye [preauth]
Sep 30 17:42:17 compute-1 sshd-session[91578]: Disconnected from invalid user mas 84.51.43.58 port 40163 [preauth]
Sep 30 17:42:17 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e116 e116: 2 total, 2 up, 2 in
Sep 30 17:42:17 compute-1 ceph-mon[75484]: 10.10 scrub starts
Sep 30 17:42:17 compute-1 ceph-mon[75484]: 10.10 scrub ok
Sep 30 17:42:17 compute-1 ceph-mon[75484]: pgmap v82: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:42:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Sep 30 17:42:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:18.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:18 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc003540 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:18 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:42:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:18.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:42:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Sep 30 17:42:18 compute-1 ceph-mon[75484]: osdmap e116: 2 total, 2 up, 2 in
Sep 30 17:42:18 compute-1 ceph-mon[75484]: 10.12 deep-scrub starts
Sep 30 17:42:18 compute-1 ceph-mon[75484]: 10.12 deep-scrub ok
Sep 30 17:42:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:42:19 compute-1 sudo[91582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:42:19 compute-1 sudo[91582]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:42:19 compute-1 sudo[91582]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e117 e117: 2 total, 2 up, 2 in
Sep 30 17:42:19 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 117 pg[10.19( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=117) [1] r=0 lpr=117 pi=[58,117)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:42:19 compute-1 ceph-mon[75484]: 10.11 scrub starts
Sep 30 17:42:19 compute-1 ceph-mon[75484]: 10.11 scrub ok
Sep 30 17:42:19 compute-1 ceph-mon[75484]: pgmap v84: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:42:19 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Sep 30 17:42:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000025s ======
Sep 30 17:42:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:20.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Sep 30 17:42:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:20 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc003540 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:20 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69dc003540 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:20.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:20 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Sep 30 17:42:20 compute-1 ceph-mon[75484]: osdmap e117: 2 total, 2 up, 2 in
Sep 30 17:42:20 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e118 e118: 2 total, 2 up, 2 in
Sep 30 17:42:20 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 118 pg[10.19( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=118) [1]/[0] r=-1 lpr=118 pi=[58,118)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:42:20 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 118 pg[10.19( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=118) [1]/[0] r=-1 lpr=118 pi=[58,118)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:42:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:21 compute-1 ceph-mon[75484]: osdmap e118: 2 total, 2 up, 2 in
Sep 30 17:42:21 compute-1 ceph-mon[75484]: pgmap v87: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:42:21 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Sep 30 17:42:21 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e119 e119: 2 total, 2 up, 2 in
Sep 30 17:42:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:42:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:22.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:42:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:22 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003e60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:22 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:22.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Sep 30 17:42:22 compute-1 ceph-mon[75484]: osdmap e119: 2 total, 2 up, 2 in
Sep 30 17:42:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:42:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:23 compute-1 sshd-session[91633]: Invalid user admin from 167.71.248.239 port 47364
Sep 30 17:42:23 compute-1 sshd-session[91633]: Connection closed by invalid user admin 167.71.248.239 port 47364 [preauth]
Sep 30 17:42:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174223 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:42:23 compute-1 ceph-mon[75484]: pgmap v89: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:42:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Sep 30 17:42:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e120 e120: 2 total, 2 up, 2 in
Sep 30 17:42:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 120 pg[10.19( v 54'774 (0'0,54'774] local-lis/les=0/0 n=7 ec=58/48 lis/c=118/58 les/c/f=119/59/0 sis=120) [1] r=0 lpr=120 pi=[58,120)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:42:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 120 pg[10.19( v 54'774 (0'0,54'774] local-lis/les=0/0 n=7 ec=58/48 lis/c=118/58 les/c/f=119/59/0 sis=120) [1] r=0 lpr=120 pi=[58,120)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:42:23 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 120 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=93/93 les/c/f=94/94/0 sis=120) [1] r=0 lpr=120 pi=[93,120)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:42:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:42:24 compute-1 sshd-session[91635]: Invalid user superadmin from 167.172.43.167 port 49726
Sep 30 17:42:24 compute-1 sshd-session[91635]: Received disconnect from 167.172.43.167 port 49726:11: Bye Bye [preauth]
Sep 30 17:42:24 compute-1 sshd-session[91635]: Disconnected from invalid user superadmin 167.172.43.167 port 49726 [preauth]
Sep 30 17:42:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:24.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:24 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69d0001760 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:24 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69d0002670 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e121 e121: 2 total, 2 up, 2 in
Sep 30 17:42:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 121 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=93/93 les/c/f=94/94/0 sis=121) [1]/[0] r=-1 lpr=121 pi=[93,121)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:42:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 121 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=93/93 les/c/f=94/94/0 sis=121) [1]/[0] r=-1 lpr=121 pi=[93,121)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:42:24 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 121 pg[10.19( v 54'774 (0'0,54'774] local-lis/les=120/121 n=7 ec=58/48 lis/c=118/58 les/c/f=119/59/0 sis=120) [1] r=0 lpr=120 pi=[58,120)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:42:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000025s ======
Sep 30 17:42:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:24.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Sep 30 17:42:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Sep 30 17:42:24 compute-1 ceph-mon[75484]: osdmap e120: 2 total, 2 up, 2 in
Sep 30 17:42:24 compute-1 ceph-mon[75484]: osdmap e121: 2 total, 2 up, 2 in
Sep 30 17:42:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:25 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e122 e122: 2 total, 2 up, 2 in
Sep 30 17:42:25 compute-1 ceph-mon[75484]: pgmap v92: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail; 25 B/s, 1 objects/s recovering
Sep 30 17:42:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Sep 30 17:42:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Sep 30 17:42:25 compute-1 ceph-mon[75484]: osdmap e122: 2 total, 2 up, 2 in
Sep 30 17:42:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:26.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:26 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69bc003e60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:26 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003c70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:26 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e123 e123: 2 total, 2 up, 2 in
Sep 30 17:42:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 123 pg[10.1b( v 54'774 (0'0,54'774] local-lis/les=0/0 n=2 ec=58/48 lis/c=121/93 les/c/f=122/94/0 sis=123) [1] r=0 lpr=123 pi=[93,123)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:42:26 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 123 pg[10.1b( v 54'774 (0'0,54'774] local-lis/les=0/0 n=2 ec=58/48 lis/c=121/93 les/c/f=122/94/0 sis=123) [1] r=0 lpr=123 pi=[93,123)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:42:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000025s ======
Sep 30 17:42:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:26.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Sep 30 17:42:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:27 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e124 e124: 2 total, 2 up, 2 in
Sep 30 17:42:27 compute-1 ceph-mon[75484]: osdmap e123: 2 total, 2 up, 2 in
Sep 30 17:42:27 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Sep 30 17:42:27 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 124 pg[10.1b( v 54'774 (0'0,54'774] local-lis/les=123/124 n=2 ec=58/48 lis/c=121/93 les/c/f=122/94/0 sis=123) [1] r=0 lpr=123 pi=[93,123)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:42:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000025s ======
Sep 30 17:42:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:28.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Sep 30 17:42:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:28 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b8001b40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:28 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69d0002670 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:28 compute-1 ceph-mon[75484]: pgmap v95: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail; 27 B/s, 1 objects/s recovering
Sep 30 17:42:28 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Sep 30 17:42:28 compute-1 ceph-mon[75484]: osdmap e124: 2 total, 2 up, 2 in
Sep 30 17:42:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:28.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Sep 30 17:42:29 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Sep 30 17:42:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e125 e125: 2 total, 2 up, 2 in
Sep 30 17:42:29 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 125 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=80/80 les/c/f=81/81/0 sis=125) [1] r=0 lpr=125 pi=[80,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:42:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:30.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:30 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69c8000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[86009]: 30/09/2025 17:42:30 : epoch 68dc15f8 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f69b0003d20 fd 47 proxy ignored for local
Sep 30 17:42:30 compute-1 kernel: ganesha.nfsd[87136]: segfault at 50 ip 00007f6a90c1a32e sp 00007f6a4cff8210 error 4 in libntirpc.so.5.8[7f6a90bff000+2c000] likely on CPU 3 (core 0, socket 3)
Sep 30 17:42:30 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 17:42:30 compute-1 systemd[1]: Created slice Slice /system/systemd-coredump.
Sep 30 17:42:30 compute-1 systemd[1]: Started Process Core Dump (PID 91652/UID 0).
Sep 30 17:42:30 compute-1 ceph-mon[75484]: pgmap v97: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail; 227 B/s rd, 0 op/s; 24 B/s, 0 objects/s recovering
Sep 30 17:42:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Sep 30 17:42:30 compute-1 ceph-mon[75484]: osdmap e125: 2 total, 2 up, 2 in
Sep 30 17:42:30 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e126 e126: 2 total, 2 up, 2 in
Sep 30 17:42:30 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 126 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=80/80 les/c/f=81/81/0 sis=126) [1]/[0] r=-1 lpr=126 pi=[80,126)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:42:30 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 126 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=80/80 les/c/f=81/81/0 sis=126) [1]/[0] r=-1 lpr=126 pi=[80,126)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Sep 30 17:42:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000025s ======
Sep 30 17:42:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:30.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Sep 30 17:42:31 compute-1 sudo[91654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:42:31 compute-1 sudo[91654]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:42:31 compute-1 sudo[91654]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:31 compute-1 sudo[91679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:42:31 compute-1 sudo[91679]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:42:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:31 compute-1 sudo[91679]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:31 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e127 e127: 2 total, 2 up, 2 in
Sep 30 17:42:31 compute-1 ceph-mon[75484]: osdmap e126: 2 total, 2 up, 2 in
Sep 30 17:42:31 compute-1 systemd-coredump[91653]: Process 86013 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 56:
                                                   #0  0x00007f6a90c1a32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   ELF object binary architecture: AMD x86-64
Sep 30 17:42:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Sep 30 17:42:31 compute-1 systemd[1]: systemd-coredump@0-91652-0.service: Deactivated successfully.
Sep 30 17:42:31 compute-1 systemd[1]: systemd-coredump@0-91652-0.service: Consumed 1.217s CPU time.
Sep 30 17:42:32 compute-1 podman[91741]: 2025-09-30 17:42:32.001466221 +0000 UTC m=+0.046184216 container died a2c50f357f6fb33c1e67fb0f3db8a2d1059e8e7cde2694b4c0415d513f83b70b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:42:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-c2cda0b17b54b5bce24502617a6d6723de9dd36b9e2d0a1d7df85cd62cc7b0d9-merged.mount: Deactivated successfully.
Sep 30 17:42:32 compute-1 podman[91741]: 2025-09-30 17:42:32.05029566 +0000 UTC m=+0.095013645 container remove a2c50f357f6fb33c1e67fb0f3db8a2d1059e8e7cde2694b4c0415d513f83b70b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325)
Sep 30 17:42:32 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 17:42:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:32 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 17:42:32 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 2.068s CPU time.
Sep 30 17:42:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:32.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:32 compute-1 ceph-mon[75484]: pgmap v100: 353 pgs: 353 active+clean; 457 KiB data, 99 MiB used, 40 GiB / 40 GiB avail; 227 B/s rd, 0 op/s; 24 B/s, 0 objects/s recovering
Sep 30 17:42:32 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Sep 30 17:42:32 compute-1 ceph-mon[75484]: osdmap e127: 2 total, 2 up, 2 in
Sep 30 17:42:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000025s ======
Sep 30 17:42:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:32.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Sep 30 17:42:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e128 e128: 2 total, 2 up, 2 in
Sep 30 17:42:32 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 128 pg[10.1e( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=126/80 les/c/f=127/81/0 sis=128) [1] r=0 lpr=128 pi=[80,128)/1 luod=0'0 crt=54'774 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Sep 30 17:42:32 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 128 pg[10.1e( v 54'774 (0'0,54'774] local-lis/les=0/0 n=5 ec=58/48 lis/c=126/80 les/c/f=127/81/0 sis=128) [1] r=0 lpr=128 pi=[80,128)/1 crt=54'774 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Sep 30 17:42:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:33 compute-1 ceph-mon[75484]: osdmap e128: 2 total, 2 up, 2 in
Sep 30 17:42:33 compute-1 ceph-mon[75484]: pgmap v103: 353 pgs: 353 active+clean; 457 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 0 B/s rd, 0 op/s
Sep 30 17:42:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 e129: 2 total, 2 up, 2 in
Sep 30 17:42:33 compute-1 ceph-osd[78006]: osd.1 pg_epoch: 129 pg[10.1e( v 54'774 (0'0,54'774] local-lis/les=128/129 n=5 ec=58/48 lis/c=126/80 les/c/f=127/81/0 sis=128) [1] r=0 lpr=128 pi=[80,128)/1 crt=54'774 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Sep 30 17:42:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:42:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:42:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:34.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:42:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:42:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:34.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:42:34 compute-1 ceph-mon[75484]: osdmap e129: 2 total, 2 up, 2 in
Sep 30 17:42:34 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:42:34 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:42:34 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:42:34 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:42:34 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:42:34 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:42:34 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:42:34 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:42:34 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:42:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:35 compute-1 ceph-mon[75484]: pgmap v105: 353 pgs: 1 peering, 352 active+clean; 457 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 695 B/s wr, 2 op/s; 0 B/s, 1 objects/s recovering
Sep 30 17:42:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:36.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174236 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:42:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [NOTICE] 272/174236 (4) : haproxy version is 2.3.17-d1c9119
Sep 30 17:42:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [NOTICE] 272/174236 (4) : path to executable is /usr/local/sbin/haproxy
Sep 30 17:42:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [ALERT] 272/174236 (4) : backend 'backend' has no server available!
Sep 30 17:42:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000025s ======
Sep 30 17:42:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:36.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Sep 30 17:42:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:42:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:38 compute-1 ceph-mon[75484]: pgmap v106: 353 pgs: 1 peering, 352 active+clean; 457 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1023 B/s rd, 511 B/s wr, 1 op/s; 0 B/s, 0 objects/s recovering
Sep 30 17:42:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:38.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:38 compute-1 sudo[91791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:42:38 compute-1 sudo[91791]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:42:38 compute-1 sudo[91791]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:38.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:42:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:42:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:42:39 compute-1 sudo[91816]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:42:39 compute-1 sudo[91816]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:42:39 compute-1 sudo[91816]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:40.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:40 compute-1 ceph-mon[75484]: pgmap v107: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 830 B/s wr, 2 op/s; 0 B/s, 0 objects/s recovering
Sep 30 17:42:40 compute-1 ceph-mon[75484]: mgrmap e31: compute-0.efvthf(active, since 92s), standbys: compute-1.glbusf
Sep 30 17:42:40 compute-1 sudo[91415]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:40.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:41 compute-1 sudo[91992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzyxwxijriwkhrwzzelxujtacepkkkmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254160.936144-236-138459750529816/AnsiballZ_command.py'
Sep 30 17:42:41 compute-1 sudo[91992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:41 compute-1 python3.9[91994]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:42:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:42 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 1.
Sep 30 17:42:42 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:42:42 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 2.068s CPU time.
Sep 30 17:42:42 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:42:42 compute-1 sudo[91992]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:42.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:42 compute-1 ceph-mon[75484]: pgmap v108: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 733 B/s wr, 2 op/s; 0 B/s, 0 objects/s recovering
Sep 30 17:42:42 compute-1 podman[92228]: 2025-09-30 17:42:42.772085691 +0000 UTC m=+0.080767578 container create e239b9f599d6aa093aaf80ceef008a5607dd45fd7476f95a15c7374264623d9f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:42:42 compute-1 podman[92228]: 2025-09-30 17:42:42.737064388 +0000 UTC m=+0.045746335 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:42:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:42.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a3a3ab60e4f721510f7239e702c8a4f7bbf12660115cd8d2cbc8ccde350e2a5/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 17:42:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a3a3ab60e4f721510f7239e702c8a4f7bbf12660115cd8d2cbc8ccde350e2a5/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:42:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a3a3ab60e4f721510f7239e702c8a4f7bbf12660115cd8d2cbc8ccde350e2a5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:42:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a3a3ab60e4f721510f7239e702c8a4f7bbf12660115cd8d2cbc8ccde350e2a5/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:42:42 compute-1 podman[92228]: 2025-09-30 17:42:42.869432441 +0000 UTC m=+0.178114378 container init e239b9f599d6aa093aaf80ceef008a5607dd45fd7476f95a15c7374264623d9f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:42:42 compute-1 podman[92228]: 2025-09-30 17:42:42.884374865 +0000 UTC m=+0.193056742 container start e239b9f599d6aa093aaf80ceef008a5607dd45fd7476f95a15c7374264623d9f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 17:42:42 compute-1 bash[92228]: e239b9f599d6aa093aaf80ceef008a5607dd45fd7476f95a15c7374264623d9f
Sep 30 17:42:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:42 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 17:42:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:42 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 17:42:42 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:42:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:42 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 17:42:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:42 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 17:42:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:42 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 17:42:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:42 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 17:42:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:42 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 17:42:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:42 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:42:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:43 compute-1 sudo[92381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjowuwzvozwgbxidycjfefbhsawohbgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254162.6528473-253-103842182069901/AnsiballZ_selinux.py'
Sep 30 17:42:43 compute-1 sudo[92381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:43 compute-1 python3.9[92383]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Sep 30 17:42:43 compute-1 sudo[92381]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:42:44 compute-1 sudo[92534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzayoefmrgdxuoecktjkugrzvvsukzoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254164.0537298-274-57412593695163/AnsiballZ_command.py'
Sep 30 17:42:44 compute-1 sudo[92534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000025s ======
Sep 30 17:42:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:44.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Sep 30 17:42:44 compute-1 python3.9[92536]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Sep 30 17:42:44 compute-1 sudo[92534]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:44 compute-1 ceph-mon[75484]: pgmap v109: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1023 B/s rd, 614 B/s wr, 1 op/s; 0 B/s, 0 objects/s recovering
Sep 30 17:42:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:44.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:45 compute-1 sudo[92687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hglddqlqkbohdapevxwwzrchovmucdda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254164.8525355-290-135450005262883/AnsiballZ_file.py'
Sep 30 17:42:45 compute-1 sudo[92687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:45 compute-1 python3.9[92689]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:42:45 compute-1 sudo[92687]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:46 compute-1 sudo[92840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anaujgddnqzuupufmhyprsebbbbpkeup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254165.700981-306-158435794917112/AnsiballZ_mount.py'
Sep 30 17:42:46 compute-1 sudo[92840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:46 compute-1 python3.9[92842]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Sep 30 17:42:46 compute-1 sudo[92840]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:46.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:46 compute-1 ceph-mon[75484]: pgmap v110: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.1 KiB/s rd, 541 B/s wr, 1 op/s
Sep 30 17:42:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000025s ======
Sep 30 17:42:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:46.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Sep 30 17:42:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:47 compute-1 sudo[92993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykafmjlhyukstqmqikhlzgnbntfcexzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254167.300065-362-154013953303789/AnsiballZ_file.py'
Sep 30 17:42:47 compute-1 sudo[92993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:47 compute-1 python3.9[92995]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:42:47 compute-1 sudo[92993]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:48 compute-1 sudo[93146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfvwtcqsqbzpwoohnvjjnkoetnilitdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254168.0833223-378-255213318632314/AnsiballZ_stat.py'
Sep 30 17:42:48 compute-1 sudo[93146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:48.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:48 compute-1 python3.9[93148]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:42:48 compute-1 ceph-mon[75484]: pgmap v111: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1023 B/s rd, 511 B/s wr, 1 op/s
Sep 30 17:42:48 compute-1 sudo[93146]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:42:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:48.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:42:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:49 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Sep 30 17:42:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:49 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Sep 30 17:42:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:49 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:42:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:49 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:42:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:49 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Sep 30 17:42:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:49 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:42:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:49 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:42:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:49 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:42:49 compute-1 sudo[93225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrpivbjsoiydsajexuwksqmizlcwvlgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254168.0833223-378-255213318632314/AnsiballZ_file.py'
Sep 30 17:42:49 compute-1 sudo[93225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:42:49 compute-1 python3.9[93227]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:42:49 compute-1 sudo[93225]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:50 compute-1 sudo[93378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-difwzsjuhcdrqhslmiofulltzcghqeuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254170.018893-426-94334572912756/AnsiballZ_getent.py'
Sep 30 17:42:50 compute-1 sudo[93378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000025s ======
Sep 30 17:42:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:50.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Sep 30 17:42:50 compute-1 python3.9[93380]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Sep 30 17:42:50 compute-1 sudo[93378]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:50 compute-1 ceph-mon[75484]: pgmap v112: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.3 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:42:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:50.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:51 compute-1 sudo[93532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noobqxaujjrhkjzvjqluelpwyeedqrtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254170.9985282-446-148388610901514/AnsiballZ_getent.py'
Sep 30 17:42:51 compute-1 sudo[93532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174251 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:42:51 compute-1 python3.9[93534]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Sep 30 17:42:51 compute-1 sudo[93532]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:52 compute-1 sudo[93686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmlgjfsfivtnrrroeluzfxjaytwrouww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254171.859596-462-71087855223081/AnsiballZ_group.py'
Sep 30 17:42:52 compute-1 sudo[93686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:52 compute-1 python3.9[93688]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 17:42:52 compute-1 sudo[93686]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:52.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:52 compute-1 ceph-mon[75484]: pgmap v113: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1023 B/s rd, 511 B/s wr, 1 op/s
Sep 30 17:42:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:42:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:42:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:52.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:42:53 compute-1 sudo[93839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztckppzstjknsvzwwzhgdgbtieakwnuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254172.8049145-480-53329486844605/AnsiballZ_file.py'
Sep 30 17:42:53 compute-1 sudo[93839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:53 compute-1 python3.9[93841]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Sep 30 17:42:53 compute-1 sudo[93839]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:53 compute-1 ceph-mon[75484]: pgmap v114: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1023 B/s rd, 511 B/s wr, 1 op/s
Sep 30 17:42:54 compute-1 sudo[93992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fayimyyrhrofjcbzzogapozffmpvactd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254173.7701426-502-205108094531456/AnsiballZ_dnf.py'
Sep 30 17:42:54 compute-1 sudo[93992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:42:54 compute-1 python3.9[93994]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:42:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:54.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:54.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000004:nfs.cephfs.0: -2
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:55 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:55 compute-1 sudo[93992]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:56 compute-1 sudo[94159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvnwyaqtviruzycaevhrqdrkiwiftbwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254175.9469097-518-7427158524828/AnsiballZ_file.py'
Sep 30 17:42:56 compute-1 ceph-mon[75484]: pgmap v115: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Sep 30 17:42:56 compute-1 sudo[94159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:56 compute-1 python3.9[94161]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:42:56 compute-1 sudo[94159]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:56.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:56 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9c0000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:56 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9b4001c00 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:42:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:56.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:42:57 compute-1 sudo[94319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyccugbxkhuioamsxrixezevibyaidua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254176.7114294-534-248640455342670/AnsiballZ_stat.py'
Sep 30 17:42:57 compute-1 sudo[94319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:57 compute-1 python3.9[94321]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:42:57 compute-1 sudo[94319]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:57 compute-1 sudo[94397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yojtstuyhadgjuxlgzyaokpvapboohsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254176.7114294-534-248640455342670/AnsiballZ_file.py'
Sep 30 17:42:57 compute-1 sudo[94397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:57 compute-1 python3.9[94399]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:42:57 compute-1 sudo[94397]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:58 compute-1 ceph-mon[75484]: pgmap v116: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:42:58 compute-1 sudo[94550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebgbffnoguguiukcyhffbwuritxluxjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254178.0421307-560-39201674997589/AnsiballZ_stat.py'
Sep 30 17:42:58 compute-1 sudo[94550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:42:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:42:58.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:42:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174258 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:42:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:58 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4000e00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:42:58 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:42:58 compute-1 python3.9[94552]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:42:58 compute-1 sudo[94550]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:42:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:42:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:42:58.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:42:58 compute-1 sudo[94629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fievrtsospjwhfwhgbqpphlzzgvtidhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254178.0421307-560-39201674997589/AnsiballZ_file.py'
Sep 30 17:42:58 compute-1 sudo[94629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:59 compute-1 python3.9[94631]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:42:59 compute-1 sudo[94629]: pam_unix(sudo:session): session closed for user root
Sep 30 17:42:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:42:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:42:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:42:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:42:59 compute-1 sudo[94782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toxflfxhkwispkptwgqajiptbnuxqeyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254179.5451007-590-27599504789028/AnsiballZ_dnf.py'
Sep 30 17:42:59 compute-1 sudo[94782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:42:59 compute-1 sudo[94781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:42:59 compute-1 sudo[94781]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:42:59 compute-1 sudo[94781]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:00 compute-1 python3.9[94796]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:43:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:00 compute-1 ceph-mon[75484]: pgmap v117: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.6 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:43:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:00.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:00 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd98c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:00 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd998000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:00.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:01 compute-1 sudo[94782]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:02 compute-1 ceph-mon[75484]: pgmap v118: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Sep 30 17:43:02 compute-1 python3.9[94962]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:43:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000048s ======
Sep 30 17:43:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:02.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Sep 30 17:43:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:02 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4001940 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:02 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9900016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000025s ======
Sep 30 17:43:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:02.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Sep 30 17:43:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:03 compute-1 python3.9[95115]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Sep 30 17:43:04 compute-1 python3.9[95265]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:43:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:43:04 compute-1 ceph-mon[75484]: pgmap v119: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Sep 30 17:43:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:04.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:04 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd98c0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:04 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd998001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:43:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:04.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:43:05 compute-1 sudo[95417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fscmandzjhbcoqxhlkxjmxplfuogduih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254184.4729671-672-270390672502234/AnsiballZ_systemd.py'
Sep 30 17:43:05 compute-1 sudo[95417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:05 compute-1 python3.9[95419]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:43:05 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Sep 30 17:43:05 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Sep 30 17:43:05 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Sep 30 17:43:05 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Sep 30 17:43:05 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Sep 30 17:43:05 compute-1 sudo[95417]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:06 compute-1 ceph-mon[75484]: pgmap v120: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Sep 30 17:43:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:06 compute-1 python3.9[95582]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Sep 30 17:43:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:06.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:06 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4001940 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:06 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9900016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:06.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:43:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:08 compute-1 ceph-mon[75484]: pgmap v121: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:43:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:43:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:08.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:43:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:08 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd98c0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:08 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd998001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:08 compute-1 sudo[95735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hapfihripcduqrmjtgbqculoppatddxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254188.4889388-786-250686647958446/AnsiballZ_systemd.py'
Sep 30 17:43:08 compute-1 sudo[95735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:08.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:09 compute-1 python3.9[95737]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:43:09 compute-1 sudo[95735]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:43:09 compute-1 sudo[95889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hktwkugoizpfiusscuzgfecfwtqmhpwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254189.3026443-786-54778563329010/AnsiballZ_systemd.py'
Sep 30 17:43:09 compute-1 sudo[95889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:09 compute-1 python3.9[95891]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:43:10 compute-1 sudo[95889]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:10 compute-1 ceph-mon[75484]: pgmap v122: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:43:10 compute-1 sshd-session[89635]: Connection closed by 192.168.122.30 port 49974
Sep 30 17:43:10 compute-1 sshd-session[89632]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:43:10 compute-1 systemd-logind[789]: Session 39 logged out. Waiting for processes to exit.
Sep 30 17:43:10 compute-1 systemd[1]: session-39.scope: Deactivated successfully.
Sep 30 17:43:10 compute-1 systemd[1]: session-39.scope: Consumed 1min 9.407s CPU time.
Sep 30 17:43:10 compute-1 systemd-logind[789]: Removed session 39.
Sep 30 17:43:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000025s ======
Sep 30 17:43:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:10.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Sep 30 17:43:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:10 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4001940 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:10 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9900016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000048s ======
Sep 30 17:43:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:10.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Sep 30 17:43:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:12 compute-1 ceph-mon[75484]: pgmap v123: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:43:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:12.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:12 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd98c0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:12 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd998001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:43:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:12.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:43:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:43:14 compute-1 ceph-mon[75484]: pgmap v124: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:43:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:14.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:14 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4002db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:14 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:43:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:14.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:43:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:15 compute-1 sshd-session[95924]: Accepted publickey for zuul from 192.168.122.30 port 39866 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:43:15 compute-1 systemd-logind[789]: New session 40 of user zuul.
Sep 30 17:43:15 compute-1 systemd[1]: Started Session 40 of User zuul.
Sep 30 17:43:15 compute-1 sshd-session[95924]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:43:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:16 compute-1 ceph-mon[75484]: pgmap v125: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:43:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:16.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:16 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd98c002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:16 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd998002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:16.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:17 compute-1 python3.9[96079]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:43:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:18 compute-1 sudo[96234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhdidrrghgjatxwmtnvnqlfcwxfniqxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254197.7704122-53-502048878012/AnsiballZ_getent.py'
Sep 30 17:43:18 compute-1 sudo[96234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:18 compute-1 python3.9[96236]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Sep 30 17:43:18 compute-1 ceph-mon[75484]: pgmap v126: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:43:18 compute-1 sudo[96234]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:43:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:18.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:43:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:18 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4002db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:18 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4002db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000024s ======
Sep 30 17:43:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:18.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Sep 30 17:43:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:43:19 compute-1 sudo[96388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqilnbpvkrbjbbyftpnoadykkqzakeoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254198.8879056-77-230838966663550/AnsiballZ_setup.py'
Sep 30 17:43:19 compute-1 sudo[96388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:19 compute-1 python3.9[96390]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:43:19 compute-1 sudo[96388]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:20 compute-1 sudo[96400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:43:20 compute-1 sudo[96400]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:43:20 compute-1 sudo[96400]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:20 compute-1 sudo[96498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcpawukcoqyayjrykhysfyqtezjaihyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254198.8879056-77-230838966663550/AnsiballZ_dnf.py'
Sep 30 17:43:20 compute-1 sudo[96498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:20 compute-1 ceph-mon[75484]: pgmap v127: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:43:20 compute-1 python3.9[96500]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 17:43:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:20 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd98c002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:20.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:20 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd998002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000025s ======
Sep 30 17:43:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:20.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Sep 30 17:43:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:22 compute-1 sudo[96498]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:22 compute-1 ceph-mon[75484]: pgmap v128: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:43:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:43:22 compute-1 sudo[96654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hujynjudgkfamischfcrfcmjfkvemwuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254202.297736-105-263606128458647/AnsiballZ_dnf.py'
Sep 30 17:43:22 compute-1 sudo[96654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:22 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd998002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:22.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:22 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990003430 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:22 compute-1 python3.9[96656]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:43:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:22.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:43:24 compute-1 sudo[96654]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:24 compute-1 ceph-mon[75484]: pgmap v129: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:43:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:24 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd98c002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:24.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:24 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd98c002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:24.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:25 compute-1 sudo[96809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiirurgkqhjtclaruijszlhjxhjvjbpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254204.4536688-121-177495637970421/AnsiballZ_systemd.py'
Sep 30 17:43:25 compute-1 sudo[96809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:25 compute-1 python3.9[96811]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 17:43:25 compute-1 sudo[96809]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174325 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:43:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:26 compute-1 python3.9[96965]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:43:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:26 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd98c002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:43:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:26.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:43:26 compute-1 ceph-mon[75484]: pgmap v130: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:43:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:26 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990003430 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:26.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:27 compute-1 sudo[97116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkrkymxmrtnwruattxryefdxyzjfsben ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254206.7512937-157-104330252061176/AnsiballZ_sefcontext.py'
Sep 30 17:43:27 compute-1 sudo[97116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:27 compute-1 python3.9[97118]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Sep 30 17:43:27 compute-1 sudo[97116]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:28 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd998004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:28.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:28 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4003eb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:28 compute-1 ceph-mon[75484]: pgmap v131: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:43:28 compute-1 python3.9[97269]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:43:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:28.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:43:29 compute-1 sudo[97428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvixofoagarnyjselzzkqwqfnwxvhveo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254209.2567027-193-68251135760620/AnsiballZ_dnf.py'
Sep 30 17:43:29 compute-1 sudo[97428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:29 compute-1 python3.9[97430]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:43:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:30 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9b4001080 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:30.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:30 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:30 compute-1 ceph-mon[75484]: pgmap v132: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:43:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:30.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:31 compute-1 sudo[97428]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:31 compute-1 sudo[97583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nktpuvrpysvleyqgtcmbpecfirytubvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254211.4028728-209-15607092796708/AnsiballZ_command.py'
Sep 30 17:43:31 compute-1 sudo[97583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:32 compute-1 python3.9[97585]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:43:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:32 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd99c001240 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:32 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4003eb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:32.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:32 compute-1 ceph-mon[75484]: pgmap v133: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:43:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:32.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:32 compute-1 sudo[97583]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:33 compute-1 sudo[97872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smfqkcntyfepkzndbtzobvhzjnccwmir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254213.174309-225-161077338943951/AnsiballZ_file.py'
Sep 30 17:43:33 compute-1 sudo[97872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:33 compute-1 python3.9[97874]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 17:43:33 compute-1 sudo[97872]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:43:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:34 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9b40019c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:34 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:43:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:34.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:43:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:34 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:43:34 compute-1 ceph-mon[75484]: pgmap v134: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:43:34 compute-1 python3.9[98025]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:43:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:34.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:35 compute-1 sudo[98178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwgevwylgsrxacdnzedvpvzgqiwyykdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254214.957804-257-198593991133267/AnsiballZ_dnf.py'
Sep 30 17:43:35 compute-1 sudo[98178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:35 compute-1 python3.9[98180]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:43:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:36 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd99c001d40 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:36 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4003eb0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:36.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:36 compute-1 ceph-mon[75484]: pgmap v135: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:43:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:36.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:36 compute-1 sudo[98178]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:37 compute-1 sudo[98333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgnyajchzxzxzultxbpbffleujodjgwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254217.213821-275-75503056621998/AnsiballZ_dnf.py'
Sep 30 17:43:37 compute-1 sudo[98333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:37 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:43:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:37 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:43:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:43:37 compute-1 python3.9[98335]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:43:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:38 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9b40019c0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:38 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990004140 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:38.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:38 compute-1 ceph-mon[75484]: pgmap v136: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:43:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:38.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:38 compute-1 sudo[98339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:43:38 compute-1 sudo[98339]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:43:38 compute-1 sudo[98339]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:38 compute-1 sudo[98364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:43:38 compute-1 sudo[98364]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:43:39 compute-1 sudo[98333]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:43:39 compute-1 sudo[98364]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:43:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:43:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:43:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:43:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:43:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:43:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:43:39 compute-1 sudo[98569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqdeuinwsgqmbsifqphiwnneofwcyitn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254219.5916777-299-262351852571636/AnsiballZ_stat.py'
Sep 30 17:43:39 compute-1 sudo[98569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:40 compute-1 sudo[98572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:43:40 compute-1 sudo[98572]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:43:40 compute-1 sudo[98572]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:40 compute-1 python3.9[98571]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:43:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:40 compute-1 sudo[98569]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:40 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd99c001d40 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:40 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4003eb0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:40.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:40 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:43:40 compute-1 ceph-mon[75484]: pgmap v137: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:43:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000055s ======
Sep 30 17:43:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:40.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Sep 30 17:43:40 compute-1 sudo[98749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxeumblsaniasguravijcsvrfbkvunbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254220.4176478-315-255766467688641/AnsiballZ_slurp.py'
Sep 30 17:43:40 compute-1 sudo[98749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:41 compute-1 python3.9[98751]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Sep 30 17:43:41 compute-1 sudo[98749]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:42 compute-1 sshd-session[95928]: Connection closed by 192.168.122.30 port 39866
Sep 30 17:43:42 compute-1 sshd-session[95924]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:43:42 compute-1 systemd-logind[789]: Session 40 logged out. Waiting for processes to exit.
Sep 30 17:43:42 compute-1 systemd[1]: session-40.scope: Deactivated successfully.
Sep 30 17:43:42 compute-1 systemd[1]: session-40.scope: Consumed 21.473s CPU time.
Sep 30 17:43:42 compute-1 systemd-logind[789]: Removed session 40.
Sep 30 17:43:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:42 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9b40026d0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:42 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990004140 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:42.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:42 compute-1 ceph-mon[75484]: pgmap v138: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.3 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:43:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:42.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:43 compute-1 sudo[98778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:43:43 compute-1 sudo[98778]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:43:43 compute-1 sudo[98778]: pam_unix(sudo:session): session closed for user root
Sep 30 17:43:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:43:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:44 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd99c001d40 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:44 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4003eb0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:44.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:44 compute-1 ceph-mon[75484]: pgmap v139: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.3 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:43:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:43:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:43:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:44.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:46 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9b40026d0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:46 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990004140 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:46.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:46 compute-1 ceph-mon[75484]: pgmap v140: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:43:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:46.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:47 compute-1 sshd-session[98807]: Accepted publickey for zuul from 192.168.122.30 port 46792 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:43:47 compute-1 systemd-logind[789]: New session 41 of user zuul.
Sep 30 17:43:47 compute-1 systemd[1]: Started Session 41 of User zuul.
Sep 30 17:43:47 compute-1 sshd-session[98807]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:43:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174347 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:43:48 compute-1 python3.9[98960]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:43:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:48 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd99c0031d0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:48 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4003eb0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:48.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:48 compute-1 ceph-mon[75484]: pgmap v141: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 1 op/s
Sep 30 17:43:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:48.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:43:49 compute-1 python3.9[99116]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:43:49 compute-1 ceph-mon[75484]: pgmap v142: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:43:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:50 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9b40033e0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:50 compute-1 python3.9[99310]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:43:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:50 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990004140 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:43:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:50.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:43:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:50.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:51 compute-1 sshd-session[98810]: Connection closed by 192.168.122.30 port 46792
Sep 30 17:43:51 compute-1 sshd-session[98807]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:43:51 compute-1 systemd[1]: session-41.scope: Deactivated successfully.
Sep 30 17:43:51 compute-1 systemd-logind[789]: Session 41 logged out. Waiting for processes to exit.
Sep 30 17:43:51 compute-1 systemd[1]: session-41.scope: Consumed 2.878s CPU time.
Sep 30 17:43:51 compute-1 systemd-logind[789]: Removed session 41.
Sep 30 17:43:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:52 compute-1 ceph-mon[75484]: pgmap v143: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:43:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:43:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:52 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd99c0031d0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:52 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4003eb0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:52.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:52.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:43:54 compute-1 ceph-mon[75484]: pgmap v144: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:43:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:54 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9b40033e0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:54 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990004140 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:54.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:43:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:54.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:43:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:56 compute-1 sshd-session[99342]: Accepted publickey for zuul from 192.168.122.30 port 46796 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:43:56 compute-1 ceph-mon[75484]: pgmap v145: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:43:56 compute-1 systemd-logind[789]: New session 42 of user zuul.
Sep 30 17:43:56 compute-1 systemd[1]: Started Session 42 of User zuul.
Sep 30 17:43:56 compute-1 sshd-session[99342]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:43:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:56 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd99c0031d0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:56 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4003eb0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:56.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:56.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:57 compute-1 python3.9[99496]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:43:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:58 compute-1 ceph-mon[75484]: pgmap v146: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:43:58 compute-1 python3.9[99651]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:43:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:58 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9b40033e0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:43:58 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990004140 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:43:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:43:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:43:58.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:43:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:43:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:43:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:43:58.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:43:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:43:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:43:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:43:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:43:59 compute-1 sudo[99807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrcfwtvccmgsatleerxxjddatcniizpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254238.9136245-61-147987219216106/AnsiballZ_setup.py'
Sep 30 17:43:59 compute-1 sudo[99807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:43:59 compute-1 python3.9[99809]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:43:59 compute-1 sudo[99807]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:00 compute-1 sudo[99892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erqqqdbjkpddtshstrdkbvckizwjnhic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254238.9136245-61-147987219216106/AnsiballZ_dnf.py'
Sep 30 17:44:00 compute-1 sudo[99892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:00 compute-1 sudo[99895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:44:00 compute-1 sudo[99895]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:44:00 compute-1 sudo[99895]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:00 compute-1 python3.9[99894]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:44:00 compute-1 ceph-mon[75484]: pgmap v147: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:44:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:44:00 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd998002970 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:44:00 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4003eb0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:00.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:44:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:00.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:44:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:01 compute-1 sudo[99892]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:02 compute-1 sudo[100073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbyctctzjpieayresofwbuclksvcyvao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254241.891658-85-69274945382029/AnsiballZ_setup.py'
Sep 30 17:44:02 compute-1 sudo[100073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:02 compute-1 ceph-mon[75484]: pgmap v148: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:44:02 compute-1 python3.9[100075]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:44:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:44:02 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd98c002690 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:44:02 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990004140 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:44:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:02.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:44:02 compute-1 sudo[100073]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:02.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:03 compute-1 sudo[100269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvqkwapzvloactexksnozkzyodxonuqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254243.2064695-107-87480179352768/AnsiballZ_file.py'
Sep 30 17:44:03 compute-1 sudo[100269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:03 compute-1 python3.9[100271]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:44:03 compute-1 sudo[100269]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:44:04 compute-1 ceph-mon[75484]: pgmap v149: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:44:04 compute-1 sudo[100423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kluiggmbmsavihryshsozrdnjmhfpxxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254244.16217-123-72555175123028/AnsiballZ_command.py'
Sep 30 17:44:04 compute-1 sudo[100423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:44:04 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd998002970 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:44:04 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4003eb0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:04.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:04 compute-1 python3.9[100425]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:44:04 compute-1 sudo[100423]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:04.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:05 compute-1 sudo[100588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjienjtzeryixqafqtjreqejslfkznye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254245.1073723-139-176083477470706/AnsiballZ_stat.py'
Sep 30 17:44:05 compute-1 sudo[100588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174405 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:44:05 compute-1 python3.9[100590]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:44:05 compute-1 sudo[100588]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:06 compute-1 sudo[100667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrdackdkshjjmxfrxerjxvmhragarurk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254245.1073723-139-176083477470706/AnsiballZ_file.py'
Sep 30 17:44:06 compute-1 sudo[100667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:06 compute-1 python3.9[100669]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:44:06 compute-1 sudo[100667]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:06 compute-1 ceph-mon[75484]: pgmap v150: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:44:06 compute-1 sshd-session[71164]: Received disconnect from 38.102.83.36 port 59220:11: disconnected by user
Sep 30 17:44:06 compute-1 sshd-session[71164]: Disconnected from user zuul 38.102.83.36 port 59220
Sep 30 17:44:06 compute-1 sshd-session[71161]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:44:06 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Sep 30 17:44:06 compute-1 systemd[1]: session-20.scope: Consumed 9.778s CPU time.
Sep 30 17:44:06 compute-1 systemd-logind[789]: Session 20 logged out. Waiting for processes to exit.
Sep 30 17:44:06 compute-1 systemd-logind[789]: Removed session 20.
Sep 30 17:44:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:44:06 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd98c002690 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:44:06 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990004140 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:06.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:06 compute-1 sudo[100820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsyvntdjytwbpymjchcgopnhtsbrylau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254246.542669-163-58128650225346/AnsiballZ_stat.py'
Sep 30 17:44:06 compute-1 sudo[100820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:06.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:07 compute-1 python3.9[100822]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:44:07 compute-1 sudo[100820]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:07 compute-1 sudo[100898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adtedohidgwdalqmwbhhiginfhdiffqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254246.542669-163-58128650225346/AnsiballZ_file.py'
Sep 30 17:44:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:44:07 compute-1 sudo[100898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:07 compute-1 python3.9[100900]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:44:07 compute-1 sudo[100898]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:08 compute-1 sudo[101051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkvhdchtqwsqvdnsmhgzvfecjwujdjvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254247.890282-189-154579262381778/AnsiballZ_ini_file.py'
Sep 30 17:44:08 compute-1 sudo[101051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:08 compute-1 ceph-mon[75484]: pgmap v151: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:44:08 compute-1 python3.9[101053]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:44:08 compute-1 sudo[101051]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:44:08 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd998002af0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:44:08 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9a4003eb0 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:08.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:08.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:09 compute-1 sudo[101206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vldzzvqeokdftgldcsuhluhfojdvwryg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254248.7837446-189-247959764465619/AnsiballZ_ini_file.py'
Sep 30 17:44:09 compute-1 sudo[101206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:44:09 compute-1 python3.9[101208]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:44:09 compute-1 sudo[101206]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:09 compute-1 sudo[101358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfbvojmaxtkmshjdcxyfwrjkrdaqlreo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254249.4839013-189-214975313004489/AnsiballZ_ini_file.py'
Sep 30 17:44:09 compute-1 sudo[101358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:09 compute-1 python3.9[101360]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:44:09 compute-1 sshd-session[101152]: Invalid user ubuntu from 194.107.115.65 port 65438
Sep 30 17:44:09 compute-1 sudo[101358]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:10 compute-1 sshd-session[101152]: Received disconnect from 194.107.115.65 port 65438:11: Bye Bye [preauth]
Sep 30 17:44:10 compute-1 sshd-session[101152]: Disconnected from invalid user ubuntu 194.107.115.65 port 65438 [preauth]
Sep 30 17:44:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:10 compute-1 sudo[101511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atqqhmcyzhasfcjcbbffpqtqqafihxsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254250.0949166-189-126920532926908/AnsiballZ_ini_file.py'
Sep 30 17:44:10 compute-1 sudo[101511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:10 compute-1 ceph-mon[75484]: pgmap v152: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:44:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:44:10 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd98c002690 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:44:10 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd990004140 fd 46 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:10.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:10 compute-1 python3.9[101513]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:44:10 compute-1 sudo[101511]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:44:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:10.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:44:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:11 compute-1 sudo[101664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siweamjrhidgbjhbazrdkomljljnbpbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254250.994036-251-142865463337429/AnsiballZ_dnf.py'
Sep 30 17:44:11 compute-1 sudo[101664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:11 compute-1 python3.9[101666]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:44:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:12 compute-1 ceph-mon[75484]: pgmap v153: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:44:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[92266]: 30/09/2025 17:44:12 : epoch 68dc1692 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9980021f0 fd 46 proxy ignored for local
Sep 30 17:44:12 compute-1 kernel: ganesha.nfsd[99657]: segfault at 50 ip 00007fda6dc7832e sp 00007fda24ff8210 error 4 in libntirpc.so.5.8[7fda6dc5d000+2c000] likely on CPU 1 (core 0, socket 1)
Sep 30 17:44:12 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 17:44:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:12.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:12 compute-1 systemd[1]: Started Process Core Dump (PID 101670/UID 0).
Sep 30 17:44:12 compute-1 sudo[101664]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:12.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:13 compute-1 sudo[101821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnzdobaynqispmfhvhfuqgzxespyxqqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254253.326527-273-254534195935525/AnsiballZ_setup.py'
Sep 30 17:44:13 compute-1 sudo[101821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:13 compute-1 systemd-coredump[101671]: Process 92270 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 61:
                                                    #0  0x00007fda6dc7832e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Sep 30 17:44:14 compute-1 python3.9[101823]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:44:14 compute-1 systemd[1]: systemd-coredump@1-101670-0.service: Deactivated successfully.
Sep 30 17:44:14 compute-1 systemd[1]: systemd-coredump@1-101670-0.service: Consumed 1.251s CPU time.
Sep 30 17:44:14 compute-1 sudo[101821]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:14 compute-1 podman[101831]: 2025-09-30 17:44:14.146396779 +0000 UTC m=+0.056172174 container died e239b9f599d6aa093aaf80ceef008a5607dd45fd7476f95a15c7374264623d9f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True)
Sep 30 17:44:14 compute-1 systemd[1]: var-lib-containers-storage-overlay-1a3a3ab60e4f721510f7239e702c8a4f7bbf12660115cd8d2cbc8ccde350e2a5-merged.mount: Deactivated successfully.
Sep 30 17:44:14 compute-1 podman[101831]: 2025-09-30 17:44:14.195918041 +0000 UTC m=+0.105693466 container remove e239b9f599d6aa093aaf80ceef008a5607dd45fd7476f95a15c7374264623d9f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:44:14 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 17:44:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:44:14 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 17:44:14 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.717s CPU time.
Sep 30 17:44:14 compute-1 ceph-mon[75484]: pgmap v154: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:44:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:14.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:14 compute-1 sudo[102025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czdszodyumcdwoxvmsaxxaqooxirlbbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254254.2940285-289-270325382838053/AnsiballZ_stat.py'
Sep 30 17:44:14 compute-1 sudo[102025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:14 compute-1 python3.9[102027]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:44:14 compute-1 sudo[102025]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:14.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:15 compute-1 sudo[102177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibebgpxfjijpbpsxbuabknavigylwuuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254255.199775-307-278678216357808/AnsiballZ_stat.py'
Sep 30 17:44:15 compute-1 sudo[102177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:15 compute-1 python3.9[102179]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:44:15 compute-1 sudo[102177]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:16 compute-1 ceph-mon[75484]: pgmap v155: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:44:16 compute-1 sudo[102331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvcdkaznqpfmqtcpmtdzmhgabyaeohrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254256.172248-327-112845906086111/AnsiballZ_service_facts.py'
Sep 30 17:44:16 compute-1 sudo[102331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:16.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:16 compute-1 python3.9[102333]: ansible-service_facts Invoked
Sep 30 17:44:16 compute-1 network[102350]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 17:44:16 compute-1 network[102351]: 'network-scripts' will be removed from distribution in near future.
Sep 30 17:44:16 compute-1 network[102352]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 17:44:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:16.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:18 compute-1 ceph-mon[75484]: pgmap v156: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:44:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174418 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:44:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [ALERT] 272/174418 (4) : backend 'backend' has no server available!
Sep 30 17:44:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:44:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:18.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:44:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:18.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:44:20 compute-1 sudo[102331]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:20 compute-1 sudo[102477]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:44:20 compute-1 sudo[102477]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:44:20 compute-1 sudo[102477]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:20 compute-1 ceph-mon[75484]: pgmap v157: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Sep 30 17:44:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:44:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:20.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:44:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:20.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:21 compute-1 sudo[102667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxujcvynwstwogwmyqysmxzpgnvaizon ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1759254260.891995-353-121314200423307/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1759254260.891995-353-121314200423307/args'
Sep 30 17:44:21 compute-1 sudo[102667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:21 compute-1 sudo[102667]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:21 compute-1 sudo[102835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpejxnragyzmzehmzzfytsuojatqsmfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254261.6467793-375-18206340934038/AnsiballZ_dnf.py'
Sep 30 17:44:21 compute-1 sudo[102835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:22 compute-1 python3.9[102837]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:44:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:22 compute-1 ceph-mon[75484]: pgmap v158: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Sep 30 17:44:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:44:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:22.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:22.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:23 compute-1 sudo[102835]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.753093) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254263753159, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2750, "num_deletes": 251, "total_data_size": 6584278, "memory_usage": 6664944, "flush_reason": "Manual Compaction"}
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254263775649, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 4272896, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8072, "largest_seqno": 10817, "table_properties": {"data_size": 4261437, "index_size": 7249, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3141, "raw_key_size": 26352, "raw_average_key_size": 21, "raw_value_size": 4237259, "raw_average_value_size": 3411, "num_data_blocks": 321, "num_entries": 1242, "num_filter_entries": 1242, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759254086, "oldest_key_time": 1759254086, "file_creation_time": 1759254263, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 22608 microseconds, and 13633 cpu microseconds.
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.775702) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 4272896 bytes OK
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.775725) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.777467) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.777484) EVENT_LOG_v1 {"time_micros": 1759254263777478, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.777504) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 6571521, prev total WAL file size 6571521, number of live WAL files 2.
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.778948) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(4172KB)], [18(10MB)]
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254263779034, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 15367918, "oldest_snapshot_seqno": -1}
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 3871 keys, 12225450 bytes, temperature: kUnknown
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254263880691, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 12225450, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12194028, "index_size": 20624, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9733, "raw_key_size": 98292, "raw_average_key_size": 25, "raw_value_size": 12117773, "raw_average_value_size": 3130, "num_data_blocks": 889, "num_entries": 3871, "num_filter_entries": 3871, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759254263, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.880986) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 12225450 bytes
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.882468) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.0 rd, 120.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.1, 10.6 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(6.5) write-amplify(2.9) OK, records in: 4394, records dropped: 523 output_compression: NoCompression
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.882492) EVENT_LOG_v1 {"time_micros": 1759254263882480, "job": 8, "event": "compaction_finished", "compaction_time_micros": 101742, "compaction_time_cpu_micros": 43229, "output_level": 6, "num_output_files": 1, "total_output_size": 12225450, "num_input_records": 4394, "num_output_records": 3871, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254263883473, "job": 8, "event": "table_file_deletion", "file_number": 20}
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254263885799, "job": 8, "event": "table_file_deletion", "file_number": 18}
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.778858) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.885946) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.885954) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.885957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.885960) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:44:23 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:44:23.885963) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:44:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:44:24 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 2.
Sep 30 17:44:24 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:44:24 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.717s CPU time.
Sep 30 17:44:24 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:44:24 compute-1 sudo[103003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djdqnpabhnmhgfdeknasejfkzbryvlic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254263.9653661-401-44282909022611/AnsiballZ_package_facts.py'
Sep 30 17:44:24 compute-1 sudo[103003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:24.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:24 compute-1 ceph-mon[75484]: pgmap v159: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Sep 30 17:44:24 compute-1 podman[103043]: 2025-09-30 17:44:24.864642066 +0000 UTC m=+0.075631016 container create 298774df72e08b6886a3b4cad2eab89691fab76a48af9c3b6107cbd5b651933a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default)
Sep 30 17:44:24 compute-1 python3.9[103006]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Sep 30 17:44:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cdb494aff4269d8448673347d85a191ca615589c551f95c16be047ac11358116/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 17:44:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cdb494aff4269d8448673347d85a191ca615589c551f95c16be047ac11358116/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:44:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cdb494aff4269d8448673347d85a191ca615589c551f95c16be047ac11358116/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:44:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cdb494aff4269d8448673347d85a191ca615589c551f95c16be047ac11358116/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:44:24 compute-1 podman[103043]: 2025-09-30 17:44:24.835776018 +0000 UTC m=+0.046765048 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:44:24 compute-1 podman[103043]: 2025-09-30 17:44:24.946980304 +0000 UTC m=+0.157969334 container init 298774df72e08b6886a3b4cad2eab89691fab76a48af9c3b6107cbd5b651933a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid)
Sep 30 17:44:24 compute-1 podman[103043]: 2025-09-30 17:44:24.965547831 +0000 UTC m=+0.176536811 container start 298774df72e08b6886a3b4cad2eab89691fab76a48af9c3b6107cbd5b651933a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:44:24 compute-1 bash[103043]: 298774df72e08b6886a3b4cad2eab89691fab76a48af9c3b6107cbd5b651933a
Sep 30 17:44:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:44:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:24.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:44:24 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:44:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:24 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 17:44:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:24 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 17:44:25 compute-1 sudo[103003]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:25 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 17:44:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:25 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 17:44:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:25 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 17:44:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:25 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 17:44:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:25 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 17:44:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:25 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:44:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:26 compute-1 sudo[103251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptwaynuafezvltfuwyeqshomdjmjixkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254265.6876683-421-269682390636334/AnsiballZ_stat.py'
Sep 30 17:44:26 compute-1 sudo[103251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:26 compute-1 python3.9[103253]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:44:26 compute-1 sudo[103251]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:26 compute-1 sudo[103330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljhxjojuqmonveyssamzqqsflyvpujbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254265.6876683-421-269682390636334/AnsiballZ_file.py'
Sep 30 17:44:26 compute-1 sudo[103330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:26.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:26 compute-1 python3.9[103333]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:44:26 compute-1 ceph-mon[75484]: pgmap v160: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 255 B/s wr, 1 op/s
Sep 30 17:44:26 compute-1 sudo[103330]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:26.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:27 compute-1 sudo[103484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olvbtusggpnxyvdsvqhykggldijlixgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254267.027867-445-82323278821425/AnsiballZ_stat.py'
Sep 30 17:44:27 compute-1 sudo[103484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:27 compute-1 python3.9[103486]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:44:27 compute-1 sudo[103484]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:27 compute-1 sshd-session[103302]: Invalid user postgres from 80.94.95.116 port 51382
Sep 30 17:44:27 compute-1 sudo[103562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plveocwpygeklffbbwjkaahpelhcrsxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254267.027867-445-82323278821425/AnsiballZ_file.py'
Sep 30 17:44:27 compute-1 sudo[103562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:28 compute-1 sshd-session[103302]: Connection closed by invalid user postgres 80.94.95.116 port 51382 [preauth]
Sep 30 17:44:28 compute-1 python3.9[103564]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:44:28 compute-1 sudo[103562]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:28.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:28 compute-1 ceph-mon[75484]: pgmap v161: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 170 B/s wr, 0 op/s
Sep 30 17:44:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:28.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:44:29 compute-1 sudo[103716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlvkokdyqixuinhdnnzrelejletgiwgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254268.8156326-482-80124162912029/AnsiballZ_lineinfile.py'
Sep 30 17:44:29 compute-1 sudo[103716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:29 compute-1 python3.9[103718]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:44:29 compute-1 sudo[103716]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:29 compute-1 ceph-mon[75484]: pgmap v162: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.3 KiB/s rd, 597 B/s wr, 1 op/s
Sep 30 17:44:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:30 compute-1 sudo[103869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtrjjlvvjlixbqvujtvfydbmflvmfmim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254270.225116-511-185516697991165/AnsiballZ_setup.py'
Sep 30 17:44:30 compute-1 sudo[103869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:30.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:30 compute-1 python3.9[103872]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:44:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:30.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:31 compute-1 sudo[103869]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:31 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Sep 30 17:44:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:31 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Sep 30 17:44:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:31 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:44:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:31 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:44:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:31 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Sep 30 17:44:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:31 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:44:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:31 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:44:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:31 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:44:31 compute-1 sudo[103954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhsqozeewlqsrwjvllvbyvcscfbvfwki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254270.225116-511-185516697991165/AnsiballZ_systemd.py'
Sep 30 17:44:31 compute-1 sudo[103954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:31 compute-1 python3.9[103956]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:44:32 compute-1 sudo[103954]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:32 compute-1 ceph-mon[75484]: pgmap v163: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 853 B/s rd, 426 B/s wr, 1 op/s
Sep 30 17:44:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:32.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:32 compute-1 sshd-session[99345]: Connection closed by 192.168.122.30 port 46796
Sep 30 17:44:32 compute-1 sshd-session[99342]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:44:32 compute-1 systemd[1]: session-42.scope: Deactivated successfully.
Sep 30 17:44:32 compute-1 systemd[1]: session-42.scope: Consumed 28.092s CPU time.
Sep 30 17:44:32 compute-1 systemd-logind[789]: Session 42 logged out. Waiting for processes to exit.
Sep 30 17:44:32 compute-1 systemd-logind[789]: Removed session 42.
Sep 30 17:44:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:32.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174433 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:44:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:44:34 compute-1 ceph-mon[75484]: pgmap v164: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 853 B/s rd, 426 B/s wr, 1 op/s
Sep 30 17:44:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:34.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:34.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:36 compute-1 ceph-mon[75484]: pgmap v165: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.7 KiB/s rd, 853 B/s wr, 2 op/s
Sep 30 17:44:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:36.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:36.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000007:nfs.cephfs.0: -2
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 17:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:37 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:44:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:44:37 compute-1 sshd-session[104002]: Accepted publickey for zuul from 192.168.122.30 port 51066 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:44:37 compute-1 systemd-logind[789]: New session 43 of user zuul.
Sep 30 17:44:37 compute-1 systemd[1]: Started Session 43 of User zuul.
Sep 30 17:44:37 compute-1 sshd-session[104002]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:44:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:38 compute-1 sudo[104156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwonhznthwpbucwgosmizwrapfizmscz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254277.7800872-25-14729840409167/AnsiballZ_file.py'
Sep 30 17:44:38 compute-1 sudo[104156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:38 compute-1 ceph-mon[75484]: pgmap v166: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 853 B/s wr, 2 op/s
Sep 30 17:44:38 compute-1 python3.9[104158]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:44:38 compute-1 sudo[104156]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:38 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa814000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:38 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa810001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:38.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:38.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:44:39 compute-1 sudo[104310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjovctzxsiydcxjbkpkhwclxdvoocmnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254278.7994652-49-32086480993544/AnsiballZ_stat.py'
Sep 30 17:44:39 compute-1 sudo[104310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:39 compute-1 python3.9[104312]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:44:39 compute-1 sudo[104310]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:39 compute-1 sudo[104388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmtnrmkbspvqrkaztxvudboylvdbneap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254278.7994652-49-32086480993544/AnsiballZ_file.py'
Sep 30 17:44:39 compute-1 sudo[104388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:40 compute-1 python3.9[104390]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:44:40 compute-1 sudo[104388]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:40 compute-1 sudo[104416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:44:40 compute-1 sudo[104416]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:44:40 compute-1 sudo[104416]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:40 compute-1 sshd-session[104005]: Connection closed by 192.168.122.30 port 51066
Sep 30 17:44:40 compute-1 sshd-session[104002]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:44:40 compute-1 systemd[1]: session-43.scope: Deactivated successfully.
Sep 30 17:44:40 compute-1 systemd[1]: session-43.scope: Consumed 2.073s CPU time.
Sep 30 17:44:40 compute-1 systemd-logind[789]: Session 43 logged out. Waiting for processes to exit.
Sep 30 17:44:40 compute-1 systemd-logind[789]: Removed session 43.
Sep 30 17:44:40 compute-1 ceph-mon[75484]: pgmap v167: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 2.0 KiB/s rd, 1023 B/s wr, 3 op/s
Sep 30 17:44:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:40 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa7f0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174440 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:44:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:40 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa808001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:40.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:40.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174441 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:44:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:42 compute-1 ceph-mon[75484]: pgmap v168: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Sep 30 17:44:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:42 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa804001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:42 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa810001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:44:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:42.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:43.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:44 compute-1 sudo[104445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:44:44 compute-1 sudo[104445]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:44:44 compute-1 sudo[104445]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:44 compute-1 sudo[104470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:44:44 compute-1 sudo[104470]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:44:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:44:44 compute-1 ceph-mon[75484]: pgmap v169: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Sep 30 17:44:44 compute-1 kernel: ganesha.nfsd[103991]: segfault at 50 ip 00007fa8c1b0432e sp 00007fa890ff8210 error 4 in libntirpc.so.5.8[7fa8c1ae9000+2c000] likely on CPU 1 (core 0, socket 1)
Sep 30 17:44:44 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 17:44:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[103058]: 30/09/2025 17:44:44 : epoch 68dc16f8 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa810001c40 fd 38 proxy ignored for local
Sep 30 17:44:44 compute-1 systemd[1]: Started Process Core Dump (PID 104527/UID 0).
Sep 30 17:44:44 compute-1 sudo[104470]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:44.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:45.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:44:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:44:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:44:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:44:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:44:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:44:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:44:45 compute-1 sshd-session[104529]: Accepted publickey for zuul from 192.168.122.30 port 51070 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:44:45 compute-1 systemd-logind[789]: New session 44 of user zuul.
Sep 30 17:44:45 compute-1 systemd[1]: Started Session 44 of User zuul.
Sep 30 17:44:45 compute-1 sshd-session[104529]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:44:45 compute-1 systemd-coredump[104528]: Process 103062 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 42:
                                                    #0  0x00007fa8c1b0432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Sep 30 17:44:46 compute-1 systemd[1]: systemd-coredump@2-104527-0.service: Deactivated successfully.
Sep 30 17:44:46 compute-1 systemd[1]: systemd-coredump@2-104527-0.service: Consumed 1.280s CPU time.
Sep 30 17:44:46 compute-1 podman[104590]: 2025-09-30 17:44:46.124376897 +0000 UTC m=+0.043043546 container died 298774df72e08b6886a3b4cad2eab89691fab76a48af9c3b6107cbd5b651933a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 17:44:46 compute-1 systemd[1]: var-lib-containers-storage-overlay-cdb494aff4269d8448673347d85a191ca615589c551f95c16be047ac11358116-merged.mount: Deactivated successfully.
Sep 30 17:44:46 compute-1 systemd[83023]: Created slice User Background Tasks Slice.
Sep 30 17:44:46 compute-1 systemd[83023]: Starting Cleanup of User's Temporary Files and Directories...
Sep 30 17:44:46 compute-1 podman[104590]: 2025-09-30 17:44:46.178935557 +0000 UTC m=+0.097602206 container remove 298774df72e08b6886a3b4cad2eab89691fab76a48af9c3b6107cbd5b651933a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Sep 30 17:44:46 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 17:44:46 compute-1 systemd[83023]: Finished Cleanup of User's Temporary Files and Directories.
Sep 30 17:44:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:46 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 17:44:46 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.835s CPU time.
Sep 30 17:44:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:46.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:46 compute-1 ceph-mon[75484]: pgmap v170: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Sep 30 17:44:46 compute-1 python3.9[104734]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:44:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:47.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:47 compute-1 sudo[104888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opfuormlhujrgfarwkvgclcwruqnpwph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254287.43186-47-103828306131009/AnsiballZ_file.py'
Sep 30 17:44:47 compute-1 sudo[104888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:48 compute-1 python3.9[104890]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:44:48 compute-1 sudo[104888]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:48.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:48 compute-1 sudo[105065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwcbkgwsxtklbmwtzhisyjkmjbpcnqgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254288.3385005-63-52551142125792/AnsiballZ_stat.py'
Sep 30 17:44:48 compute-1 sudo[105065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:48 compute-1 ceph-mon[75484]: pgmap v171: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 597 B/s rd, 170 B/s wr, 0 op/s
Sep 30 17:44:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:49.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:49 compute-1 python3.9[105067]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:44:49 compute-1 sudo[105065]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:44:49 compute-1 sudo[105105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:44:49 compute-1 sudo[105105]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:44:49 compute-1 sudo[105105]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:49 compute-1 sudo[105168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tepgjzzajinueqmcxdgxtcgxxojgymgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254288.3385005-63-52551142125792/AnsiballZ_file.py'
Sep 30 17:44:49 compute-1 sudo[105168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:49 compute-1 python3.9[105170]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.xj_32duo recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:44:49 compute-1 sudo[105168]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:44:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:44:50 compute-1 ceph-mon[75484]: pgmap v172: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 853 B/s rd, 255 B/s wr, 1 op/s
Sep 30 17:44:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:50 compute-1 sudo[105321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qycfiwraxycqktgcjhwwgfdbskjrsaad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254290.0299785-103-272500415125057/AnsiballZ_stat.py'
Sep 30 17:44:50 compute-1 sudo[105321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:50 compute-1 python3.9[105323]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:44:50 compute-1 sudo[105321]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174450 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:44:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [ALERT] 272/174450 (4) : backend 'backend' has no server available!
Sep 30 17:44:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:50.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:50 compute-1 sudo[105400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnvqehndfwiuuosvgxizyusuqiunbjtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254290.0299785-103-272500415125057/AnsiballZ_file.py'
Sep 30 17:44:50 compute-1 sudo[105400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:51.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:51 compute-1 python3.9[105402]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.wg8pylc5 recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:44:51 compute-1 sudo[105400]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:51 compute-1 sudo[105552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vswerzzojveemizvtewhrhmvycppmcok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254291.4374692-129-34140793215722/AnsiballZ_file.py'
Sep 30 17:44:51 compute-1 sudo[105552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:52 compute-1 python3.9[105554]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:44:52 compute-1 sudo[105552]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:52 compute-1 ceph-mon[75484]: pgmap v173: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:44:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:44:52 compute-1 sudo[105705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idsdlimxqchhmumtgupabtxzisqylsqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254292.250952-145-2275606714145/AnsiballZ_stat.py'
Sep 30 17:44:52 compute-1 sudo[105705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:52.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:52 compute-1 python3.9[105708]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:44:52 compute-1 sudo[105705]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:53.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:53 compute-1 sudo[105784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaxwjbqpzshuikdpsplovjlbhoryjzof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254292.250952-145-2275606714145/AnsiballZ_file.py'
Sep 30 17:44:53 compute-1 sudo[105784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:53 compute-1 python3.9[105786]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:44:53 compute-1 sudo[105784]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:53 compute-1 sudo[105936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpwjyufdcebqzpxbywypdzunuzujhwck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254293.4894495-145-274633939191458/AnsiballZ_stat.py'
Sep 30 17:44:53 compute-1 sudo[105936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:54 compute-1 python3.9[105938]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:44:54 compute-1 sudo[105936]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:44:54 compute-1 sudo[106015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hawhzkijuzwdgupybvgygysdmdfzparf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254293.4894495-145-274633939191458/AnsiballZ_file.py'
Sep 30 17:44:54 compute-1 sudo[106015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:54 compute-1 ceph-mon[75484]: pgmap v174: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:44:54 compute-1 python3.9[106017]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:44:54 compute-1 sudo[106015]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:54.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:55 compute-1 sudo[106168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdnlnlbwsxbnivbvehwrqitgpbechnew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254294.6983776-191-8809948921306/AnsiballZ_file.py'
Sep 30 17:44:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:55.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:55 compute-1 sudo[106168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:55 compute-1 python3.9[106170]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:44:55 compute-1 sudo[106168]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:55 compute-1 sudo[106320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seqyksgsmytedbuyntmsiqgevetbosll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254295.4139388-207-120251686198127/AnsiballZ_stat.py'
Sep 30 17:44:55 compute-1 sudo[106320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:56 compute-1 python3.9[106322]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:44:56 compute-1 sudo[106320]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:56 compute-1 sudo[106399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsuiqivvrakyqzbonhkptsoiehtlcibe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254295.4139388-207-120251686198127/AnsiballZ_file.py'
Sep 30 17:44:56 compute-1 sudo[106399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:56 compute-1 ceph-mon[75484]: pgmap v175: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 255 B/s wr, 0 op/s
Sep 30 17:44:56 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 3.
Sep 30 17:44:56 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:44:56 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.835s CPU time.
Sep 30 17:44:56 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:44:56 compute-1 python3.9[106401]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:44:56 compute-1 sudo[106399]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:56.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:56 compute-1 podman[106476]: 2025-09-30 17:44:56.821685149 +0000 UTC m=+0.073263434 container create ff723cf9a994b3e1e633bcda921303e92c07ac8160ec8b96b24db0c124c4213a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 17:44:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e58da4cbfa33f58f7a87c79b79d8a99e7ca4096017057b21bd84b5f97fbd5a16/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 17:44:56 compute-1 podman[106476]: 2025-09-30 17:44:56.788515349 +0000 UTC m=+0.040093684 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:44:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e58da4cbfa33f58f7a87c79b79d8a99e7ca4096017057b21bd84b5f97fbd5a16/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:44:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e58da4cbfa33f58f7a87c79b79d8a99e7ca4096017057b21bd84b5f97fbd5a16/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:44:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e58da4cbfa33f58f7a87c79b79d8a99e7ca4096017057b21bd84b5f97fbd5a16/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:44:56 compute-1 podman[106476]: 2025-09-30 17:44:56.905084076 +0000 UTC m=+0.156662371 container init ff723cf9a994b3e1e633bcda921303e92c07ac8160ec8b96b24db0c124c4213a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 17:44:56 compute-1 podman[106476]: 2025-09-30 17:44:56.921147016 +0000 UTC m=+0.172725301 container start ff723cf9a994b3e1e633bcda921303e92c07ac8160ec8b96b24db0c124c4213a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 17:44:56 compute-1 bash[106476]: ff723cf9a994b3e1e633bcda921303e92c07ac8160ec8b96b24db0c124c4213a
Sep 30 17:44:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:44:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 17:44:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:44:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 17:44:56 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:44:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:44:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 17:44:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:44:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 17:44:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:44:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 17:44:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:44:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 17:44:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:44:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 17:44:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:44:57 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:44:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:44:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:57.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:44:57 compute-1 sudo[106657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znjdmtnpudyznbsuphtaideqddpiofbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254296.7791297-231-218534166038449/AnsiballZ_stat.py'
Sep 30 17:44:57 compute-1 sudo[106657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:57 compute-1 python3.9[106659]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:44:57 compute-1 sudo[106657]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:57 compute-1 sudo[106735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzvfuktmhfefjfaitwtfohgnwqdwbgsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254296.7791297-231-218534166038449/AnsiballZ_file.py'
Sep 30 17:44:57 compute-1 sudo[106735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:57 compute-1 python3.9[106737]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:44:58 compute-1 sudo[106735]: pam_unix(sudo:session): session closed for user root
Sep 30 17:44:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:58 compute-1 ceph-mon[75484]: pgmap v176: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 255 B/s wr, 0 op/s
Sep 30 17:44:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:44:58.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:58 compute-1 sudo[106889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thyulhwrwvvzztvwisnhppfpfwbogsgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254298.2435203-255-52029154554655/AnsiballZ_systemd.py'
Sep 30 17:44:58 compute-1 sudo[106889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:44:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:44:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:44:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:44:59.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:44:59 compute-1 python3.9[106891]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:44:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:44:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:44:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:44:59 compute-1 systemd[1]: Reloading.
Sep 30 17:44:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:44:59 compute-1 systemd-rc-local-generator[106921]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:44:59 compute-1 systemd-sysv-generator[106926]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:44:59 compute-1 sudo[106889]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:00 compute-1 sudo[107080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdvugpwvpmiobqqxjqyqdawawibydsym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254299.8718874-271-281228411540593/AnsiballZ_stat.py'
Sep 30 17:45:00 compute-1 sudo[107080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:00 compute-1 python3.9[107082]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:45:00 compute-1 sudo[107083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:45:00 compute-1 sudo[107080]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:00 compute-1 sudo[107083]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:45:00 compute-1 sudo[107083]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:00 compute-1 ceph-mon[75484]: pgmap v177: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Sep 30 17:45:00 compute-1 sudo[107184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yenixhuifbrxlhtucoeclaxspmmxyuxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254299.8718874-271-281228411540593/AnsiballZ_file.py'
Sep 30 17:45:00 compute-1 sudo[107184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:00.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:00 compute-1 python3.9[107186]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:00 compute-1 sudo[107184]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:45:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:01.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:45:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:01 compute-1 sudo[107336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyskbztmxrhodecnvlasbnhuomwjmxjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254301.062973-295-267143850105536/AnsiballZ_stat.py'
Sep 30 17:45:01 compute-1 sudo[107336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:01 compute-1 python3.9[107338]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:45:01 compute-1 sudo[107336]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:01 compute-1 sudo[107414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioggqkqsipfkuctakyimrbstdqgmxdzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254301.062973-295-267143850105536/AnsiballZ_file.py'
Sep 30 17:45:01 compute-1 sudo[107414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:02 compute-1 python3.9[107416]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:02 compute-1 sudo[107414]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:02 compute-1 sudo[107567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yflgwkjmrqqnzuqbrsoeokbdrilwaoeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254302.2372503-319-14772332845714/AnsiballZ_systemd.py'
Sep 30 17:45:02 compute-1 sudo[107567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:02 compute-1 ceph-mon[75484]: pgmap v178: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 426 B/s wr, 1 op/s
Sep 30 17:45:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:02.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:02 compute-1 python3.9[107569]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:45:02 compute-1 systemd[1]: Reloading.
Sep 30 17:45:02 compute-1 systemd-rc-local-generator[107596]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:45:02 compute-1 systemd-sysv-generator[107601]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:45:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:03.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:03 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Sep 30 17:45:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:03 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Sep 30 17:45:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:03 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:45:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:03 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:45:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:03 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Sep 30 17:45:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:03 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:45:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:03 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:45:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:03 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:45:03 compute-1 systemd[1]: Starting Create netns directory...
Sep 30 17:45:03 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 17:45:03 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 17:45:03 compute-1 systemd[1]: Finished Create netns directory.
Sep 30 17:45:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:03 compute-1 sudo[107567]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:04 compute-1 python3.9[107761]: ansible-ansible.builtin.service_facts Invoked
Sep 30 17:45:04 compute-1 network[107779]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 17:45:04 compute-1 network[107780]: 'network-scripts' will be removed from distribution in near future.
Sep 30 17:45:04 compute-1 network[107781]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 17:45:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:45:04 compute-1 ceph-mon[75484]: pgmap v179: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 426 B/s wr, 1 op/s
Sep 30 17:45:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:04.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:05.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174505 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:45:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:06 compute-1 ceph-mon[75484]: pgmap v180: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.9 KiB/s rd, 1023 B/s wr, 3 op/s
Sep 30 17:45:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:06.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:07.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:45:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:08 compute-1 ceph-mon[75484]: pgmap v181: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.7 KiB/s rd, 853 B/s wr, 2 op/s
Sep 30 17:45:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:08.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:09.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000009:nfs.cephfs.0: -2
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:45:09 compute-1 sudo[108062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpatqolvcbrdcwcunxpofbcmkqrvacoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254309.024941-371-213973519435267/AnsiballZ_stat.py'
Sep 30 17:45:09 compute-1 sudo[108062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:09 compute-1 python3.9[108064]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:45:09 compute-1 sudo[108062]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:09 compute-1 sudo[108140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eitllgimkbqgwxdwgspftkdrvhxxlpnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254309.024941-371-213973519435267/AnsiballZ_file.py'
Sep 30 17:45:09 compute-1 sudo[108140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:10 compute-1 python3.9[108142]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:10 compute-1 sudo[108140]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:10 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2204000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:10 compute-1 ceph-mon[75484]: pgmap v182: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Sep 30 17:45:10 compute-1 sudo[108296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfphoodigffycltxapaazxyanmxgbech ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254310.3614326-397-231180597210310/AnsiballZ_file.py'
Sep 30 17:45:10 compute-1 sudo[108296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:10 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:10.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:10 compute-1 python3.9[108298]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:11 compute-1 sudo[108296]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:11.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:11 compute-1 sudo[108448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulazlovjepziofxcwyrvurtxyzviltql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254311.19348-413-77671938311592/AnsiballZ_stat.py'
Sep 30 17:45:11 compute-1 sudo[108448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:11 compute-1 python3.9[108450]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:45:11 compute-1 sudo[108448]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:11 compute-1 sudo[108527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haxbylbuwneyhylsrkjqgnycotnazgei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254311.19348-413-77671938311592/AnsiballZ_file.py'
Sep 30 17:45:12 compute-1 sudo[108527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:12 compute-1 python3.9[108529]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:12 compute-1 sudo[108527]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174512 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:45:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:12 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:12 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:12 compute-1 ceph-mon[75484]: pgmap v183: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:45:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:12.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:45:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:13.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:45:13 compute-1 sudo[108680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtqwmrygmctmmxftkufewxwxoyopenim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254312.5598948-443-6555749318054/AnsiballZ_timezone.py'
Sep 30 17:45:13 compute-1 sudo[108680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:13 compute-1 python3.9[108682]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Sep 30 17:45:13 compute-1 systemd[1]: Starting Time & Date Service...
Sep 30 17:45:13 compute-1 systemd[1]: Started Time & Date Service.
Sep 30 17:45:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:45:14 compute-1 sudo[108680]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:14 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:14 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:14 compute-1 ceph-mon[75484]: pgmap v184: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:45:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:14.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:45:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:15.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:45:15 compute-1 sudo[108838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uplnifngwmzxljfajroifoifhsmjtibu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254314.791761-461-232836780602778/AnsiballZ_file.py'
Sep 30 17:45:15 compute-1 sudo[108838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:15 compute-1 python3.9[108840]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:15 compute-1 sudo[108838]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:15 compute-1 sudo[108993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzowesulowedaaeercmzcuaqdwbhaogy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254315.5724607-477-48495528158453/AnsiballZ_stat.py'
Sep 30 17:45:15 compute-1 sudo[108993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:16 compute-1 python3.9[108995]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:45:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:16 compute-1 sudo[108993]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:16 compute-1 sudo[109071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezicniknuwurmtxydywewavxqbeqacpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254315.5724607-477-48495528158453/AnsiballZ_file.py'
Sep 30 17:45:16 compute-1 sudo[109071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:16 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:16 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:16 compute-1 python3.9[109073]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:16.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:16 compute-1 sudo[109071]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:16 compute-1 ceph-mon[75484]: pgmap v185: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.6 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:45:16 compute-1 sshd-session[108937]: Received disconnect from 194.107.115.65 port 33410:11: Bye Bye [preauth]
Sep 30 17:45:16 compute-1 sshd-session[108937]: Disconnected from authenticating user root 194.107.115.65 port 33410 [preauth]
Sep 30 17:45:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:45:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:17.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:45:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:17 compute-1 sudo[109224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtyjxtrpeugaohguudqrrgotmiwdcxhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254316.9906437-501-191653393661902/AnsiballZ_stat.py'
Sep 30 17:45:17 compute-1 sudo[109224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:17 compute-1 python3.9[109227]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:45:17 compute-1 sudo[109224]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:17 compute-1 sudo[109303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnuspzrmqlmaojlmctaiayrnefimxotn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254316.9906437-501-191653393661902/AnsiballZ_file.py'
Sep 30 17:45:17 compute-1 sudo[109303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:18 compute-1 python3.9[109305]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.zpqex55j recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:18 compute-1 sudo[109303]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:18 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:18 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:18.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:18 compute-1 ceph-mon[75484]: pgmap v186: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 597 B/s rd, 170 B/s wr, 0 op/s
Sep 30 17:45:18 compute-1 sudo[109457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjgrjxactdoxpduoxnriwsbfbutiwmsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254318.3756757-526-101028650116391/AnsiballZ_stat.py'
Sep 30 17:45:18 compute-1 sudo[109457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:19.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:19 compute-1 python3.9[109459]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:45:19 compute-1 sudo[109457]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:45:19 compute-1 sudo[109535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fugyatlhxxhrnsrscudczmytbfdtlcqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254318.3756757-526-101028650116391/AnsiballZ_file.py'
Sep 30 17:45:19 compute-1 sudo[109535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:19 compute-1 python3.9[109537]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:19 compute-1 sudo[109535]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:20 compute-1 sudo[109694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzdaqmapdydxfxrchdfjskwkwtovdmtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254319.9789326-551-131885910523220/AnsiballZ_command.py'
Sep 30 17:45:20 compute-1 sudo[109694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:20 compute-1 sudo[109685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:45:20 compute-1 sudo[109685]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:45:20 compute-1 sudo[109685]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:20 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:20 compute-1 python3.9[109711]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:45:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:20 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:20 compute-1 sudo[109694]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:45:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:20.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:45:20 compute-1 ceph-mon[75484]: pgmap v187: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 597 B/s rd, 170 B/s wr, 0 op/s
Sep 30 17:45:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:45:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:21.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:45:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:21 compute-1 sudo[109867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jexhrebppwcafsscvtnmvnkqvduxmbmt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759254320.9591348-567-152361609019384/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 17:45:21 compute-1 sudo[109867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:21 compute-1 python3[109869]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 17:45:21 compute-1 sudo[109867]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:21 compute-1 ceph-mon[75484]: pgmap v188: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:45:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:22 compute-1 sudo[110020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilpyjrkglmjgayaambxxvngrpoghmrfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254321.875297-583-192458442642228/AnsiballZ_stat.py'
Sep 30 17:45:22 compute-1 sudo[110020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:22 compute-1 python3.9[110022]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:45:22 compute-1 sudo[110020]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:22 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:22 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:22 compute-1 sudo[110099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uijypggwrlzdriuubplbgwwpwxrliyvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254321.875297-583-192458442642228/AnsiballZ_file.py'
Sep 30 17:45:22 compute-1 sudo[110099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:45:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:22.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:45:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:45:22 compute-1 python3.9[110101]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:22 compute-1 sudo[110099]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:23.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:23 compute-1 sudo[110251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otelrrsuuncgbbzhoktnxjbjdcgnztnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254323.2591343-608-69799545603155/AnsiballZ_stat.py'
Sep 30 17:45:23 compute-1 sudo[110251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:23 compute-1 ceph-mon[75484]: pgmap v189: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:45:23 compute-1 python3.9[110253]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:45:23 compute-1 sudo[110251]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:24 compute-1 sudo[110330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-metandoxlujbbldbbowsvksiycubsvow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254323.2591343-608-69799545603155/AnsiballZ_file.py'
Sep 30 17:45:24 compute-1 sudo[110330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:45:24 compute-1 python3.9[110332]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:24 compute-1 sudo[110330]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:24 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:24 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:24.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:24 compute-1 sudo[110483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zydkomdszoacrobwturffunbmdhczdlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254324.630742-631-23804074409422/AnsiballZ_stat.py'
Sep 30 17:45:24 compute-1 sudo[110483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:25.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:25 compute-1 python3.9[110485]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:45:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:25 compute-1 sudo[110483]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:25 compute-1 sudo[110561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcadhvdktlmstakeytluadetotdpwqsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254324.630742-631-23804074409422/AnsiballZ_file.py'
Sep 30 17:45:25 compute-1 sudo[110561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:25 compute-1 python3.9[110563]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:25 compute-1 sudo[110561]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:26 compute-1 ceph-mon[75484]: pgmap v190: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:45:26 compute-1 sudo[110714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrgzemezanihwjlbwgxmqbetdlkstqme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254326.0621212-655-38624200041255/AnsiballZ_stat.py'
Sep 30 17:45:26 compute-1 sudo[110714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:26 compute-1 python3.9[110716]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:45:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:26 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:26 compute-1 sudo[110714]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:26 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:26.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:26 compute-1 sudo[110793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paamraglvtxjeuribiehaadkdneqfzez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254326.0621212-655-38624200041255/AnsiballZ_file.py'
Sep 30 17:45:26 compute-1 sudo[110793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:27.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:27 compute-1 python3.9[110795]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:27 compute-1 sudo[110793]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:27 compute-1 sshd-session[109225]: error: kex_exchange_identification: read: Connection timed out
Sep 30 17:45:27 compute-1 sshd-session[109225]: banner exchange: Connection from 125.76.228.194 port 52088: Connection timed out
Sep 30 17:45:27 compute-1 sudo[110945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zklmdpbrlnnpvqzhdrvmmynrjtuzlrwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254327.4627068-679-144787757629545/AnsiballZ_stat.py'
Sep 30 17:45:27 compute-1 sudo[110945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:28 compute-1 python3.9[110947]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:45:28 compute-1 sudo[110945]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:28 compute-1 ceph-mon[75484]: pgmap v191: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:45:28 compute-1 sudo[111024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evbxaxdwpwpjqxejfsjuzraeobglvkxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254327.4627068-679-144787757629545/AnsiballZ_file.py'
Sep 30 17:45:28 compute-1 sudo[111024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:28 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 17:45:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:28 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:28 compute-1 python3.9[111026]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:28 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:28 compute-1 sudo[111024]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:28.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:45:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:29.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:45:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:45:29 compute-1 sudo[111178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voogcufqkcsoyammqlhbkzckorpsdpbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254328.9877143-705-12498450746572/AnsiballZ_command.py'
Sep 30 17:45:29 compute-1 sudo[111178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:29 compute-1 python3.9[111180]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:45:29 compute-1 sudo[111178]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:30 compute-1 sudo[111334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owimvrhmhtgupubxujssyyeyszbqtmwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254329.8109684-721-120836769233393/AnsiballZ_blockinfile.py'
Sep 30 17:45:30 compute-1 sudo[111334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:30 compute-1 ceph-mon[75484]: pgmap v192: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:45:30 compute-1 python3.9[111336]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:30 compute-1 sudo[111334]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:30 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:30 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:30.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:45:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:31.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:45:31 compute-1 sudo[111487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpnbcxfdvkiocvrhxfqraxiitjhtcvcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254330.8077137-739-134565704906875/AnsiballZ_file.py'
Sep 30 17:45:31 compute-1 sudo[111487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:31 compute-1 python3.9[111489]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:31 compute-1 sudo[111487]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:32 compute-1 sudo[111640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqoqmydrorsjpegdwtggqrtguhyhhqbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254331.637701-739-134653398368434/AnsiballZ_file.py'
Sep 30 17:45:32 compute-1 sudo[111640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:32 compute-1 python3.9[111642]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:32 compute-1 sudo[111640]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:32 compute-1 ceph-mon[75484]: pgmap v193: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:45:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:32 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:32 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:32.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:33 compute-1 sudo[111793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coemjpsrxxyrtbjtedtjijnntgftfsru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254332.478217-769-273624669060221/AnsiballZ_mount.py'
Sep 30 17:45:33 compute-1 sudo[111793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:33.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:33 compute-1 python3.9[111795]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Sep 30 17:45:33 compute-1 sudo[111793]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:33 compute-1 sudo[111945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zirkgkmafqhfkmehpkzyciriptnkhqyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254333.492234-769-179455897560737/AnsiballZ_mount.py'
Sep 30 17:45:33 compute-1 sudo[111945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:33 compute-1 sshd-session[111946]: Invalid user ca from 107.172.146.104 port 43940
Sep 30 17:45:34 compute-1 sshd-session[111946]: Received disconnect from 107.172.146.104 port 43940:11: Bye Bye [preauth]
Sep 30 17:45:34 compute-1 sshd-session[111946]: Disconnected from invalid user ca 107.172.146.104 port 43940 [preauth]
Sep 30 17:45:34 compute-1 python3.9[111949]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Sep 30 17:45:34 compute-1 sudo[111945]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:45:34 compute-1 ceph-mon[75484]: pgmap v194: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:45:34 compute-1 sshd-session[104532]: Connection closed by 192.168.122.30 port 51070
Sep 30 17:45:34 compute-1 sshd-session[104529]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:45:34 compute-1 systemd[1]: session-44.scope: Deactivated successfully.
Sep 30 17:45:34 compute-1 systemd[1]: session-44.scope: Consumed 37.501s CPU time.
Sep 30 17:45:34 compute-1 systemd-logind[789]: Session 44 logged out. Waiting for processes to exit.
Sep 30 17:45:34 compute-1 systemd-logind[789]: Removed session 44.
Sep 30 17:45:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:34 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:34 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:34.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:35.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:36 compute-1 ceph-mon[75484]: pgmap v195: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:45:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:36 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:36 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:36.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 17:45:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:37.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 17:45:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:45:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:38 compute-1 ceph-mon[75484]: pgmap v196: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:45:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:38 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc003430 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:38 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:38.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:39.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:45:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:40 compute-1 sshd-session[111985]: Accepted publickey for zuul from 192.168.122.30 port 58004 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:45:40 compute-1 systemd-logind[789]: New session 45 of user zuul.
Sep 30 17:45:40 compute-1 systemd[1]: Started Session 45 of User zuul.
Sep 30 17:45:40 compute-1 sshd-session[111985]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:45:40 compute-1 ceph-mon[75484]: pgmap v197: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:45:40 compute-1 sudo[112018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:45:40 compute-1 sudo[112018]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:45:40 compute-1 sudo[112018]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:40 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:40 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:40.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:40 compute-1 sudo[112164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgosxcyoxyaviafxlyffkxhdjbixporj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254340.4745085-18-29658554047034/AnsiballZ_tempfile.py'
Sep 30 17:45:40 compute-1 sudo[112164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:45:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:41.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:45:41 compute-1 python3.9[112166]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Sep 30 17:45:41 compute-1 sudo[112164]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:41 compute-1 sudo[112316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwdjojddanqvtxtppdwotevfjmtscane ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254341.4075315-42-218604758312392/AnsiballZ_stat.py'
Sep 30 17:45:41 compute-1 sudo[112316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:42 compute-1 python3.9[112318]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:45:42 compute-1 sudo[112316]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:42 compute-1 ceph-mon[75484]: pgmap v198: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:45:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:42 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc003430 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:42 compute-1 sudo[112474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuavzbmasxezubvgxhlldmmyzyvjfekd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254342.3026516-58-128345552607691/AnsiballZ_slurp.py'
Sep 30 17:45:42 compute-1 sudo[112474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:42 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:45:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:42.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:45:42 compute-1 python3.9[112476]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Sep 30 17:45:42 compute-1 sudo[112474]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:43.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:43 compute-1 sudo[112626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebtohvkqrfnsqbyfxlquuovjmpesitqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254343.160016-74-76792456824952/AnsiballZ_stat.py'
Sep 30 17:45:43 compute-1 sudo[112626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:43 compute-1 python3.9[112628]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.572xiv1u follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:45:43 compute-1 sudo[112626]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:44 compute-1 sudo[112752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqwosxrpohdnhwihnvbggiuykoensrgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254343.160016-74-76792456824952/AnsiballZ_copy.py'
Sep 30 17:45:44 compute-1 sudo[112752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:45:44 compute-1 python3.9[112754]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.572xiv1u mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254343.160016-74-76792456824952/.source.572xiv1u _original_basename=.myupk9ql follow=False checksum=32661b263589debfc2b37628181d327f091429d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:44 compute-1 sudo[112752]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:44 compute-1 ceph-mon[75484]: pgmap v199: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:45:44 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 17:45:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:44 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22000013a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:44 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:44.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:45:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:45.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:45:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:45 compute-1 sudo[112907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzzrqzfuqvosftqvfxdlcirfzxnhgqpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254344.6543763-104-155419820193055/AnsiballZ_setup.py'
Sep 30 17:45:45 compute-1 sudo[112907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:45 compute-1 python3.9[112909]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:45:45 compute-1 sudo[112907]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:46 compute-1 sudo[113060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zespvswsorwxqvggnujzytbzhuzlqjyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254346.0099032-121-148329530064802/AnsiballZ_blockinfile.py'
Sep 30 17:45:46 compute-1 sudo[113060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:46 compute-1 ceph-mon[75484]: pgmap v200: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:45:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:46 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:46 compute-1 python3.9[113062]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCc3938ID3mnGjsgZen6kQCNM5mkVWqANJocuRy3sXOMN63dtyjhBgL73dvNVc4/MHyzxPDQuzK8tshXHOcqwYvNyllWa9UhEuAdhcNXcRKSELVxmBLRWZx/tsxp7Ws5/jqm87BYWYBOH23DCI96hjzPNZvDj8g24u1gnFFIDlGQELa7bj3YLXw2mWWadQeLxX35z9zMP39YZLf/2F8zAFy27zfi5U4Ni1I6YXvTL+DNwg7Ulluud4fY+sf3ds4pU5htK63pEPYw1f4eI/82wYgnmmEjUqBXGUraTbHG7EoY0kg8bnebUO02l1uSbV+YM/5LNKomXhUy/kUhb9l1uqNuqXvimRH4xVgJ9Mn0cJ2WGhlnkU1gqx0p1FNE01EWx7Spbz4uwVESHAmr67aymcw0Da5R+P9sI5lMqVNJHUeQiAq9bA3X9EbU9oIBIzoZCm7x5N8UpcvzrK0tNMaVLymDnsI8Rkc1MJpuTboQqnsrWs1q2SxaKY2vfqidEBk+Xs=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILy+MaglT2Cqq/Z1fTckQQdU2y2eh3D3Okv7pfMd4ZvV
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKG527neZJvIIF0UdmoBKFMIwvlh64Ua1Pir0KM8tM5Fy8tZbjiOY/Dz3agm+i5OWkd7fXEaYOfPR/rFSi9+L8s=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/5i5Z2213BrqRzlkXUPKi2V6ft/sZowq7ddp+Dq/QqjnkediXByZsJmLLkuA+smrhUZwo5vyubq2HeUmZ3U1EenWRQdC11cJP4ll9+UV0iP2vlc4cMh+DV62ujsM8T15I5/7JnPBcvrGrTJTmpQSQoCm2yD2q/v4Kx2V27sLj8ZlX64zDSBOYy+KhjhBuUM4gEbyrRzO2PqNsMeDrDGr3QFiyGNe8qS2KHmuEa4QFJnumNPJrxYBdTjcsKMZOeuVw2a33JPia0kDgKtaNDV7Izq8h9DlidYk1/aPo6MhfwzYDkRUaKSVhqM1oEDQWc40AK7EX4S00KLr5Nix8bd2nqEZsbD5lk/6wKNR1xdutyZt0GcnOEVJB7+VWN6Y3COvwe9Q1GSKCAhMthkn0Vd9ZvrwiFVKpMUyWD1b74vjHcDu8UOcJlVoqol0jJYEqDCy6mRh0l4Q2PfmyFpVMJ1ib1hV4dPIfzJIkuON6jMedqsKPGZnio8U1E/EMWBlaVn8=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICeWw7E9xsgcxKn1cBOcDfvvFIX4M5Blc+gMQNI96O43
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG/CY8xOSIOy2V9qTWOkLlPGEg36qW1s4MO9P37ZVKfdA8ded8m++iIKGFCGxQiTUk0W+13bPq0LIPsJgw+4osM=
                                              create=True mode=0644 path=/tmp/ansible.572xiv1u state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:46 compute-1 sudo[113060]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:46 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22000013a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:46.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:47.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:47 compute-1 sudo[113213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjetbxnatogwhvnboyxqlgvvgkuwckfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254346.909851-137-10162083891584/AnsiballZ_command.py'
Sep 30 17:45:47 compute-1 sudo[113213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:47 compute-1 python3.9[113215]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.572xiv1u' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:45:47 compute-1 sudo[113213]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:48 compute-1 sudo[113368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knmraczkwiryasqauhodsrwzudipakwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254347.870021-153-231677044835049/AnsiballZ_file.py'
Sep 30 17:45:48 compute-1 sudo[113368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:48 compute-1 python3.9[113370]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.572xiv1u state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:45:48 compute-1 ceph-mon[75484]: pgmap v201: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:45:48 compute-1 sudo[113368]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:48 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:48 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:45:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:48.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:45:49 compute-1 sshd-session[111988]: Connection closed by 192.168.122.30 port 58004
Sep 30 17:45:49 compute-1 sshd-session[111985]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:45:49 compute-1 systemd-logind[789]: Session 45 logged out. Waiting for processes to exit.
Sep 30 17:45:49 compute-1 systemd[1]: session-45.scope: Deactivated successfully.
Sep 30 17:45:49 compute-1 systemd[1]: session-45.scope: Consumed 6.668s CPU time.
Sep 30 17:45:49 compute-1 systemd-logind[789]: Removed session 45.
Sep 30 17:45:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:45:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:49.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:45:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:45:49 compute-1 sudo[113396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:45:49 compute-1 sudo[113396]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:45:49 compute-1 sudo[113396]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:49 compute-1 sudo[113421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:45:49 compute-1 sudo[113421]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:45:50 compute-1 sudo[113421]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:50 compute-1 ceph-mon[75484]: pgmap v202: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:45:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:45:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:45:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:45:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:45:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:45:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:45:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:45:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:50 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:50 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200002330 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:50.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:51.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:52 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:52 compute-1 ceph-mon[75484]: pgmap v203: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:45:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:45:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:52 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:52.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:45:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:53.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:45:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:45:54 compute-1 sshd-session[113483]: Accepted publickey for zuul from 192.168.122.30 port 44104 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:45:54 compute-1 systemd-logind[789]: New session 46 of user zuul.
Sep 30 17:45:54 compute-1 systemd[1]: Started Session 46 of User zuul.
Sep 30 17:45:54 compute-1 sshd-session[113483]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:45:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:54 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:54 compute-1 ceph-mon[75484]: pgmap v204: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:45:54 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:45:54 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:45:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:54 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200002330 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:54 compute-1 sudo[113540]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:45:54 compute-1 sudo[113540]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:45:54 compute-1 sudo[113540]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:45:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:54.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:45:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:45:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:55.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:45:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:55 compute-1 python3.9[113662]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:45:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:56 compute-1 sudo[113817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmmodlejvrihvveegdggdiphzwwkjjwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254355.9498754-45-271538834273286/AnsiballZ_systemd.py'
Sep 30 17:45:56 compute-1 sudo[113817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:56 compute-1 ceph-mon[75484]: pgmap v205: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:45:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:45:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:56.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:45:56 compute-1 python3.9[113819]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Sep 30 17:45:56 compute-1 sudo[113817]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:45:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:57.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:45:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:57 compute-1 sudo[113973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfzmmjnpsrvzxohiatwnpbnlhrkqkfya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254357.1557634-61-226344774514796/AnsiballZ_systemd.py'
Sep 30 17:45:57 compute-1 sudo[113973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:57 compute-1 python3.9[113975]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:45:57 compute-1 sudo[113973]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:58 compute-1 sudo[114128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hapknerwhuwkptvakjsapifsmvmepdnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254358.1647255-79-79390245934455/AnsiballZ_command.py'
Sep 30 17:45:58 compute-1 sudo[114128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:58 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:58 compute-1 ceph-mon[75484]: pgmap v206: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:45:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:45:58 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200002330 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:45:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:45:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:45:58.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:45:58 compute-1 python3.9[114130]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:45:58 compute-1 sudo[114128]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:45:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:45:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:45:59.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:45:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:45:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:45:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:45:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:45:59 compute-1 sudo[114282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjiqwvvzndevsjvxrqncrkauowcvqmdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254359.126717-95-176344977601192/AnsiballZ_stat.py'
Sep 30 17:45:59 compute-1 sudo[114282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:45:59 compute-1 python3.9[114284]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:45:59 compute-1 sudo[114282]: pam_unix(sudo:session): session closed for user root
Sep 30 17:45:59 compute-1 sshd-session[113970]: Invalid user infra from 101.126.25.120 port 57448
Sep 30 17:46:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:00 compute-1 sudo[114435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iceqsvrwvvnofzoxxnjfflnmrdpxfdke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254360.0414708-113-224654981119153/AnsiballZ_file.py'
Sep 30 17:46:00 compute-1 sudo[114435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:00 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:00 compute-1 sudo[114439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:46:00 compute-1 sudo[114439]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:46:00 compute-1 sudo[114439]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:00 compute-1 ceph-mon[75484]: pgmap v207: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:00 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:00 compute-1 python3.9[114438]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:00 compute-1 sudo[114435]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:00 compute-1 sshd-session[113970]: Received disconnect from 101.126.25.120 port 57448:11: Bye Bye [preauth]
Sep 30 17:46:00 compute-1 sshd-session[113970]: Disconnected from invalid user infra 101.126.25.120 port 57448 [preauth]
Sep 30 17:46:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:46:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:00.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:46:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:01.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:01 compute-1 sshd-session[113486]: Connection closed by 192.168.122.30 port 44104
Sep 30 17:46:01 compute-1 sshd-session[113483]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:46:01 compute-1 systemd[1]: session-46.scope: Deactivated successfully.
Sep 30 17:46:01 compute-1 systemd[1]: session-46.scope: Consumed 5.150s CPU time.
Sep 30 17:46:01 compute-1 systemd-logind[789]: Session 46 logged out. Waiting for processes to exit.
Sep 30 17:46:01 compute-1 systemd-logind[789]: Removed session 46.
Sep 30 17:46:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:02 compute-1 sshd-session[114445]: Invalid user pz from 84.51.43.58 port 49976
Sep 30 17:46:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:02 compute-1 sshd-session[114445]: Received disconnect from 84.51.43.58 port 49976:11: Bye Bye [preauth]
Sep 30 17:46:02 compute-1 sshd-session[114445]: Disconnected from invalid user pz 84.51.43.58 port 49976 [preauth]
Sep 30 17:46:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:02 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:02 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200003730 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:02 compute-1 ceph-mon[75484]: pgmap v208: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:46:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:02.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:46:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:03.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:46:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:04 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:04 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d40032f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:04 compute-1 ceph-mon[75484]: pgmap v209: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:04.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:05.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:06 compute-1 sshd-session[114496]: Accepted publickey for zuul from 192.168.122.30 port 45258 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:46:06 compute-1 systemd-logind[789]: New session 47 of user zuul.
Sep 30 17:46:06 compute-1 systemd[1]: Started Session 47 of User zuul.
Sep 30 17:46:06 compute-1 sshd-session[114496]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:46:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:06 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:06 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200003730 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:06 compute-1 ceph-mon[75484]: pgmap v210: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:46:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:06.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:46:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:07.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:46:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:07 compute-1 python3.9[114650]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:46:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:46:08 compute-1 sshd-session[114655]: Invalid user dani from 167.172.43.167 port 52878
Sep 30 17:46:08 compute-1 sshd-session[114655]: Received disconnect from 167.172.43.167 port 52878:11: Bye Bye [preauth]
Sep 30 17:46:08 compute-1 sshd-session[114655]: Disconnected from invalid user dani 167.172.43.167 port 52878 [preauth]
Sep 30 17:46:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:08 compute-1 sudo[114807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrfeqnvppiootctlpfvxkqkhrmkwneja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254368.0054116-49-236604832605784/AnsiballZ_setup.py'
Sep 30 17:46:08 compute-1 sudo[114807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:08 compute-1 python3.9[114809]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:46:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:08 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:08 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d40032f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:08 compute-1 ceph-mon[75484]: pgmap v211: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:08.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:08 compute-1 sudo[114807]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:09.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:09 compute-1 sudo[114892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtmljnpikzbfaizmxosxgpeuysgoixhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254368.0054116-49-236604832605784/AnsiballZ_dnf.py'
Sep 30 17:46:09 compute-1 sudo[114892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:46:09 compute-1 python3.9[114894]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 17:46:09 compute-1 ceph-mon[75484]: pgmap v212: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:10 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:10 compute-1 sudo[114892]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:10 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200004440 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:10.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:11.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:11 compute-1 python3.9[115047]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:46:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:12 compute-1 ceph-mon[75484]: pgmap v213: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:12 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:12 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:12.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:12 compute-1 python3.9[115201]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 17:46:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:46:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:13.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:46:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:13 compute-1 python3.9[115351]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:46:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:46:14 compute-1 ceph-mon[75484]: pgmap v214: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:14 compute-1 python3.9[115502]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:46:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:14 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:14 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:14.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:15 compute-1 sshd-session[114499]: Connection closed by 192.168.122.30 port 45258
Sep 30 17:46:15 compute-1 sshd-session[114496]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:46:15 compute-1 systemd[1]: session-47.scope: Deactivated successfully.
Sep 30 17:46:15 compute-1 systemd[1]: session-47.scope: Consumed 7.156s CPU time.
Sep 30 17:46:15 compute-1 systemd-logind[789]: Session 47 logged out. Waiting for processes to exit.
Sep 30 17:46:15 compute-1 systemd-logind[789]: Removed session 47.
Sep 30 17:46:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:15.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:15 compute-1 sshd-session[114494]: error: kex_exchange_identification: read: Connection timed out
Sep 30 17:46:15 compute-1 sshd-session[114494]: banner exchange: Connection from 121.229.191.90 port 59182: Connection timed out
Sep 30 17:46:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:16 compute-1 ceph-mon[75484]: pgmap v215: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:46:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:16 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8002970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:16 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:16.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:17.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:18 compute-1 ceph-mon[75484]: pgmap v216: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:18 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:18 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:18.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:19.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:46:20 compute-1 sshd-session[115532]: Received disconnect from 103.153.190.105 port 56361:11: Bye Bye [preauth]
Sep 30 17:46:20 compute-1 sshd-session[115532]: Disconnected from authenticating user root 103.153.190.105 port 56361 [preauth]
Sep 30 17:46:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:20 compute-1 sshd-session[115536]: Accepted publickey for zuul from 192.168.122.30 port 48626 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:46:20 compute-1 systemd-logind[789]: New session 48 of user zuul.
Sep 30 17:46:20 compute-1 systemd[1]: Started Session 48 of User zuul.
Sep 30 17:46:20 compute-1 sshd-session[115536]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:46:20 compute-1 ceph-mon[75484]: pgmap v217: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:20 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8002970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:20 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:20 compute-1 sudo[115593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:46:20 compute-1 sudo[115593]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:46:20 compute-1 sudo[115593]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:20.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:21.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:21 compute-1 python3.9[115717]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:46:21 compute-1 sshd-session[115623]: Invalid user ramin from 194.107.115.65 port 57880
Sep 30 17:46:22 compute-1 sshd-session[115623]: Received disconnect from 194.107.115.65 port 57880:11: Bye Bye [preauth]
Sep 30 17:46:22 compute-1 sshd-session[115623]: Disconnected from invalid user ramin 194.107.115.65 port 57880 [preauth]
Sep 30 17:46:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:22 compute-1 ceph-mon[75484]: pgmap v218: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:46:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:22 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:22 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:22.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:23 compute-1 sudo[115873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hriiqfjpcovchjxhznxjhreuhuvmplkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254382.6065655-80-114882699487320/AnsiballZ_file.py'
Sep 30 17:46:23 compute-1 sudo[115873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:46:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:23.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:46:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:23 compute-1 python3.9[115875]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:23 compute-1 sudo[115873]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:23 compute-1 sudo[116027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmzldaarlbclmjkxjmekpbwktkwwbyir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254383.4648411-80-192534408171918/AnsiballZ_file.py'
Sep 30 17:46:23 compute-1 sudo[116027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:23 compute-1 python3.9[116029]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:24 compute-1 sudo[116027]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:24 compute-1 sshd-session[115876]: Invalid user sftp from 175.126.165.170 port 40422
Sep 30 17:46:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:46:24 compute-1 sshd-session[115876]: Received disconnect from 175.126.165.170 port 40422:11: Bye Bye [preauth]
Sep 30 17:46:24 compute-1 sshd-session[115876]: Disconnected from invalid user sftp 175.126.165.170 port 40422 [preauth]
Sep 30 17:46:24 compute-1 ceph-mon[75484]: pgmap v219: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:24 compute-1 sudo[116180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oakluqqcesgjnfohtjsqjdrxvxaaaoyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254384.200663-113-97350044921696/AnsiballZ_stat.py'
Sep 30 17:46:24 compute-1 sudo[116180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:24 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8002970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.728488) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254384728546, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1371, "num_deletes": 251, "total_data_size": 3226286, "memory_usage": 3290840, "flush_reason": "Manual Compaction"}
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254384740609, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1278545, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10822, "largest_seqno": 12188, "table_properties": {"data_size": 1274113, "index_size": 1892, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11377, "raw_average_key_size": 19, "raw_value_size": 1264509, "raw_average_value_size": 2218, "num_data_blocks": 86, "num_entries": 570, "num_filter_entries": 570, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759254264, "oldest_key_time": 1759254264, "file_creation_time": 1759254384, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 12294 microseconds, and 6944 cpu microseconds.
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.740783) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1278545 bytes OK
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.740859) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.742922) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.742945) EVENT_LOG_v1 {"time_micros": 1759254384742938, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.742972) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 3219846, prev total WAL file size 3219846, number of live WAL files 2.
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.744860) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323532' seq:0, type:0; will stop at (end)
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1248KB)], [21(11MB)]
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254384744952, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 13503995, "oldest_snapshot_seqno": -1}
Sep 30 17:46:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:24 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8002970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:24 compute-1 python3.9[116183]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:24 compute-1 sudo[116180]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 3979 keys, 11192021 bytes, temperature: kUnknown
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254384834364, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 11192021, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11162227, "index_size": 18740, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9989, "raw_key_size": 100895, "raw_average_key_size": 25, "raw_value_size": 11086356, "raw_average_value_size": 2786, "num_data_blocks": 806, "num_entries": 3979, "num_filter_entries": 3979, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759254384, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.834591) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 11192021 bytes
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.836141) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 150.9 rd, 125.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 11.7 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(19.3) write-amplify(8.8) OK, records in: 4441, records dropped: 462 output_compression: NoCompression
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.836161) EVENT_LOG_v1 {"time_micros": 1759254384836150, "job": 10, "event": "compaction_finished", "compaction_time_micros": 89471, "compaction_time_cpu_micros": 27669, "output_level": 6, "num_output_files": 1, "total_output_size": 11192021, "num_input_records": 4441, "num_output_records": 3979, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254384836566, "job": 10, "event": "table_file_deletion", "file_number": 23}
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254384839512, "job": 10, "event": "table_file_deletion", "file_number": 21}
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.744723) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.839606) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.839643) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.839647) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.839650) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:46:24 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:46:24.839652) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:46:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:46:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:24.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:46:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:25.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:25 compute-1 sudo[116306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnnjfhzyubhynrbzakkywrgvgxhptbna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254384.200663-113-97350044921696/AnsiballZ_copy.py'
Sep 30 17:46:25 compute-1 sshd-session[116231]: Invalid user solana from 45.148.10.240 port 44134
Sep 30 17:46:25 compute-1 sudo[116306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:25 compute-1 sshd-session[116231]: Connection closed by invalid user solana 45.148.10.240 port 44134 [preauth]
Sep 30 17:46:25 compute-1 python3.9[116308]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254384.200663-113-97350044921696/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=f58d36f969171aa06d2e974547d0fc4b46cd2d65 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:25 compute-1 sudo[116306]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:26 compute-1 sudo[116459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmrgtkyiximwivwdhyecurdpbcdscgwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254385.7416108-113-26275952368929/AnsiballZ_stat.py'
Sep 30 17:46:26 compute-1 sudo[116459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:26 compute-1 python3.9[116461]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:26 compute-1 sudo[116459]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:26 compute-1 sudo[116583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viykjisqfqnwwvwzogdaadzpfaywsnyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254385.7416108-113-26275952368929/AnsiballZ_copy.py'
Sep 30 17:46:26 compute-1 sudo[116583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:26 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:26 compute-1 ceph-mon[75484]: pgmap v220: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:46:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:26 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:26.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:26 compute-1 python3.9[116585]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254385.7416108-113-26275952368929/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=14ca920c1050ec666e37d5428df0a6816cc0fde3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:26 compute-1 sudo[116583]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:27.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:27 compute-1 sudo[116735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aydphduwzgbigukqlivczksqlhxcbqfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254387.0882533-113-184260108003632/AnsiballZ_stat.py'
Sep 30 17:46:27 compute-1 sudo[116735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:27 compute-1 python3.9[116737]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:27 compute-1 sudo[116735]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:28 compute-1 sudo[116859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehzysoieziratnnmaphuqwmwfmpmthgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254387.0882533-113-184260108003632/AnsiballZ_copy.py'
Sep 30 17:46:28 compute-1 sudo[116859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:28 compute-1 python3.9[116861]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254387.0882533-113-184260108003632/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=68a94a731381a61457abca4ccb0aaca370b10c6a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:28 compute-1 sudo[116859]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:28 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8002970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:28 compute-1 ceph-mon[75484]: pgmap v221: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:28 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:28.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:29 compute-1 sudo[117012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwpuzbpoipomjofetucufbycftzoikwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254388.6945045-207-95480836225779/AnsiballZ_file.py'
Sep 30 17:46:29 compute-1 sudo[117012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:29.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:46:29 compute-1 python3.9[117014]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:29 compute-1 sudo[117012]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:29 compute-1 sudo[117164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgdxgdnowiojyuxghbojxjmahrcjwxab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254389.571692-207-60311650314954/AnsiballZ_file.py'
Sep 30 17:46:29 compute-1 sudo[117164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:30 compute-1 python3.9[117166]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:30 compute-1 sudo[117164]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:30 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:30 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:30 compute-1 ceph-mon[75484]: pgmap v222: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:30 compute-1 sudo[117318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dptcqabljplwsqchdltfasdeiolsqtds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254390.3538852-242-14258140583919/AnsiballZ_stat.py'
Sep 30 17:46:30 compute-1 sudo[117318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:30.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:31 compute-1 python3.9[117320]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:31 compute-1 sudo[117318]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:46:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:31.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:46:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:31 compute-1 sudo[117441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydaonvmglsrucxcoyomozpxunsaefiou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254390.3538852-242-14258140583919/AnsiballZ_copy.py'
Sep 30 17:46:31 compute-1 sudo[117441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:31 compute-1 python3.9[117443]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254390.3538852-242-14258140583919/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=4784ea04cf4cee6a225d4094ed4408fab1726245 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:31 compute-1 sudo[117441]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:32 compute-1 sudo[117594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzavndqimgzeansmylxxeydaithkchvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254391.9002352-242-171163622353169/AnsiballZ_stat.py'
Sep 30 17:46:32 compute-1 sudo[117594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:32 compute-1 python3.9[117596]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:32 compute-1 sudo[117594]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:32 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:32 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:32 compute-1 ceph-mon[75484]: pgmap v223: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:32 compute-1 sudo[117718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uamtthqbomubqgcetwsdqsnphtcqvaep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254391.9002352-242-171163622353169/AnsiballZ_copy.py'
Sep 30 17:46:32 compute-1 sudo[117718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:46:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:32.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:46:33 compute-1 python3.9[117720]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254391.9002352-242-171163622353169/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=b80e9b9f8c9d7ff05236c445fef88812ffce8335 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:33 compute-1 sudo[117718]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:46:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:33.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:46:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:33 compute-1 sudo[117870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xavwnbofvcuoexokzgyiplaknpmisvvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254393.2055168-242-37139325800816/AnsiballZ_stat.py'
Sep 30 17:46:33 compute-1 sudo[117870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:33 compute-1 python3.9[117872]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:33 compute-1 sudo[117870]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:34 compute-1 sudo[117994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujwrmtrvczfhdbkrzlkzqgldohqqoqen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254393.2055168-242-37139325800816/AnsiballZ_copy.py'
Sep 30 17:46:34 compute-1 sudo[117994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:46:34 compute-1 python3.9[117996]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254393.2055168-242-37139325800816/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=5b7e6e0f922e3ad129882fc49f77224bc7d11c73 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:34 compute-1 sudo[117994]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:34 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:34 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:34 compute-1 ceph-mon[75484]: pgmap v224: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:34.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:34 compute-1 sudo[118147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmoojgzeswhwhszzgmmtbhbwemrshisx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254394.610837-332-157765281137968/AnsiballZ_file.py'
Sep 30 17:46:34 compute-1 sudo[118147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:35 compute-1 python3.9[118149]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:35.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:35 compute-1 sudo[118147]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:35 compute-1 sudo[118299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujicqtlqfjubbegcvmdwjtstkayxnjea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254395.3136063-332-27791130748814/AnsiballZ_file.py'
Sep 30 17:46:35 compute-1 sudo[118299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:35 compute-1 ceph-mon[75484]: pgmap v225: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:46:35 compute-1 python3.9[118301]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:35 compute-1 sudo[118299]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:36 compute-1 sudo[118452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eadeaucllnrbbvulklhyegdmywlbouxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254396.1106312-367-112508432748499/AnsiballZ_stat.py'
Sep 30 17:46:36 compute-1 sudo[118452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:36 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8002970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:36 compute-1 python3.9[118455]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:36 compute-1 sudo[118452]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:36 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:36.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:37.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:37 compute-1 sudo[118576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rffsbmmqewcsoiqfvsgmgfgbjaarkjjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254396.1106312-367-112508432748499/AnsiballZ_copy.py'
Sep 30 17:46:37 compute-1 sudo[118576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:46:37 compute-1 python3.9[118578]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254396.1106312-367-112508432748499/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=8648c25867592171d5334429698ae72181f640d8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:37 compute-1 sudo[118576]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:37 compute-1 sudo[118729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlvqenrdpzzqvisnxfnivrjjdveoifdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254397.6177778-367-188107923734923/AnsiballZ_stat.py'
Sep 30 17:46:37 compute-1 sudo[118729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:38 compute-1 python3.9[118731]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:38 compute-1 sudo[118729]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:38 compute-1 ceph-mon[75484]: pgmap v226: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:38 compute-1 sudo[118853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiugcgzlvzmiwpgihhkheaiqnixnrqbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254397.6177778-367-188107923734923/AnsiballZ_copy.py'
Sep 30 17:46:38 compute-1 sudo[118853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:38 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:38 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e00040b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:38 compute-1 python3.9[118855]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254397.6177778-367-188107923734923/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=948f0dfed0e9186afecce803b60afdaad42b0122 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:38 compute-1 sudo[118853]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:38.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:38 compute-1 sshd-session[118856]: Invalid user habib from 107.172.146.104 port 50428
Sep 30 17:46:38 compute-1 sshd-session[118856]: Received disconnect from 107.172.146.104 port 50428:11: Bye Bye [preauth]
Sep 30 17:46:38 compute-1 sshd-session[118856]: Disconnected from invalid user habib 107.172.146.104 port 50428 [preauth]
Sep 30 17:46:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:46:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:39.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:46:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:46:39 compute-1 sudo[119007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnouwuwgthhsxsenajapyazmyxdzhepf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254399.027898-367-201727361004713/AnsiballZ_stat.py'
Sep 30 17:46:39 compute-1 sudo[119007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:39 compute-1 python3.9[119009]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:39 compute-1 sudo[119007]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:40 compute-1 sudo[119131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzzqenibyeaccenwvoeomohugaehgkox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254399.027898-367-201727361004713/AnsiballZ_copy.py'
Sep 30 17:46:40 compute-1 sudo[119131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:40 compute-1 python3.9[119133]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254399.027898-367-201727361004713/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=8aaaa71bb5fd1202cdcde1d9a44a6b49cea5ba12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:40 compute-1 sudo[119131]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:40 compute-1 ceph-mon[75484]: pgmap v227: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:40 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8003cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:40 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:40.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:40 compute-1 sudo[119258]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:46:40 compute-1 sudo[119258]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:46:40 compute-1 sudo[119258]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:40 compute-1 sudo[119309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyatcgmcrgypauhrhtlwstmglvqndggw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254400.5447817-456-280687853154376/AnsiballZ_file.py'
Sep 30 17:46:40 compute-1 sudo[119309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:41.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:41 compute-1 python3.9[119311]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:41 compute-1 sudo[119309]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:41 compute-1 sudo[119461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jndvxhhafrryukmedtnftdkxngzpkjqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254401.4006784-456-24985843305089/AnsiballZ_file.py'
Sep 30 17:46:41 compute-1 sudo[119461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:41 compute-1 python3.9[119463]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:42 compute-1 sudo[119461]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:42 compute-1 ceph-mon[75484]: pgmap v228: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:42 compute-1 sudo[119614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibuhnzvolfphlzthsjklhzxvgwiagado ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254402.2124355-492-225186229271822/AnsiballZ_stat.py'
Sep 30 17:46:42 compute-1 sudo[119614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:42 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:42 compute-1 python3.9[119617]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:42 compute-1 sudo[119614]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:42 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:42.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:43.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:43 compute-1 sudo[119738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxrfaxfhbrilzwuksglxpqtyovhyowkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254402.2124355-492-225186229271822/AnsiballZ_copy.py'
Sep 30 17:46:43 compute-1 sudo[119738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:43 compute-1 python3.9[119740]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254402.2124355-492-225186229271822/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=3520c130e167178a70bfad0751aacc853863da98 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:43 compute-1 sudo[119738]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:43 compute-1 sudo[119890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blnqbeljhoccwhvahzltuphocldmejsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254403.5785398-492-217132328557074/AnsiballZ_stat.py'
Sep 30 17:46:43 compute-1 sudo[119890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:44 compute-1 python3.9[119892]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:44 compute-1 sudo[119890]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:46:44 compute-1 sudo[120014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkagtvgbfpdbgmpbpadnefixobsyvycy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254403.5785398-492-217132328557074/AnsiballZ_copy.py'
Sep 30 17:46:44 compute-1 sudo[120014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:44 compute-1 ceph-mon[75484]: pgmap v229: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:46:44 compute-1 python3.9[120016]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254403.5785398-492-217132328557074/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=948f0dfed0e9186afecce803b60afdaad42b0122 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:44 compute-1 sudo[120014]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:44 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8003cc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:44 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:44.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:45 compute-1 sudo[120168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsrkruzgaqfdnsphdqapugblkkjmhukd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254404.848246-492-258666198533044/AnsiballZ_stat.py'
Sep 30 17:46:45 compute-1 sudo[120168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:45.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:45 compute-1 python3.9[120170]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:45 compute-1 sudo[120168]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:45 compute-1 sudo[120291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-holuvmiqresdkukllougyizqksplcklz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254404.848246-492-258666198533044/AnsiballZ_copy.py'
Sep 30 17:46:45 compute-1 sudo[120291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174645 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:46:45 compute-1 python3.9[120293]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254404.848246-492-258666198533044/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=9847ecf2ded7b27ff303b9250f84fd642d20a5a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:45 compute-1 sudo[120291]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:46 compute-1 ceph-mon[75484]: pgmap v230: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:46:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:46 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001080 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:46 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001080 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:46.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:47 compute-1 sudo[120446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drcmekjgrcbqjvmxkyignhzvlwdmzity ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254406.7746475-615-18863617806585/AnsiballZ_file.py'
Sep 30 17:46:47 compute-1 sudo[120446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:47.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:47 compute-1 python3.9[120448]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:47 compute-1 sudo[120446]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:47 compute-1 sudo[120598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urfgjygbnrcopeyxupnzbszjbgbnptlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254407.5155768-633-186536128489566/AnsiballZ_stat.py'
Sep 30 17:46:47 compute-1 sudo[120598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:48 compute-1 python3.9[120600]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:48 compute-1 sudo[120598]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:48 compute-1 ceph-mon[75484]: pgmap v231: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:48 compute-1 sudo[120723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndhppnfspnmgsubragvszvotvacoxsjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254407.5155768-633-186536128489566/AnsiballZ_copy.py'
Sep 30 17:46:48 compute-1 sudo[120723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:48 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200002670 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:48 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:48 compute-1 python3.9[120725]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254407.5155768-633-186536128489566/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fa443739ff2ff1b18352a001fa075b3190ad3c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:48 compute-1 sudo[120723]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:48.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:49.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:46:49 compute-1 sudo[120875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voyvjosfxyjrrgxchhvadpoawrcyoren ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254409.1314683-671-126166103766318/AnsiballZ_file.py'
Sep 30 17:46:49 compute-1 sudo[120875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:49 compute-1 python3.9[120877]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:49 compute-1 sudo[120875]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:50 compute-1 sudo[121028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybdhhzsrqyxoatsjjvloiprjkzegaeks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254409.9655223-690-109163219243649/AnsiballZ_stat.py'
Sep 30 17:46:50 compute-1 sudo[121028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:50 compute-1 ceph-mon[75484]: pgmap v232: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:50 compute-1 python3.9[121030]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:50 compute-1 sudo[121028]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:50 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001080 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:50 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001080 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:50.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:51.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:51 compute-1 sudo[121152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxfdwwgwfesukrwrvlnvlbpnibwkzexr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254409.9655223-690-109163219243649/AnsiballZ_copy.py'
Sep 30 17:46:51 compute-1 sudo[121152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:51 compute-1 python3.9[121154]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254409.9655223-690-109163219243649/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fa443739ff2ff1b18352a001fa075b3190ad3c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:51 compute-1 sudo[121152]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:52 compute-1 sudo[121305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utwuzfchbzzekcexckuaoriqinjsquom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254411.7667205-726-20160504285506/AnsiballZ_file.py'
Sep 30 17:46:52 compute-1 sudo[121305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:52 compute-1 python3.9[121307]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:52 compute-1 sudo[121305]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:52 compute-1 ceph-mon[75484]: pgmap v233: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:46:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:46:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:52 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001eb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:52 compute-1 sudo[121458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tadszmmxtppmtkhdljffpuisngxdklhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254412.5028121-743-6630385548018/AnsiballZ_stat.py'
Sep 30 17:46:52 compute-1 sudo[121458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:52 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:46:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:52.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:46:53 compute-1 python3.9[121460]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:53 compute-1 sudo[121458]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:53.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:53 compute-1 sudo[121581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbufzozadrneizjlhkinluxbfbogsrti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254412.5028121-743-6630385548018/AnsiballZ_copy.py'
Sep 30 17:46:53 compute-1 sudo[121581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:53 compute-1 python3.9[121583]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254412.5028121-743-6630385548018/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fa443739ff2ff1b18352a001fa075b3190ad3c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:53 compute-1 sudo[121581]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:46:54 compute-1 sudo[121734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfbtzcthpmmsudtfwmwmpngbewhkgjst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254413.9317079-779-159727301115139/AnsiballZ_file.py'
Sep 30 17:46:54 compute-1 sudo[121734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:54 compute-1 python3.9[121736]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:54 compute-1 sudo[121734]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:54 compute-1 ceph-mon[75484]: pgmap v234: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:46:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:54 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8001080 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:54 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:54 compute-1 sudo[121806]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:46:54 compute-1 sudo[121806]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:46:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:54 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:46:54 compute-1 sudo[121806]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:54.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:54 compute-1 sudo[121839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:46:55 compute-1 sudo[121839]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:46:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:55.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:55 compute-1 sudo[121937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orgxmbmxqzvyuwkuuswwmpcudwwrfhrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254414.7493505-799-149850523601030/AnsiballZ_stat.py'
Sep 30 17:46:55 compute-1 sudo[121937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:55 compute-1 python3.9[121941]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:55 compute-1 sudo[121937]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:55 compute-1 sudo[121839]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:55 compute-1 sudo[122093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naegnxxgpcrednppxlfuwitcjrewrnlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254414.7493505-799-149850523601030/AnsiballZ_copy.py'
Sep 30 17:46:55 compute-1 sudo[122093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:56 compute-1 python3.9[122095]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254414.7493505-799-149850523601030/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fa443739ff2ff1b18352a001fa075b3190ad3c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:56 compute-1 sudo[122093]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:56 compute-1 ceph-mon[75484]: pgmap v235: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:46:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:46:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:46:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:46:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:46:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:46:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:46:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:46:56 compute-1 sudo[122247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hquujxwgkukjobluetpqyorfhdiplsso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254416.2965748-831-263055945285958/AnsiballZ_file.py'
Sep 30 17:46:56 compute-1 sudo[122247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:56 compute-1 python3.9[122249]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:56 compute-1 sudo[122247]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:46:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:56.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:46:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:57.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:57 compute-1 sudo[122399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qloqcypwksemvsyoitpwstnphfiyhkvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254417.1019897-847-16972905135769/AnsiballZ_stat.py'
Sep 30 17:46:57 compute-1 sudo[122399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:57 compute-1 python3.9[122401]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:46:57 compute-1 sudo[122399]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:57 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:46:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:57 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:46:58 compute-1 sudo[122523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lshqpouihicpfbbshwpzekbfynpxpbbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254417.1019897-847-16972905135769/AnsiballZ_copy.py'
Sep 30 17:46:58 compute-1 sudo[122523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:58 compute-1 python3.9[122525]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254417.1019897-847-16972905135769/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fa443739ff2ff1b18352a001fa075b3190ad3c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:46:58 compute-1 sudo[122523]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:58 compute-1 ceph-mon[75484]: pgmap v236: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:46:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:58 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003230 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:46:58 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:46:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:46:58.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:58 compute-1 sudo[122676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isixnfgnjvphxapexpszqcvxkgxlbrkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254418.575106-871-53646460858055/AnsiballZ_file.py'
Sep 30 17:46:58 compute-1 sudo[122676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:46:59 compute-1 python3.9[122678]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:46:59 compute-1 sudo[122676]: pam_unix(sudo:session): session closed for user root
Sep 30 17:46:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:46:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:46:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:46:59.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:46:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:46:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:46:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:46:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:46:59 compute-1 sudo[122828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbripmtxnahhxparkkhkijvcncadnbpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254419.4037633-879-42985297803028/AnsiballZ_stat.py'
Sep 30 17:46:59 compute-1 sudo[122828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:00 compute-1 python3.9[122830]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:47:00 compute-1 sudo[122828]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:00 compute-1 sudo[122952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zavmlgkvnraexywcrzrnniwmqqytgohn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254419.4037633-879-42985297803028/AnsiballZ_copy.py'
Sep 30 17:47:00 compute-1 sudo[122952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:00 compute-1 ceph-mon[75484]: pgmap v237: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:47:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:00 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:00 compute-1 python3.9[122954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254419.4037633-879-42985297803028/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fa443739ff2ff1b18352a001fa075b3190ad3c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:00 compute-1 sudo[122952]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:00 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:00.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:00 compute-1 sudo[122980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:47:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:00 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:47:00 compute-1 sudo[122980]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:47:00 compute-1 sudo[122980]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:01 compute-1 sudo[123005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:47:01 compute-1 sudo[123005]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:47:01 compute-1 sudo[123005]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:01.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:01 compute-1 sudo[123155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eureqiaewoiuieescyqvdasyacxijweg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254421.0153327-895-92685602146408/AnsiballZ_file.py'
Sep 30 17:47:01 compute-1 sudo[123155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:01 compute-1 python3.9[123157]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:47:01 compute-1 sudo[123155]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:01 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:47:01 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:47:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:02 compute-1 sudo[123308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twqsozrhevsntigbkhtlbvhdjvqpgocm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254421.8786736-903-269238638209310/AnsiballZ_stat.py'
Sep 30 17:47:02 compute-1 sudo[123308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:02 compute-1 python3.9[123310]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:47:02 compute-1 sudo[123308]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:02 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003230 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:02 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:02 compute-1 ceph-mon[75484]: pgmap v238: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:47:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:02.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:02 compute-1 sudo[123432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlweyudruwdxrxmpwumbjygotmsbituw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254421.8786736-903-269238638209310/AnsiballZ_copy.py'
Sep 30 17:47:02 compute-1 sudo[123432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:03.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:03 compute-1 python3.9[123434]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254421.8786736-903-269238638209310/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fa443739ff2ff1b18352a001fa075b3190ad3c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:03 compute-1 sudo[123432]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:03 compute-1 sshd-session[115539]: Connection closed by 192.168.122.30 port 48626
Sep 30 17:47:03 compute-1 sshd-session[115536]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:47:03 compute-1 systemd[1]: session-48.scope: Deactivated successfully.
Sep 30 17:47:03 compute-1 systemd[1]: session-48.scope: Consumed 36.408s CPU time.
Sep 30 17:47:03 compute-1 systemd-logind[789]: Session 48 logged out. Waiting for processes to exit.
Sep 30 17:47:03 compute-1 systemd-logind[789]: Removed session 48.
Sep 30 17:47:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:47:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:04 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001eb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:04 compute-1 ceph-mon[75484]: pgmap v239: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.6 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:47:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:04 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:04.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:05.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:06 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003f40 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:06 compute-1 ceph-mon[75484]: pgmap v240: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 1 op/s
Sep 30 17:47:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:06 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:06.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:07.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:47:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174707 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:47:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:08 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200003270 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:08 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:08.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:08 compute-1 ceph-mon[75484]: pgmap v241: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 1 op/s
Sep 30 17:47:09 compute-1 sshd-session[123466]: Accepted publickey for zuul from 192.168.122.30 port 36636 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:47:09 compute-1 systemd-logind[789]: New session 49 of user zuul.
Sep 30 17:47:09 compute-1 systemd[1]: Started Session 49 of User zuul.
Sep 30 17:47:09 compute-1 sshd-session[123466]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:47:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:09.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:47:09 compute-1 sudo[123619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztkvxtrpfrxhncxvttiuuszcznqpahik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254429.2011771-25-254789983605501/AnsiballZ_file.py'
Sep 30 17:47:09 compute-1 sudo[123619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:09 compute-1 ceph-mon[75484]: pgmap v242: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:47:09 compute-1 python3.9[123621]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:09 compute-1 sudo[123619]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:10 compute-1 sudo[123773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjevqznkfxasoujuunzbttwbivxstioz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254430.176024-49-45057441694974/AnsiballZ_stat.py'
Sep 30 17:47:10 compute-1 sudo[123773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:10 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003f40 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:10 compute-1 python3.9[123775]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:47:10 compute-1 sudo[123773]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:10 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:10.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:11.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:11 compute-1 sudo[123896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxujfxjytlypuahrdfmqjametibtljll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254430.176024-49-45057441694974/AnsiballZ_copy.py'
Sep 30 17:47:11 compute-1 sudo[123896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:11 compute-1 python3.9[123898]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254430.176024-49-45057441694974/.source.conf _original_basename=ceph.conf follow=False checksum=e66796ec907df6d0d5e4b75f31c3de3e776363a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:11 compute-1 sudo[123896]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:12 compute-1 sudo[124049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maavdgyxmthqovugkpauryprbsxogmyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254431.793796-49-218470513451999/AnsiballZ_stat.py'
Sep 30 17:47:12 compute-1 sudo[124049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:12 compute-1 ceph-mon[75484]: pgmap v243: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:47:12 compute-1 python3.9[124051]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:47:12 compute-1 sudo[124049]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:12 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200003270 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:12 compute-1 sudo[124173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyqenbhegmodjwrkdwimmskfczmxrivc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254431.793796-49-218470513451999/AnsiballZ_copy.py'
Sep 30 17:47:12 compute-1 sudo[124173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:12 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:12.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:13 compute-1 python3.9[124175]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254431.793796-49-218470513451999/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=7362c4454e8786984d45f5a884c5c867d1ac96a9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:13 compute-1 sudo[124173]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:13.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:13 compute-1 sshd-session[123469]: Connection closed by 192.168.122.30 port 36636
Sep 30 17:47:13 compute-1 sshd-session[123466]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:47:13 compute-1 systemd[1]: session-49.scope: Deactivated successfully.
Sep 30 17:47:13 compute-1 systemd[1]: session-49.scope: Consumed 3.490s CPU time.
Sep 30 17:47:13 compute-1 systemd-logind[789]: Session 49 logged out. Waiting for processes to exit.
Sep 30 17:47:13 compute-1 systemd-logind[789]: Removed session 49.
Sep 30 17:47:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:47:14 compute-1 ceph-mon[75484]: pgmap v244: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:47:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:14 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:14 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:14.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:15.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:16 compute-1 ceph-mon[75484]: pgmap v245: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:47:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:16 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200003270 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:16 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:16.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:47:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:17.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:47:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:18 compute-1 ceph-mon[75484]: pgmap v246: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:47:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:18 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:18 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc0041d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:18.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:19.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:47:19 compute-1 sshd-session[124208]: Accepted publickey for zuul from 192.168.122.30 port 44804 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:47:19 compute-1 systemd-logind[789]: New session 50 of user zuul.
Sep 30 17:47:19 compute-1 systemd[1]: Started Session 50 of User zuul.
Sep 30 17:47:19 compute-1 sshd-session[124208]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:47:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:20 compute-1 ceph-mon[75484]: pgmap v247: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:47:20 compute-1 python3.9[124364]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:47:20 compute-1 sshd-session[124264]: Invalid user tobias from 84.51.43.58 port 36269
Sep 30 17:47:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:20 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc0041d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:20 compute-1 sshd-session[124264]: Received disconnect from 84.51.43.58 port 36269:11: Bye Bye [preauth]
Sep 30 17:47:20 compute-1 sshd-session[124264]: Disconnected from invalid user tobias 84.51.43.58 port 36269 [preauth]
Sep 30 17:47:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:20 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80014b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:20.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:21 compute-1 sudo[124417]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:47:21 compute-1 sudo[124417]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:47:21 compute-1 sudo[124417]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:21.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:21 compute-1 sudo[124544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xygbfijqtfdvddnpccftcjigsunugkit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254441.0739026-49-169913699413367/AnsiballZ_file.py'
Sep 30 17:47:21 compute-1 sudo[124544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:21 compute-1 python3.9[124546]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:47:21 compute-1 sudo[124544]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:22 compute-1 sudo[124697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqpxjplbjzhpvebvsxdlryiwcgpoycgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254442.0341244-49-226930003278385/AnsiballZ_file.py'
Sep 30 17:47:22 compute-1 sudo[124697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:22 compute-1 ceph-mon[75484]: pgmap v248: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:47:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:47:22 compute-1 python3.9[124699]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:47:22 compute-1 sudo[124697]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:22 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:22 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e00014d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:22.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:23.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:23 compute-1 python3.9[124850]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:47:24 compute-1 sudo[125003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvyefxbuztcsgjiuhfjfewcmhjxiqufn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254443.5903714-95-135334530053915/AnsiballZ_seboolean.py'
Sep 30 17:47:24 compute-1 sudo[125003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:24 compute-1 python3.9[125005]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Sep 30 17:47:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:47:24 compute-1 sshd-session[124856]: Invalid user claudiu from 194.107.115.65 port 25854
Sep 30 17:47:24 compute-1 sshd-session[124856]: Received disconnect from 194.107.115.65 port 25854:11: Bye Bye [preauth]
Sep 30 17:47:24 compute-1 sshd-session[124856]: Disconnected from invalid user claudiu 194.107.115.65 port 25854 [preauth]
Sep 30 17:47:24 compute-1 ceph-mon[75484]: pgmap v249: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:47:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:24 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc0041f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:24 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8001e00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:24.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:25.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174725 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:47:26 compute-1 ceph-mon[75484]: pgmap v250: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:47:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:26 compute-1 sudo[125003]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:26 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:26 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e00014d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:47:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:26.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:47:27 compute-1 sudo[125162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmqqlgpjzlfnobhezipdkssbzuelvpwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254446.6606092-115-142840803603263/AnsiballZ_setup.py'
Sep 30 17:47:27 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Sep 30 17:47:27 compute-1 sudo[125162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:27.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:27 compute-1 python3.9[125164]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:47:27 compute-1 sudo[125162]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:27 compute-1 sudo[125247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpndjxwsoswecubdrnimdaofyxyjytap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254446.6606092-115-142840803603263/AnsiballZ_dnf.py'
Sep 30 17:47:27 compute-1 sudo[125247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:28 compute-1 python3.9[125249]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:47:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:28 compute-1 ceph-mon[75484]: pgmap v251: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:47:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:28 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004210 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:28 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e8001fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:29.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:29.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:47:29 compute-1 sudo[125247]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:30 compute-1 sudo[125402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itxrhxvrkxptogidtvjfrxpahajvkxpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254449.8804164-139-255394079828338/AnsiballZ_systemd.py'
Sep 30 17:47:30 compute-1 sudo[125402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:30 compute-1 ceph-mon[75484]: pgmap v252: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:47:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:30 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:30 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e00021f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:30 compute-1 python3.9[125404]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 17:47:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:31.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:31 compute-1 sudo[125402]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:31.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:31 compute-1 sudo[125558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iucortiqcssuczuukreiuoiorxvmtdoy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759254451.2437325-155-211936512967791/AnsiballZ_edpm_nftables_snippet.py'
Sep 30 17:47:31 compute-1 sudo[125558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:31 compute-1 python3[125560]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                             rule:
                                               proto: udp
                                               dport: 4789
                                           - rule_name: 119 neutron geneve networks
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               state: ["UNTRACKED"]
                                           - rule_name: 120 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: OUTPUT
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                           - rule_name: 121 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: PREROUTING
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                            dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Sep 30 17:47:31 compute-1 sudo[125558]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:32 compute-1 sudo[125712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdlfziuvinmyfimpqxzmiesozngrskjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254452.247798-173-233170186989506/AnsiballZ_file.py'
Sep 30 17:47:32 compute-1 sudo[125712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:32 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004230 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:32 compute-1 ceph-mon[75484]: pgmap v253: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:47:32 compute-1 python3.9[125714]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:32 compute-1 sudo[125712]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:32 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80030f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:33.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:47:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:33.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:47:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:33 compute-1 sudo[125864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjofzakxafvwfedlbbjydfbmikghwhdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254453.0929298-189-152318725062386/AnsiballZ_stat.py'
Sep 30 17:47:33 compute-1 sudo[125864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:33 compute-1 python3.9[125866]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:47:33 compute-1 sudo[125864]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:34 compute-1 sudo[125943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcskcazpeypcmdhwslcwctuuyssptjjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254453.0929298-189-152318725062386/AnsiballZ_file.py'
Sep 30 17:47:34 compute-1 sudo[125943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:47:34 compute-1 python3.9[125945]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:34 compute-1 sudo[125943]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:34 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:34 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:47:34 compute-1 sudo[126096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plqncvvhywaeetqhvttojniqrkdkymcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254454.5505075-213-15333419072902/AnsiballZ_stat.py'
Sep 30 17:47:34 compute-1 sudo[126096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:34 compute-1 ceph-mon[75484]: pgmap v254: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:47:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:34 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e00021f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:35.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:35 compute-1 python3.9[126098]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:47:35 compute-1 sudo[126096]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:35.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:35 compute-1 sudo[126174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czpbwgwjbxoijpcvgdfxzuzpnufkfmpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254454.5505075-213-15333419072902/AnsiballZ_file.py'
Sep 30 17:47:35 compute-1 sudo[126174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:35 compute-1 python3.9[126176]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.k80qddj1 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:35 compute-1 sudo[126174]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:35 compute-1 ceph-mon[75484]: pgmap v255: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:47:36 compute-1 sudo[126327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgvrkkqxtsuddlkvysgfezxvqaeziqpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254455.8693225-237-41624449778235/AnsiballZ_stat.py'
Sep 30 17:47:36 compute-1 sudo[126327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:36 compute-1 python3.9[126329]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:47:36 compute-1 sudo[126327]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:36 compute-1 sudo[126406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqcwfcpedrttattaryycrwkkmswydsqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254455.8693225-237-41624449778235/AnsiballZ_file.py'
Sep 30 17:47:36 compute-1 sudo[126406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:36 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004250 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:36 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:36 compute-1 python3.9[126408]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:36 compute-1 sudo[126406]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:37.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:37.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:47:37 compute-1 sudo[126558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzufhamqvcrzybzpqawzdtgqmaqdysdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254457.1893885-263-115289926359421/AnsiballZ_command.py'
Sep 30 17:47:37 compute-1 sudo[126558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:37 compute-1 python3.9[126560]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:47:37 compute-1 sudo[126558]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:37 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:47:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:37 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:47:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:38 compute-1 sshd[1007]: Timeout before authentication for connection from 14.103.129.43 to 38.102.83.102, pid = 111978
Sep 30 17:47:38 compute-1 sudo[126712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnhhwjhhhtpxjwkfygebpaeszejtwlnj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759254458.0528312-279-13395038434911/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 17:47:38 compute-1 sudo[126712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:38 compute-1 ceph-mon[75484]: pgmap v256: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:47:38 compute-1 sshd-session[126715]: Received disconnect from 107.172.146.104 port 33478:11: Bye Bye [preauth]
Sep 30 17:47:38 compute-1 sshd-session[126715]: Disconnected from authenticating user root 107.172.146.104 port 33478 [preauth]
Sep 30 17:47:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:38 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:38 compute-1 python3[126714]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 17:47:38 compute-1 sudo[126712]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:38 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e00021f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:39.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:39.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:47:39 compute-1 sudo[126867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gssekxmmhkhqjrtoekbluylemmmdzhbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254458.9932673-295-88312410115614/AnsiballZ_stat.py'
Sep 30 17:47:39 compute-1 sudo[126867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:39 compute-1 python3.9[126869]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:47:39 compute-1 sudo[126867]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:40 compute-1 sudo[126993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxfutlysyxbbxdiqzpvzrigchziavpfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254458.9932673-295-88312410115614/AnsiballZ_copy.py'
Sep 30 17:47:40 compute-1 sudo[126993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:40 compute-1 python3.9[126995]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254458.9932673-295-88312410115614/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:40 compute-1 sudo[126993]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:40 compute-1 sshd[1007]: Timeout before authentication for connection from 110.42.70.108 to 38.102.83.102, pid = 111980
Sep 30 17:47:40 compute-1 ceph-mon[75484]: pgmap v257: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:47:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:40 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004270 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:40 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:41 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:47:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:41.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:41 compute-1 sudo[127146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsxyouyswwffjkfkletjolkhdbjvotfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254460.7278364-325-215211203842817/AnsiballZ_stat.py'
Sep 30 17:47:41 compute-1 sudo[127146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:41 compute-1 sudo[127149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:47:41 compute-1 sudo[127149]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:47:41 compute-1 sudo[127149]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:41.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:41 compute-1 python3.9[127148]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:47:41 compute-1 sudo[127146]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:41 compute-1 sudo[127296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbnkpjwjdssjsixvkeqjvkhlsqalmlho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254460.7278364-325-215211203842817/AnsiballZ_copy.py'
Sep 30 17:47:41 compute-1 sudo[127296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:42 compute-1 python3.9[127298]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254460.7278364-325-215211203842817/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:42 compute-1 sudo[127296]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:42 compute-1 sudo[127450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmrqvnnjhmpjsnqyaxpmvmirjdsbrtak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254462.3173788-355-211590570591429/AnsiballZ_stat.py'
Sep 30 17:47:42 compute-1 sudo[127450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:42 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:42 compute-1 ceph-mon[75484]: pgmap v258: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:47:42 compute-1 python3.9[127452]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:47:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:42 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:42 compute-1 sudo[127450]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:43.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:43.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:43 compute-1 sudo[127575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzgaabomjusxnbcuddhyqmqaplxpgyua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254462.3173788-355-211590570591429/AnsiballZ_copy.py'
Sep 30 17:47:43 compute-1 sudo[127575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:43 compute-1 python3.9[127577]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254462.3173788-355-211590570591429/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:43 compute-1 sudo[127575]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:44 compute-1 sudo[127728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knhifetkqhzfzbenzbadrjbdaiacdzrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254463.9250095-385-267543861777128/AnsiballZ_stat.py'
Sep 30 17:47:44 compute-1 sudo[127728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:47:44 compute-1 python3.9[127730]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:47:44 compute-1 sudo[127728]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:44 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc004290 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:44 compute-1 ceph-mon[75484]: pgmap v259: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.6 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:47:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:44 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:44 compute-1 sudo[127856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sktyrvsgmoxrjbyojzrtvsyootihgscw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254463.9250095-385-267543861777128/AnsiballZ_copy.py'
Sep 30 17:47:44 compute-1 sudo[127856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:45.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:45 compute-1 python3.9[127858]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254463.9250095-385-267543861777128/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:45 compute-1 sudo[127856]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:45 compute-1 sshd-session[127781]: Invalid user admin from 167.172.43.167 port 57824
Sep 30 17:47:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:45.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:45 compute-1 sshd-session[127781]: Received disconnect from 167.172.43.167 port 57824:11: Bye Bye [preauth]
Sep 30 17:47:45 compute-1 sshd-session[127781]: Disconnected from invalid user admin 167.172.43.167 port 57824 [preauth]
Sep 30 17:47:45 compute-1 sudo[128008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiyhgpqyxenutotejogqvbufhwhptgqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254465.385135-415-157159411759431/AnsiballZ_stat.py'
Sep 30 17:47:45 compute-1 sudo[128008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:45 compute-1 ceph-mon[75484]: pgmap v260: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 1 op/s
Sep 30 17:47:46 compute-1 python3.9[128010]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:47:46 compute-1 sudo[128008]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:46 compute-1 sudo[128135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvbqgdkhucwrxhrflkzrrxeyfsnkbaub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254465.385135-415-157159411759431/AnsiballZ_copy.py'
Sep 30 17:47:46 compute-1 sudo[128135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:46 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e00021f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:46 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:47.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:47 compute-1 python3.9[128137]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254465.385135-415-157159411759431/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:47 compute-1 sudo[128135]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:47.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:47 compute-1 sudo[128287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjsfxuzunaiypultcpbmrctykfsuhdsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254467.325137-445-208929240106731/AnsiballZ_file.py'
Sep 30 17:47:47 compute-1 sudo[128287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174747 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:47:47 compute-1 python3.9[128289]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:47 compute-1 sudo[128287]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:48 compute-1 sudo[128440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqvupdwtmhiasfvhwcslsfhcsdmagunq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254468.0927608-461-109208126278162/AnsiballZ_command.py'
Sep 30 17:47:48 compute-1 sudo[128440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:48 compute-1 ceph-mon[75484]: pgmap v261: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 1 op/s
Sep 30 17:47:48 compute-1 python3.9[128442]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:47:48 compute-1 sudo[128440]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:48 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc0042b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:48 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:49.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:47:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:49.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:47:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:47:49 compute-1 sudo[128597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsibtyoisrgzufqlvwaacesknwotyzwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254468.9725163-477-78834798391107/AnsiballZ_blockinfile.py'
Sep 30 17:47:49 compute-1 sudo[128597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:49 compute-1 python3.9[128599]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:49 compute-1 sudo[128597]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:50 compute-1 sudo[128750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brmzrzxojfjymislfzxmawrtmvzdtbgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254470.0033946-495-166442173554360/AnsiballZ_command.py'
Sep 30 17:47:50 compute-1 sudo[128750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:50 compute-1 ceph-mon[75484]: pgmap v262: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:47:50 compute-1 python3.9[128752]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:47:50 compute-1 sudo[128750]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:50 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:50 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:51.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:51 compute-1 sudo[128905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwbnryfrwdtsmfefsaswruzxfpvgbrxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254470.841811-511-262577420092753/AnsiballZ_stat.py'
Sep 30 17:47:51 compute-1 sudo[128905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:47:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:51.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:47:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:51 compute-1 python3.9[128907]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:47:51 compute-1 sudo[128905]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:51 compute-1 sudo[129060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekbwmmrewduybzclnaxhnwydysngostu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254471.6357656-527-4693662632607/AnsiballZ_command.py'
Sep 30 17:47:51 compute-1 sudo[129060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:52 compute-1 python3.9[129062]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:47:52 compute-1 sudo[129060]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:52 compute-1 ceph-mon[75484]: pgmap v263: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:47:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:47:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:52 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f80013c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:52 compute-1 sudo[129216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmtdccpdbvkzipyihkyugssjkgulzlsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254472.4595096-543-84309778818246/AnsiballZ_file.py'
Sep 30 17:47:52 compute-1 sudo[129216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:52 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:53.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:53 compute-1 python3.9[129218]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:47:53 compute-1 sudo[129216]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:53.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:47:54 compute-1 python3.9[129369]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:47:54 compute-1 ceph-mon[75484]: pgmap v264: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:47:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:54 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:54 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:55.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:47:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:55.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:47:55 compute-1 sudo[129521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lebidkibxredzuolqjcoilbtuqkthxim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254475.0488875-623-257483489743547/AnsiballZ_command.py'
Sep 30 17:47:55 compute-1 sudo[129521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:55 compute-1 python3.9[129523]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:74:f6:ca:ec" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:47:55 compute-1 ovs-vsctl[129524]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:74:f6:ca:ec external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Sep 30 17:47:55 compute-1 sudo[129521]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:56 compute-1 sudo[129675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bragseuisvlcmtguuytnldiyajaonxzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254475.91024-641-125680754825704/AnsiballZ_command.py'
Sep 30 17:47:56 compute-1 sudo[129675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:56 compute-1 python3.9[129677]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ovs-vsctl show | grep -q "Manager"
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:47:56 compute-1 sudo[129675]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:56 compute-1 ceph-mon[75484]: pgmap v265: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:47:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f80013c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:57 compute-1 sudo[129831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uikcjtjraxvmzcqajbzygjemtawfozqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254476.7249808-657-20409368070880/AnsiballZ_command.py'
Sep 30 17:47:57 compute-1 sudo[129831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:47:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:57.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:47:57 compute-1 python3.9[129833]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:47:57 compute-1 ovs-vsctl[129834]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Sep 30 17:47:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:47:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:57.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:47:57 compute-1 sudo[129831]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:58 compute-1 python3.9[129984]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:47:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:58 compute-1 ceph-mon[75484]: pgmap v266: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:47:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:58 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:58 compute-1 sudo[130138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dikfyfxqkgpdokiyjzudclzmrgthppqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254478.4632957-691-62147511635072/AnsiballZ_file.py'
Sep 30 17:47:58 compute-1 sudo[130138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:47:58 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:47:59 compute-1 python3.9[130140]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:47:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:47:59.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:59 compute-1 sudo[130138]: pam_unix(sudo:session): session closed for user root
Sep 30 17:47:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:47:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:47:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:47:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:47:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:47:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:47:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:47:59.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:47:59 compute-1 sudo[130290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezmdsjimcgiacmnxwxgicciuxprkodvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254479.2627933-707-158379475194625/AnsiballZ_stat.py'
Sep 30 17:47:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 17:47:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 2347 writes, 13K keys, 2347 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.05 MB/s
                                           Cumulative WAL: 2347 writes, 2347 syncs, 1.00 writes per sync, written: 0.03 GB, 0.05 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2347 writes, 13K keys, 2347 commit groups, 1.0 writes per commit group, ingest: 30.85 MB, 0.05 MB/s
                                           Interval WAL: 2347 writes, 2347 syncs, 1.00 writes per sync, written: 0.03 GB, 0.05 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    158.9      0.11              0.05         5    0.021       0      0       0.0       0.0
                                             L6      1/0   10.67 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.6    159.8    139.8      0.32              0.12         4    0.080     16K   1756       0.0       0.0
                                            Sum      1/0   10.67 MB   0.0      0.0     0.0      0.0       0.1      0.0       0.0   3.6    119.6    144.6      0.43              0.18         9    0.047     16K   1756       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.0       0.0   3.6    120.2    145.3      0.42              0.18         8    0.053     16K   1756       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0    159.8    139.8      0.32              0.12         4    0.080     16K   1756       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    162.2      0.11              0.05         4    0.026       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.017, interval 0.017
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.06 GB write, 0.10 MB/s write, 0.05 GB read, 0.08 MB/s read, 0.4 seconds
                                           Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.05 GB read, 0.08 MB/s read, 0.4 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55f2aa20b350#2 capacity: 304.00 MB usage: 2.14 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000125 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(133,1.98 MB,0.650295%) FilterBlock(9,51.98 KB,0.0166993%) IndexBlock(9,114.98 KB,0.0369373%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Sep 30 17:47:59 compute-1 sudo[130290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:47:59 compute-1 python3.9[130292]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:47:59 compute-1 sudo[130290]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:00 compute-1 sudo[130369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuwcppibwuialhmozvvspnslrdqhnlmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254479.2627933-707-158379475194625/AnsiballZ_file.py'
Sep 30 17:48:00 compute-1 sudo[130369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:00 compute-1 python3.9[130371]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:48:00 compute-1 sudo[130369]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:00 compute-1 ceph-mon[75484]: pgmap v267: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:48:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:00 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8002840 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:00 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:01.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:01 compute-1 sudo[130526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqxhrkocdmetftumpvspzqutzbwmhsuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254480.6600442-707-260280310892907/AnsiballZ_stat.py'
Sep 30 17:48:01 compute-1 sudo[130526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:01 compute-1 sudo[130520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:48:01 compute-1 sudo[130520]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:48:01 compute-1 sudo[130520]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:01 compute-1 sudo[130550]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Sep 30 17:48:01 compute-1 sudo[130550]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:48:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:01.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:01 compute-1 python3.9[130544]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:48:01 compute-1 sudo[130526]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:01 compute-1 sudo[130575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:48:01 compute-1 sudo[130575]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:48:01 compute-1 sudo[130575]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:01 compute-1 sudo[130550]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:01 compute-1 sudo[130695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaiqrkqygittmbjilzzvfkseqpdlnkcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254480.6600442-707-260280310892907/AnsiballZ_file.py'
Sep 30 17:48:01 compute-1 sudo[130695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:01 compute-1 sudo[130698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:48:01 compute-1 sudo[130698]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:48:01 compute-1 sudo[130698]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:01 compute-1 python3.9[130697]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:48:01 compute-1 sudo[130695]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:01 compute-1 sudo[130723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:48:01 compute-1 sudo[130723]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:48:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:02 compute-1 sudo[130928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ignkuwhsxmueflimfsgptdpaqhivaiiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254482.0551493-753-12346206592964/AnsiballZ_file.py'
Sep 30 17:48:02 compute-1 sudo[130928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:02 compute-1 sudo[130723]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:02 compute-1 ceph-mon[75484]: pgmap v268: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:48:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:48:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:48:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:48:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:48:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:48:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:48:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:48:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:48:02 compute-1 python3.9[130931]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:48:02 compute-1 sudo[130928]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:02 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:02 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:03.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:03 compute-1 sudo[131082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hexhjfafmvjwvgxjcwnnphnxhifkdjfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254482.8469908-769-218696005221391/AnsiballZ_stat.py'
Sep 30 17:48:03 compute-1 sudo[131082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:03.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:03 compute-1 python3.9[131084]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:48:03 compute-1 sudo[131082]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:48:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.547055) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254483547710, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1202, "num_deletes": 251, "total_data_size": 2825188, "memory_usage": 2858872, "flush_reason": "Manual Compaction"}
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254483561842, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 1840436, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12193, "largest_seqno": 13390, "table_properties": {"data_size": 1835193, "index_size": 2703, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11019, "raw_average_key_size": 19, "raw_value_size": 1824684, "raw_average_value_size": 3190, "num_data_blocks": 122, "num_entries": 572, "num_filter_entries": 572, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759254385, "oldest_key_time": 1759254385, "file_creation_time": 1759254483, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 14861 microseconds, and 9819 cpu microseconds.
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.561908) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 1840436 bytes OK
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.561944) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.564001) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.564024) EVENT_LOG_v1 {"time_micros": 1759254483564017, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.564054) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 2819403, prev total WAL file size 2819403, number of live WAL files 2.
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.565478) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1797KB)], [24(10MB)]
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254483565587, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 13032457, "oldest_snapshot_seqno": -1}
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4033 keys, 10322268 bytes, temperature: kUnknown
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254483646443, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 10322268, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10293101, "index_size": 17985, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10117, "raw_key_size": 102796, "raw_average_key_size": 25, "raw_value_size": 10217263, "raw_average_value_size": 2533, "num_data_blocks": 762, "num_entries": 4033, "num_filter_entries": 4033, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759254483, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.646740) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 10322268 bytes
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.648284) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 161.0 rd, 127.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 10.7 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(12.7) write-amplify(5.6) OK, records in: 4551, records dropped: 518 output_compression: NoCompression
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.648305) EVENT_LOG_v1 {"time_micros": 1759254483648292, "job": 12, "event": "compaction_finished", "compaction_time_micros": 80926, "compaction_time_cpu_micros": 45654, "output_level": 6, "num_output_files": 1, "total_output_size": 10322268, "num_input_records": 4551, "num_output_records": 4033, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254483648730, "job": 12, "event": "table_file_deletion", "file_number": 26}
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254483651204, "job": 12, "event": "table_file_deletion", "file_number": 24}
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.565346) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.651286) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.651296) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.651300) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.651305) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:48:03 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:48:03.651309) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:48:03 compute-1 sudo[131160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xajlenmvngwhfswucekazysdxwdnnndc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254482.8469908-769-218696005221391/AnsiballZ_file.py'
Sep 30 17:48:03 compute-1 sudo[131160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:03 compute-1 python3.9[131162]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:48:03 compute-1 sudo[131160]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:48:04 compute-1 sudo[131313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahfbvxpqppgpqnxstgwxtsicqagujpkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254484.160354-793-155625000193922/AnsiballZ_stat.py'
Sep 30 17:48:04 compute-1 sudo[131313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:04 compute-1 ceph-mon[75484]: pgmap v269: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:48:04 compute-1 python3.9[131315]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:48:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:04 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f80029e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:04 compute-1 sudo[131313]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:04 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:05 compute-1 sudo[131392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uinavnqtirdudydhdypftsrmnxlgmktl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254484.160354-793-155625000193922/AnsiballZ_file.py'
Sep 30 17:48:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:05.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:05 compute-1 sudo[131392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:05 compute-1 python3.9[131394]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:48:05 compute-1 sudo[131392]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:05.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:05 compute-1 sudo[131544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssrvguqkerknqcyfbzhrennmpekpyunl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254485.4664056-817-115412239256412/AnsiballZ_systemd.py'
Sep 30 17:48:05 compute-1 sudo[131544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:06 compute-1 python3.9[131546]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:48:06 compute-1 systemd[1]: Reloading.
Sep 30 17:48:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:06 compute-1 systemd-rc-local-generator[131576]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:48:06 compute-1 systemd-sysv-generator[131579]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:48:06 compute-1 ceph-mon[75484]: pgmap v270: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:06 compute-1 sudo[131544]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:06 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Sep 30 17:48:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:06 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:07.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:07 compute-1 sudo[131736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tybzyjyqawznrjzunqwrqaymfrgcgkty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254486.8534994-833-32287813286600/AnsiballZ_stat.py'
Sep 30 17:48:07 compute-1 sudo[131736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:07.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:07 compute-1 python3.9[131738]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:48:07 compute-1 sudo[131736]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:07 compute-1 sudo[131741]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:48:07 compute-1 sudo[131741]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:48:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:48:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:48:07 compute-1 sudo[131741]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:48:07 compute-1 sudo[131839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcegpegvlrokfguuoaophbifyvrocoqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254486.8534994-833-32287813286600/AnsiballZ_file.py'
Sep 30 17:48:07 compute-1 sudo[131839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:07 compute-1 python3.9[131841]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:48:08 compute-1 sudo[131839]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:08 compute-1 ceph-mon[75484]: pgmap v271: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:08 compute-1 sudo[131993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpdwgmgroflualxktrxemzqryrywgvps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254488.2542298-857-225051337038877/AnsiballZ_stat.py'
Sep 30 17:48:08 compute-1 sudo[131993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:08 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003760 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:08 compute-1 python3.9[131995]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:48:08 compute-1 sudo[131993]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:08 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:09.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:09 compute-1 sudo[132071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esnhtjepkitazrcumjwnfxvuhyroccix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254488.2542298-857-225051337038877/AnsiballZ_file.py'
Sep 30 17:48:09 compute-1 sudo[132071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:48:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:48:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:09.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:48:09 compute-1 python3.9[132073]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:48:09 compute-1 sudo[132071]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:09 compute-1 sudo[132223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjxrvzrowmyfborqnuivqwopasbgxfas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254489.5138252-881-267331952862328/AnsiballZ_systemd.py'
Sep 30 17:48:09 compute-1 sudo[132223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:10 compute-1 python3.9[132225]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:48:10 compute-1 systemd[1]: Reloading.
Sep 30 17:48:10 compute-1 systemd-sysv-generator[132259]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:48:10 compute-1 systemd-rc-local-generator[132253]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:48:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:10 compute-1 ceph-mon[75484]: pgmap v272: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:48:10 compute-1 systemd[1]: Starting Create netns directory...
Sep 30 17:48:10 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 17:48:10 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 17:48:10 compute-1 systemd[1]: Finished Create netns directory.
Sep 30 17:48:10 compute-1 sudo[132223]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:10 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:10 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:11.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:11.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:11 compute-1 sudo[132420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwosyaebxcxmtiquldaupshhoezalfgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254491.0297043-901-124004255954640/AnsiballZ_file.py'
Sep 30 17:48:11 compute-1 sudo[132420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:11 compute-1 python3.9[132422]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:48:11 compute-1 sudo[132420]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:12 compute-1 sudo[132573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjurqplfrnnoicpqrldfzknxauosaxkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254491.81441-917-15246254741042/AnsiballZ_stat.py'
Sep 30 17:48:12 compute-1 sudo[132573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:12 compute-1 python3.9[132575]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:48:12 compute-1 sudo[132573]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:12 compute-1 ceph-mon[75484]: pgmap v273: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:12 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003760 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:12 compute-1 sudo[132697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsigiyhkxrhxhsbvrteyhluyeoyvpobi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254491.81441-917-15246254741042/AnsiballZ_copy.py'
Sep 30 17:48:12 compute-1 sudo[132697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:12 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:13.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:13 compute-1 python3.9[132699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759254491.81441-917-15246254741042/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:48:13 compute-1 sudo[132697]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:13.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:13 compute-1 sudo[132849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfmlblejizxospxryebszwlzywbmposq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254493.5266788-951-140324528966716/AnsiballZ_file.py'
Sep 30 17:48:13 compute-1 sudo[132849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:14 compute-1 python3.9[132851]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:48:14 compute-1 sudo[132849]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:48:14 compute-1 ceph-mon[75484]: pgmap v274: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:48:14 compute-1 sudo[133003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvsynodfluhtxnessyeikmkbrxpaoxys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254494.4069397-967-275704333340802/AnsiballZ_stat.py'
Sep 30 17:48:14 compute-1 sudo[133003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:14 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:14 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:14 compute-1 python3.9[133005]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:48:14 compute-1 sudo[133003]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:15.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:15.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:15 compute-1 sudo[133126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksnvvhvveclkninetzkfzsefpobjeqgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254494.4069397-967-275704333340802/AnsiballZ_copy.py'
Sep 30 17:48:15 compute-1 sudo[133126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:15 compute-1 python3.9[133128]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254494.4069397-967-275704333340802/.source.json _original_basename=.mqt9_avv follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:48:15 compute-1 sudo[133126]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:16 compute-1 sudo[133279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avwnazwugahyxvrcvcejrbhpppubowyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254495.8735158-997-154810698338449/AnsiballZ_file.py'
Sep 30 17:48:16 compute-1 sudo[133279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:16 compute-1 python3.9[133281]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:48:16 compute-1 sudo[133279]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:16 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8003760 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:16 compute-1 ceph-mon[75484]: pgmap v275: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:16 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:16 compute-1 sudo[133432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuzxlhvcbkxxpvvadcxiqldkjhnauleq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254496.6940928-1013-233368178986961/AnsiballZ_stat.py'
Sep 30 17:48:16 compute-1 sudo[133432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:17.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:17 compute-1 sudo[133432]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:17.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:17 compute-1 sudo[133555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxjbpbvznxzaqzqjzawgropdhzvbriit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254496.6940928-1013-233368178986961/AnsiballZ_copy.py'
Sep 30 17:48:17 compute-1 sudo[133555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:17 compute-1 sudo[133555]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:18 compute-1 sudo[133708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiyvjccpzbrqyzpdirjnvbtyajzipvcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254498.0673103-1047-179089376441684/AnsiballZ_container_config_data.py'
Sep 30 17:48:18 compute-1 sudo[133708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:18 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:18 compute-1 python3.9[133710]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Sep 30 17:48:18 compute-1 sudo[133708]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:18 compute-1 ceph-mon[75484]: pgmap v276: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:18 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:19.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:48:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:19.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:19 compute-1 sudo[133861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnautyokqbvxwufqxwxfutatsjmqkpti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254499.0497909-1065-261798136432652/AnsiballZ_container_config_hash.py'
Sep 30 17:48:19 compute-1 sudo[133861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:19 compute-1 python3.9[133863]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 17:48:19 compute-1 sudo[133861]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:19 compute-1 ceph-mon[75484]: pgmap v277: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:48:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:20 compute-1 sudo[134014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saaiqudnqyplbyvetahfcmroionsahqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254499.9888632-1083-189109079646604/AnsiballZ_podman_container_info.py'
Sep 30 17:48:20 compute-1 sudo[134014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:20 compute-1 python3.9[134016]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 17:48:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:20 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:20 compute-1 sudo[134014]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:20 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80030f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:21.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:21.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:21 compute-1 sudo[134070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:48:21 compute-1 sudo[134070]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:48:21 compute-1 sudo[134070]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:22 compute-1 sudo[134221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkkjjzhkhlshkyqkszvdfbxorgmcnyjf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759254501.5955248-1109-45619860959153/AnsiballZ_edpm_container_manage.py'
Sep 30 17:48:22 compute-1 sudo[134221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:22 compute-1 python3[134223]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 17:48:22 compute-1 ceph-mon[75484]: pgmap v278: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:48:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:22 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 17:48:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 7445 writes, 30K keys, 7445 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 7445 writes, 1364 syncs, 5.46 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 7445 writes, 30K keys, 7445 commit groups, 1.0 writes per commit group, ingest: 20.63 MB, 0.03 MB/s
                                           Interval WAL: 7445 writes, 1364 syncs, 5.46 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f315989b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f315989b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f315989b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Sep 30 17:48:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:22 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:23.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:23.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:48:24 compute-1 ceph-mon[75484]: pgmap v279: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:48:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:24 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:24 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:25.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:25.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:26 compute-1 ceph-mon[75484]: pgmap v280: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:26 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:26 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:27.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:27.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:28 compute-1 ceph-mon[75484]: pgmap v281: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:28 compute-1 sshd-session[134325]: Received disconnect from 194.107.115.65 port 50324:11: Bye Bye [preauth]
Sep 30 17:48:28 compute-1 sshd-session[134325]: Disconnected from authenticating user root 194.107.115.65 port 50324 [preauth]
Sep 30 17:48:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:28 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:28 compute-1 podman[134237]: 2025-09-30 17:48:28.957826704 +0000 UTC m=+6.491496017 image pull ceccf5ef5dadbbaa077cd4e0c11fe3a228fcf6f1eeda53795be19675ca3a7b05 38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Sep 30 17:48:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:28 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:29.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:29 compute-1 podman[134370]: 2025-09-30 17:48:29.197426813 +0000 UTC m=+0.082895133 container create 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team)
Sep 30 17:48:29 compute-1 podman[134370]: 2025-09-30 17:48:29.1562778 +0000 UTC m=+0.041746180 image pull ceccf5ef5dadbbaa077cd4e0c11fe3a228fcf6f1eeda53795be19675ca3a7b05 38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Sep 30 17:48:29 compute-1 python3[134223]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Sep 30 17:48:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:48:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:29.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:29 compute-1 sudo[134221]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:29 compute-1 sudo[134560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvkxtufnajxugwueofyomvgxrxhyaodk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254509.631388-1125-199373707119042/AnsiballZ_stat.py'
Sep 30 17:48:30 compute-1 sudo[134560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:30 compute-1 python3.9[134562]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:48:30 compute-1 sudo[134560]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:30 compute-1 ceph-mon[75484]: pgmap v282: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:48:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:30 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:30 compute-1 sudo[134715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zijrhbjegtqimsiixbrkwonkvbwtcenj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254510.5411744-1143-18076331233816/AnsiballZ_file.py'
Sep 30 17:48:30 compute-1 sudo[134715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:30 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:31.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:31 compute-1 python3.9[134717]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:48:31 compute-1 sudo[134715]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:48:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:31.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:48:31 compute-1 sudo[134791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvfdqwzzpnzwzenikepcwkwvsektyzxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254510.5411744-1143-18076331233816/AnsiballZ_stat.py'
Sep 30 17:48:31 compute-1 sudo[134791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:31 compute-1 python3.9[134793]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:48:31 compute-1 sudo[134791]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:32 compute-1 sudo[134943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czvphqbmohjgykwjrrxywuiaygkcpovj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254511.882927-1143-253297045092142/AnsiballZ_copy.py'
Sep 30 17:48:32 compute-1 sudo[134943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:32 compute-1 ceph-mon[75484]: pgmap v283: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:32 compute-1 python3.9[134945]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759254511.882927-1143-253297045092142/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:48:32 compute-1 sudo[134943]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:32 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:32 compute-1 sudo[135020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmslxgzsudritpybswlkuqmptwaagyon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254511.882927-1143-253297045092142/AnsiballZ_systemd.py'
Sep 30 17:48:32 compute-1 sudo[135020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:32 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:33.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:33 compute-1 python3.9[135022]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 17:48:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:33 compute-1 systemd[1]: Reloading.
Sep 30 17:48:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:48:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:33.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:48:33 compute-1 systemd-rc-local-generator[135050]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:48:33 compute-1 systemd-sysv-generator[135054]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:48:33 compute-1 sudo[135020]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:34 compute-1 sudo[135132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lskckeygjrbgehfsmmkhvezpasgfbpcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254511.882927-1143-253297045092142/AnsiballZ_systemd.py'
Sep 30 17:48:34 compute-1 sudo[135132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:48:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:34 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:34 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:35.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:35.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:35 compute-1 sshd-session[135145]: Received disconnect from 107.172.146.104 port 41788:11: Bye Bye [preauth]
Sep 30 17:48:35 compute-1 sshd-session[135145]: Disconnected from authenticating user root 107.172.146.104 port 41788 [preauth]
Sep 30 17:48:35 compute-1 python3.9[135134]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:48:35 compute-1 ceph-mon[75484]: pgmap v284: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:48:35 compute-1 systemd[1]: Reloading.
Sep 30 17:48:35 compute-1 systemd-sysv-generator[135178]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:48:35 compute-1 systemd-rc-local-generator[135173]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:48:35 compute-1 systemd[1]: Starting ovn_controller container...
Sep 30 17:48:36 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:48:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623c4bc826751f7f50e3d99a3d244f80c8026103d9b778d3c9bb5f6610a18cc0/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 17:48:36 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc.
Sep 30 17:48:36 compute-1 podman[135188]: 2025-09-30 17:48:36.22227179 +0000 UTC m=+0.196127844 container init 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 17:48:36 compute-1 ovn_controller[135204]: + sudo -E kolla_set_configs
Sep 30 17:48:36 compute-1 podman[135188]: 2025-09-30 17:48:36.26740486 +0000 UTC m=+0.241260884 container start 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller)
Sep 30 17:48:36 compute-1 edpm-start-podman-container[135188]: ovn_controller
Sep 30 17:48:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:36 compute-1 systemd[1]: Created slice User Slice of UID 0.
Sep 30 17:48:36 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Sep 30 17:48:36 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Sep 30 17:48:36 compute-1 systemd[1]: Starting User Manager for UID 0...
Sep 30 17:48:36 compute-1 systemd[135239]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Sep 30 17:48:36 compute-1 edpm-start-podman-container[135187]: Creating additional drop-in dependency for "ovn_controller" (93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc)
Sep 30 17:48:36 compute-1 podman[135213]: 2025-09-30 17:48:36.400424417 +0000 UTC m=+0.113617543 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 17:48:36 compute-1 systemd[1]: 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc-514af31645c66ad7.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 17:48:36 compute-1 systemd[1]: 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc-514af31645c66ad7.service: Failed with result 'exit-code'.
Sep 30 17:48:36 compute-1 systemd[1]: Reloading.
Sep 30 17:48:36 compute-1 systemd-rc-local-generator[135290]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:48:36 compute-1 systemd[135239]: Queued start job for default target Main User Target.
Sep 30 17:48:36 compute-1 systemd-sysv-generator[135296]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:48:36 compute-1 systemd[135239]: Created slice User Application Slice.
Sep 30 17:48:36 compute-1 systemd[135239]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Sep 30 17:48:36 compute-1 systemd[135239]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 17:48:36 compute-1 systemd[135239]: Reached target Paths.
Sep 30 17:48:36 compute-1 systemd[135239]: Reached target Timers.
Sep 30 17:48:36 compute-1 systemd[135239]: Starting D-Bus User Message Bus Socket...
Sep 30 17:48:36 compute-1 systemd[135239]: Starting Create User's Volatile Files and Directories...
Sep 30 17:48:36 compute-1 systemd[135239]: Finished Create User's Volatile Files and Directories.
Sep 30 17:48:36 compute-1 systemd[135239]: Listening on D-Bus User Message Bus Socket.
Sep 30 17:48:36 compute-1 systemd[135239]: Reached target Sockets.
Sep 30 17:48:36 compute-1 systemd[135239]: Reached target Basic System.
Sep 30 17:48:36 compute-1 systemd[135239]: Reached target Main User Target.
Sep 30 17:48:36 compute-1 systemd[135239]: Startup finished in 170ms.
Sep 30 17:48:36 compute-1 systemd[1]: Started User Manager for UID 0.
Sep 30 17:48:36 compute-1 systemd[1]: Started ovn_controller container.
Sep 30 17:48:36 compute-1 systemd[1]: Started Session c1 of User root.
Sep 30 17:48:36 compute-1 sudo[135132]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:36 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003e10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:36 compute-1 ceph-mon[75484]: pgmap v285: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:36 compute-1 ovn_controller[135204]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 17:48:36 compute-1 ovn_controller[135204]: INFO:__main__:Validating config file
Sep 30 17:48:36 compute-1 ovn_controller[135204]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 17:48:36 compute-1 ovn_controller[135204]: INFO:__main__:Writing out command to execute
Sep 30 17:48:36 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Sep 30 17:48:36 compute-1 ovn_controller[135204]: ++ cat /run_command
Sep 30 17:48:36 compute-1 ovn_controller[135204]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Sep 30 17:48:36 compute-1 ovn_controller[135204]: + ARGS=
Sep 30 17:48:36 compute-1 ovn_controller[135204]: + sudo kolla_copy_cacerts
Sep 30 17:48:36 compute-1 systemd[1]: Started Session c2 of User root.
Sep 30 17:48:36 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Sep 30 17:48:36 compute-1 ovn_controller[135204]: + [[ ! -n '' ]]
Sep 30 17:48:36 compute-1 ovn_controller[135204]: + . kolla_extend_start
Sep 30 17:48:36 compute-1 ovn_controller[135204]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Sep 30 17:48:36 compute-1 ovn_controller[135204]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Sep 30 17:48:36 compute-1 ovn_controller[135204]: + umask 0022
Sep 30 17:48:36 compute-1 ovn_controller[135204]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Sep 30 17:48:36 compute-1 ovn_controller[135204]: 2025-09-30T17:48:36Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Sep 30 17:48:36 compute-1 NetworkManager[45549]: <info>  [1759254516.9679] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Sep 30 17:48:36 compute-1 NetworkManager[45549]: <info>  [1759254516.9693] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 17:48:36 compute-1 NetworkManager[45549]: <info>  [1759254516.9714] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Sep 30 17:48:36 compute-1 NetworkManager[45549]: <info>  [1759254516.9725] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Sep 30 17:48:36 compute-1 NetworkManager[45549]: <info>  [1759254516.9732] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Sep 30 17:48:36 compute-1 kernel: br-int: entered promiscuous mode
Sep 30 17:48:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:36 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003e10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:37 compute-1 systemd-udevd[135365]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 17:48:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:37.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:37 compute-1 sshd-session[135207]: Invalid user kafka from 84.51.43.58 port 34285
Sep 30 17:48:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:37.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:37 compute-1 sudo[135469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yebrxpvbskydihxiawaklbbxwcigcjvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254517.016609-1199-264816892076504/AnsiballZ_command.py'
Sep 30 17:48:37 compute-1 sudo[135469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:37 compute-1 sshd-session[135207]: Received disconnect from 84.51.43.58 port 34285:11: Bye Bye [preauth]
Sep 30 17:48:37 compute-1 sshd-session[135207]: Disconnected from invalid user kafka 84.51.43.58 port 34285 [preauth]
Sep 30 17:48:37 compute-1 python3.9[135471]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:48:37 compute-1 ovs-vsctl[135472]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Sep 30 17:48:37 compute-1 sudo[135469]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00025|main|INFO|OVS feature set changed, force recompute.
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00029|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00030|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00032|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00033|features|INFO|OVS Feature: meter_support, state: supported
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00034|features|INFO|OVS Feature: group_support, state: supported
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00035|main|INFO|OVS feature set changed, force recompute.
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Sep 30 17:48:37 compute-1 ovn_controller[135204]: 2025-09-30T17:48:37Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Sep 30 17:48:37 compute-1 NetworkManager[45549]: <info>  [1759254517.9927] manager: (ovn-b03989-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Sep 30 17:48:37 compute-1 NetworkManager[45549]: <info>  [1759254517.9936] manager: (ovn-fdf940-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Sep 30 17:48:38 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Sep 30 17:48:38 compute-1 systemd-udevd[135379]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 17:48:38 compute-1 NetworkManager[45549]: <info>  [1759254518.0173] device (genev_sys_6081): carrier: link connected
Sep 30 17:48:38 compute-1 NetworkManager[45549]: <info>  [1759254518.0177] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Sep 30 17:48:38 compute-1 sudo[135626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spujgrlgwuoqschqykbkmjujqnrziwif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254517.830879-1215-187280529128695/AnsiballZ_command.py'
Sep 30 17:48:38 compute-1 sudo[135626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:38 compute-1 python3.9[135628]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:48:38 compute-1 ovs-vsctl[135630]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Sep 30 17:48:38 compute-1 sudo[135626]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:38 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:38 compute-1 ceph-mon[75484]: pgmap v286: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:38 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:39.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:39 compute-1 sudo[135782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxzfpfnbevnpeemzvkkjnptbackavyok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254518.8247426-1243-52962119809965/AnsiballZ_command.py'
Sep 30 17:48:39 compute-1 sudo[135782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:48:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:39.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:39 compute-1 python3.9[135784]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:48:39 compute-1 ovs-vsctl[135785]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Sep 30 17:48:39 compute-1 sudo[135782]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:39 compute-1 sshd-session[124211]: Connection closed by 192.168.122.30 port 44804
Sep 30 17:48:39 compute-1 sshd-session[124208]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:48:39 compute-1 systemd[1]: session-50.scope: Deactivated successfully.
Sep 30 17:48:39 compute-1 systemd[1]: session-50.scope: Consumed 1min 13.897s CPU time.
Sep 30 17:48:39 compute-1 systemd-logind[789]: Session 50 logged out. Waiting for processes to exit.
Sep 30 17:48:39 compute-1 ceph-mon[75484]: pgmap v287: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:48:39 compute-1 systemd-logind[789]: Removed session 50.
Sep 30 17:48:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:40 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:40 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003e10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:41.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:41.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:41 compute-1 sudo[135812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:48:41 compute-1 sudo[135812]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:48:41 compute-1 sudo[135812]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:42 compute-1 ceph-mon[75484]: pgmap v288: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:42 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:42 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:43.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:48:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:43.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:48:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:48:44 compute-1 ceph-mon[75484]: pgmap v289: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:48:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:44 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:44 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003e10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:48:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:45.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:48:45 compute-1 sshd-session[135842]: Accepted publickey for zuul from 192.168.122.30 port 44518 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:48:45 compute-1 systemd-logind[789]: New session 52 of user zuul.
Sep 30 17:48:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:45 compute-1 systemd[1]: Started Session 52 of User zuul.
Sep 30 17:48:45 compute-1 sshd-session[135842]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:48:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:45.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:46 compute-1 python3.9[135996]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:48:46 compute-1 ceph-mon[75484]: pgmap v290: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:46 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:46 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:48:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:47.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:48:47 compute-1 systemd[1]: Stopping User Manager for UID 0...
Sep 30 17:48:47 compute-1 systemd[135239]: Activating special unit Exit the Session...
Sep 30 17:48:47 compute-1 systemd[135239]: Stopped target Main User Target.
Sep 30 17:48:47 compute-1 systemd[135239]: Stopped target Basic System.
Sep 30 17:48:47 compute-1 systemd[135239]: Stopped target Paths.
Sep 30 17:48:47 compute-1 systemd[135239]: Stopped target Sockets.
Sep 30 17:48:47 compute-1 systemd[135239]: Stopped target Timers.
Sep 30 17:48:47 compute-1 systemd[135239]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 17:48:47 compute-1 systemd[135239]: Closed D-Bus User Message Bus Socket.
Sep 30 17:48:47 compute-1 systemd[135239]: Stopped Create User's Volatile Files and Directories.
Sep 30 17:48:47 compute-1 systemd[135239]: Removed slice User Application Slice.
Sep 30 17:48:47 compute-1 systemd[135239]: Reached target Shutdown.
Sep 30 17:48:47 compute-1 systemd[135239]: Finished Exit the Session.
Sep 30 17:48:47 compute-1 systemd[135239]: Reached target Exit the Session.
Sep 30 17:48:47 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Sep 30 17:48:47 compute-1 systemd[1]: Stopped User Manager for UID 0.
Sep 30 17:48:47 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Sep 30 17:48:47 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Sep 30 17:48:47 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Sep 30 17:48:47 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Sep 30 17:48:47 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Sep 30 17:48:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:48:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:47.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:48:47 compute-1 sudo[136152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlvpdlqgeaghizheavvvnlpvkajqbrcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254526.9567966-49-22162550111446/AnsiballZ_file.py'
Sep 30 17:48:47 compute-1 sudo[136152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:47 compute-1 python3.9[136154]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:48:47 compute-1 sudo[136152]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:48 compute-1 sudo[136305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqsjigdvyernbhxnjyldbbplamrkphbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254527.901826-49-187136614493844/AnsiballZ_file.py'
Sep 30 17:48:48 compute-1 sudo[136305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:48 compute-1 ceph-mon[75484]: pgmap v291: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:48 compute-1 python3.9[136307]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:48:48 compute-1 sudo[136305]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:48 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:48 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:49 compute-1 sudo[136458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqypmsalqwycnvphoknzhlvzzmmtvlqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254528.7009583-49-4279333283062/AnsiballZ_file.py'
Sep 30 17:48:49 compute-1 sudo[136458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:49.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:49 compute-1 python3.9[136460]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:48:49 compute-1 sudo[136458]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:48:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:49.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:49 compute-1 sudo[136610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swhxqpakeeudbbczqeemkwmnmjmfnvcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254529.455748-49-72397978473381/AnsiballZ_file.py'
Sep 30 17:48:49 compute-1 sudo[136610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:50 compute-1 python3.9[136612]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:48:50 compute-1 sudo[136610]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:50 compute-1 ceph-mon[75484]: pgmap v292: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:48:50 compute-1 sudo[136764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-selxjtvcqpjcjzcushwjptqzqheptzss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254530.2646854-49-201031948952839/AnsiballZ_file.py'
Sep 30 17:48:50 compute-1 sudo[136764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:50 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:50 compute-1 python3.9[136766]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:48:50 compute-1 sudo[136764]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:50 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e0002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:51.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000010s ======
Sep 30 17:48:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:51.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Sep 30 17:48:51 compute-1 python3.9[136916]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:48:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:52 compute-1 sudo[137067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ututebtlkdeahkobsnufrkbexbkxfeqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254531.9184425-137-212791989276543/AnsiballZ_seboolean.py'
Sep 30 17:48:52 compute-1 sudo[137067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:52 compute-1 ceph-mon[75484]: pgmap v293: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:48:52 compute-1 python3.9[137069]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Sep 30 17:48:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:52 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d4003e10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:53 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:53.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:53 compute-1 sudo[137067]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:48:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:53.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:48:54 compute-1 python3.9[137221]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:48:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:48:54 compute-1 ceph-mon[75484]: pgmap v294: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:48:54 compute-1 python3.9[137343]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759254533.4363155-153-88928580017406/.source follow=False _original_basename=haproxy.j2 checksum=95d26d03c70c8c0693c538ed451937f0a3e9bd72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:48:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:54 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80014b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:55 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:48:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:55.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:48:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:48:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:55.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:48:55 compute-1 python3.9[137495]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:48:55 compute-1 python3.9[137616]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759254534.8948328-183-9696939807940/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:48:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:56 compute-1 ceph-mon[75484]: pgmap v295: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:56 compute-1 sudo[137769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmrcsirhdkwljdyiycajvifwzlfmlyge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254536.2890918-217-93061068769156/AnsiballZ_setup.py'
Sep 30 17:48:56 compute-1 sudo[137769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:56 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8002d90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:56 compute-1 python3.9[137771]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:48:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:57 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200004a30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:48:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:57.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:48:57 compute-1 sudo[137769]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:48:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:57.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:48:57 compute-1 sudo[137853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adrvvcmpudpivxivadfwwzebxgjynhpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254536.2890918-217-93061068769156/AnsiballZ_dnf.py'
Sep 30 17:48:57 compute-1 sudo[137853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:48:57 compute-1 python3.9[137855]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:48:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:58 compute-1 ceph-mon[75484]: pgmap v296: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:48:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:58 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200004a30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:48:59 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80014b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:48:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000010s ======
Sep 30 17:48:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:48:59.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Sep 30 17:48:59 compute-1 sudo[137853]: pam_unix(sudo:session): session closed for user root
Sep 30 17:48:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:48:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:48:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:48:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:48:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:48:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:48:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:48:59.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:49:00 compute-1 sudo[138009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckrausouwufpuagpxcgvwubolesrlbqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254539.4842439-241-102224465280307/AnsiballZ_systemd.py'
Sep 30 17:49:00 compute-1 sudo[138009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:00 compute-1 python3.9[138011]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 17:49:00 compute-1 ceph-mon[75484]: pgmap v297: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:49:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:00 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8002d90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:01 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200004a30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:01.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:49:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:01.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:49:01 compute-1 sudo[138009]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:01 compute-1 sudo[138016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:49:01 compute-1 sudo[138016]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:49:01 compute-1 sudo[138016]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:02 compute-1 python3.9[138191]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:49:02 compute-1 ceph-mon[75484]: pgmap v298: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:02 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200004a30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:02 compute-1 python3.9[138313]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759254541.814761-257-166007463265174/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:49:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:03 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80014b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:49:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:03.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:49:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:49:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:03.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:49:03 compute-1 python3.9[138463]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:49:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:49:04 compute-1 python3.9[138585]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759254543.1785166-257-204915360411377/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:49:04 compute-1 ceph-mon[75484]: pgmap v299: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:49:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:04 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8002d90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:05 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:05.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:05.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:05 compute-1 python3.9[138736]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:49:06 compute-1 python3.9[138858]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759254545.1848223-345-205897270864668/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:49:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:06 compute-1 ovn_controller[135204]: 2025-09-30T17:49:06Z|00038|memory|INFO|15852 kB peak resident set size after 29.6 seconds
Sep 30 17:49:06 compute-1 ovn_controller[135204]: 2025-09-30T17:49:06Z|00039|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Sep 30 17:49:06 compute-1 podman[138895]: 2025-09-30 17:49:06.642874123 +0000 UTC m=+0.180195087 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 17:49:06 compute-1 ceph-mon[75484]: pgmap v300: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:06 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200004a30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:07 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200004a30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:07 compute-1 python3.9[139037]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:49:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:07.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:49:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:07.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:49:07 compute-1 python3.9[139158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759254546.4655762-345-187865552165269/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:49:07 compute-1 sudo[139159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:49:07 compute-1 sudo[139159]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:49:07 compute-1 sudo[139159]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:49:07 compute-1 sudo[139200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:49:07 compute-1 sudo[139200]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:49:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:08 compute-1 python3.9[139378]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:49:08 compute-1 sudo[139200]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:08 compute-1 ceph-mon[75484]: pgmap v301: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:49:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:49:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:49:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:49:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:49:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:49:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:49:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:08 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f80037f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:08 compute-1 sudo[139543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xifqxalglwzqqcurcarftsmvrwityihf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254548.6323028-421-259953310045397/AnsiballZ_file.py'
Sep 30 17:49:08 compute-1 sudo[139543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:09 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d0000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000018s ======
Sep 30 17:49:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:09.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000018s
Sep 30 17:49:09 compute-1 python3.9[139545]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:49:09 compute-1 sudo[139543]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:49:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:49:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:09.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:49:09 compute-1 sudo[139695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztjdwghtqaiqzrgnwtzrhqcabttmxfpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254549.3929958-437-48992884291872/AnsiballZ_stat.py'
Sep 30 17:49:09 compute-1 sudo[139695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:09 compute-1 python3.9[139697]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:49:10 compute-1 sudo[139695]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:10 compute-1 sudo[139776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tglvuhkbywtckiamthdgpwvfznjmlnoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254549.3929958-437-48992884291872/AnsiballZ_file.py'
Sep 30 17:49:10 compute-1 sudo[139776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:10 compute-1 python3.9[139778]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:49:10 compute-1 sudo[139776]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:10 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:10 compute-1 ceph-mon[75484]: pgmap v302: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:49:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:11 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200004a30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:11 compute-1 sudo[139929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bokoxkslonzqwgcunnvmzcgojprnnzoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254550.673614-437-276056527415038/AnsiballZ_stat.py'
Sep 30 17:49:11 compute-1 sudo[139929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:49:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:11.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:49:11 compute-1 python3.9[139931]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:49:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:11 compute-1 sudo[139929]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:49:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:11.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:49:11 compute-1 sudo[140007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voqaxgatqrzshrlwzadvlotnlrmiyyob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254550.673614-437-276056527415038/AnsiballZ_file.py'
Sep 30 17:49:11 compute-1 sudo[140007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:11 compute-1 sshd-session[139701]: Received disconnect from 113.249.93.94 port 34700:11: Bye Bye [preauth]
Sep 30 17:49:11 compute-1 sshd-session[139701]: Disconnected from authenticating user root 113.249.93.94 port 34700 [preauth]
Sep 30 17:49:11 compute-1 python3.9[140009]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:49:11 compute-1 sudo[140007]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:12 compute-1 sudo[140160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzgmtochbkoljtgwxoakavmpgqppenqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254552.0199256-483-13568956012948/AnsiballZ_file.py'
Sep 30 17:49:12 compute-1 sudo[140160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:12 compute-1 python3.9[140162]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:49:12 compute-1 sudo[140160]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:12 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f80037f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:12 compute-1 ceph-mon[75484]: pgmap v303: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:13 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d00016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:13 compute-1 sudo[140269]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:49:13 compute-1 sudo[140269]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:49:13 compute-1 sudo[140269]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:13 compute-1 sudo[140338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-intjgkihlqqyxfbxknpnttsziakvugty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254552.7771614-499-51760150358920/AnsiballZ_stat.py'
Sep 30 17:49:13 compute-1 sudo[140338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000010s ======
Sep 30 17:49:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:13.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000010s
Sep 30 17:49:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:13 compute-1 python3.9[140340]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:49:13 compute-1 sudo[140338]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:13.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:13 compute-1 sudo[140416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdnkzlqbkfrbiffulscqwfsqewwdesme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254552.7771614-499-51760150358920/AnsiballZ_file.py'
Sep 30 17:49:13 compute-1 sudo[140416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:13 compute-1 python3.9[140418]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:49:13 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:49:13 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:49:13 compute-1 ceph-mon[75484]: pgmap v304: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:49:13 compute-1 sudo[140416]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:49:14 compute-1 sudo[140569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzdkexitlrztcszhunymqfwubauxthxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254554.0811958-523-168159451210471/AnsiballZ_stat.py'
Sep 30 17:49:14 compute-1 sudo[140569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:14 compute-1 python3.9[140571]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:49:14 compute-1 sudo[140569]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:14 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:14 compute-1 sudo[140648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heeofrcnglgbgxznqixnvivzvtstcvuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254554.0811958-523-168159451210471/AnsiballZ_file.py'
Sep 30 17:49:14 compute-1 sudo[140648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:15 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200004a30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000009s ======
Sep 30 17:49:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:15.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000009s
Sep 30 17:49:15 compute-1 python3.9[140650]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:49:15 compute-1 sudo[140648]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:15.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:15 compute-1 sudo[140800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfeyoohjgqwljycvfsnbsftqkiubqasg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254555.3777714-547-228728978754900/AnsiballZ_systemd.py'
Sep 30 17:49:15 compute-1 sudo[140800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:16 compute-1 python3.9[140802]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:49:16 compute-1 systemd[1]: Reloading.
Sep 30 17:49:16 compute-1 systemd-rc-local-generator[140827]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:49:16 compute-1 systemd-sysv-generator[140830]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:49:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:16 compute-1 ceph-mon[75484]: pgmap v305: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:16 compute-1 sudo[140800]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:16 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f80037f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:17 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d00016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:17 compute-1 sudo[140990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxjteaqygigaekardyluzyzacfylwoym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254556.757836-563-84268522762148/AnsiballZ_stat.py'
Sep 30 17:49:17 compute-1 sudo[140990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:49:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:17.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:49:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000053s ======
Sep 30 17:49:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:17.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Sep 30 17:49:17 compute-1 python3.9[140992]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:49:17 compute-1 sudo[140990]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:17 compute-1 sudo[141070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qekptrzogrsliabislxnziaaelujqhah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254556.757836-563-84268522762148/AnsiballZ_file.py'
Sep 30 17:49:17 compute-1 sudo[141070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:18 compute-1 python3.9[141072]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:49:18 compute-1 sudo[141070]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:18 compute-1 ceph-mon[75484]: pgmap v306: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:18 compute-1 sudo[141224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shkovohnmtwnxftjhlqdwaylqulgquyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254558.2913404-587-245630633926295/AnsiballZ_stat.py'
Sep 30 17:49:18 compute-1 sudo[141224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:18 compute-1 sshd-session[140993]: Received disconnect from 175.126.165.170 port 44560:11: Bye Bye [preauth]
Sep 30 17:49:18 compute-1 sshd-session[140993]: Disconnected from authenticating user root 175.126.165.170 port 44560 [preauth]
Sep 30 17:49:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:18 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:18 compute-1 python3.9[141226]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:49:18 compute-1 sudo[141224]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:19 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200004a30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:19 compute-1 sudo[141302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccpninuazijkdvoggnqwvjhpmrvnigpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254558.2913404-587-245630633926295/AnsiballZ_file.py'
Sep 30 17:49:19 compute-1 sudo[141302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:49:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:19.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:49:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:49:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:19.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:19 compute-1 python3.9[141304]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:49:19 compute-1 sudo[141302]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:20 compute-1 sudo[141455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grofqqyhtezbvrjsmrkpntjxdglzapci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254559.6998255-611-81363884737126/AnsiballZ_systemd.py'
Sep 30 17:49:20 compute-1 sudo[141455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:20 compute-1 python3.9[141457]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:49:20 compute-1 systemd[1]: Reloading.
Sep 30 17:49:20 compute-1 ceph-mon[75484]: pgmap v307: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:49:20 compute-1 systemd-rc-local-generator[141483]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:49:20 compute-1 systemd-sysv-generator[141488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:49:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:20 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f80037f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:20 compute-1 systemd[1]: Starting Create netns directory...
Sep 30 17:49:20 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 17:49:20 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 17:49:20 compute-1 systemd[1]: Finished Create netns directory.
Sep 30 17:49:20 compute-1 sudo[141455]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:21 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d00016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:21.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:49:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:21.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:49:21 compute-1 sudo[141650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulhshbgmjaimxzzwohkuzxjujocaonyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254561.2064092-631-15867017173181/AnsiballZ_file.py'
Sep 30 17:49:21 compute-1 sudo[141650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:21 compute-1 sudo[141653]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:49:21 compute-1 sudo[141653]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:49:21 compute-1 sudo[141653]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:21 compute-1 python3.9[141652]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:49:21 compute-1 sudo[141650]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:22 compute-1 sudo[141828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyfcewtwjksxrrfasxzrgkdfjypzmhse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254562.0823495-647-910655518327/AnsiballZ_stat.py'
Sep 30 17:49:22 compute-1 sudo[141828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:22 compute-1 ceph-mon[75484]: pgmap v308: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:49:22 compute-1 python3.9[141830]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:49:22 compute-1 sudo[141828]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:22 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21dc001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:23 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200004a30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:23 compute-1 sudo[141952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wssyeyluxijrkxkjyifsutlkyrwtoxyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254562.0823495-647-910655518327/AnsiballZ_copy.py'
Sep 30 17:49:23 compute-1 sudo[141952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:23.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:23 compute-1 python3.9[141954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759254562.0823495-647-910655518327/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:49:23 compute-1 sudo[141952]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:23.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:24 compute-1 sudo[142105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqrxntjmyrldbacfkpbeaksrpgiaeonf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254563.7328668-681-196394831896220/AnsiballZ_file.py'
Sep 30 17:49:24 compute-1 sudo[142105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:49:24 compute-1 python3.9[142107]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:49:24 compute-1 sudo[142105]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:24 compute-1 ceph-mon[75484]: pgmap v309: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:49:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:24 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200004a30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:24 compute-1 sudo[142258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tehqudxghpfmcrbcveqbslzljaddhdme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254564.612673-697-71513233908125/AnsiballZ_stat.py'
Sep 30 17:49:24 compute-1 sudo[142258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:25 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f80037f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:25 compute-1 python3.9[142260]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:49:25 compute-1 sudo[142258]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:49:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:25.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:49:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:25.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:25 compute-1 sudo[142381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iusjmwlibentelnhtdqibjkcsqemibsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254564.612673-697-71513233908125/AnsiballZ_copy.py'
Sep 30 17:49:25 compute-1 sudo[142381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:25 compute-1 python3.9[142383]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254564.612673-697-71513233908125/.source.json _original_basename=.baou62e6 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:49:25 compute-1 sudo[142381]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:26 compute-1 sudo[142534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlunwdwjklowdhvlbulmwxvzpeuhhsdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254566.0149229-727-174185255518565/AnsiballZ_file.py'
Sep 30 17:49:26 compute-1 sudo[142534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:26 compute-1 python3.9[142536]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:49:26 compute-1 sudo[142534]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:26 compute-1 ceph-mon[75484]: pgmap v310: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:26 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80014b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:27 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21d0002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:27 compute-1 sudo[142687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phbxpcqhxihhpfediduymsyqhdqobkkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254566.8188567-743-45275969386580/AnsiballZ_stat.py'
Sep 30 17:49:27 compute-1 sudo[142687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:49:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:27.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:49:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:27 compute-1 sudo[142687]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:27.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:27 compute-1 sudo[142810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfyrmagewawnhqzvbqvaouzpdhjpojzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254566.8188567-743-45275969386580/AnsiballZ_copy.py'
Sep 30 17:49:27 compute-1 sudo[142810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:28 compute-1 sudo[142810]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:28 compute-1 ceph-mon[75484]: pgmap v311: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:28 compute-1 sudo[142964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dprmzqiwdzohgnuyzaktccbcuzjscqta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254568.3049252-777-63570909463873/AnsiballZ_container_config_data.py'
Sep 30 17:49:28 compute-1 sudo[142964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:28 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2200004a30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:28 compute-1 python3.9[142966]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Sep 30 17:49:29 compute-1 sudo[142964]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:29 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21f8004ce0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:49:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:29.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:49:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:49:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:49:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:29.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:49:29 compute-1 sudo[143116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfvwofclhnblbrrbmvxliotoljrbljzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254569.2365487-795-138467651150000/AnsiballZ_container_config_hash.py'
Sep 30 17:49:29 compute-1 sudo[143116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:29 compute-1 python3.9[143118]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 17:49:29 compute-1 sudo[143116]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:30 compute-1 ceph-mon[75484]: pgmap v312: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:49:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:30 compute-1 sudo[143271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymndekaivcemzxxwmofkrczltqpgiodh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254570.2392428-813-9118252681784/AnsiballZ_podman_container_info.py'
Sep 30 17:49:30 compute-1 sudo[143271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:30 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80014b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:30 compute-1 python3.9[143273]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 17:49:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:31 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80014b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:49:31 compute-1 sudo[143271]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:31.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:31.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:31 compute-1 sshd-session[141576]: error: kex_exchange_identification: read: Connection timed out
Sep 30 17:49:31 compute-1 sshd-session[141576]: banner exchange: Connection from 101.126.25.120 port 37190: Connection timed out
Sep 30 17:49:31 compute-1 sshd-session[143198]: Connection closed by authenticating user root 192.210.160.141 port 42786 [preauth]
Sep 30 17:49:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:32 compute-1 sshd-session[143379]: Invalid user wifi from 107.172.146.104 port 54022
Sep 30 17:49:32 compute-1 sshd-session[143379]: Received disconnect from 107.172.146.104 port 54022:11: Bye Bye [preauth]
Sep 30 17:49:32 compute-1 sshd-session[143379]: Disconnected from invalid user wifi 107.172.146.104 port 54022 [preauth]
Sep 30 17:49:32 compute-1 sudo[143453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mefdqffkqnfypyfbkjpolffjklizlfnh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759254571.8801713-839-135759608980622/AnsiballZ_edpm_container_manage.py'
Sep 30 17:49:32 compute-1 sudo[143453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:32 compute-1 ceph-mon[75484]: pgmap v313: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:32 compute-1 python3[143455]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 17:49:32 compute-1 kernel: ganesha.nfsd[137369]: segfault at 50 ip 00007f22b204932e sp 00007f22767fb210 error 4 in libntirpc.so.5.8[7f22b202e000+2c000] likely on CPU 4 (core 0, socket 4)
Sep 30 17:49:32 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 17:49:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[106542]: 30/09/2025 17:49:32 : epoch 68dc1718 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f21e80014b0 fd 47 proxy ignored for local
Sep 30 17:49:32 compute-1 systemd[1]: Started Process Core Dump (PID 143482/UID 0).
Sep 30 17:49:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:33.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:33.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:49:35 compute-1 sshd-session[143508]: Invalid user minecraft from 194.107.115.65 port 18294
Sep 30 17:49:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:35.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:35 compute-1 sshd-session[143508]: Received disconnect from 194.107.115.65 port 18294:11: Bye Bye [preauth]
Sep 30 17:49:35 compute-1 sshd-session[143508]: Disconnected from invalid user minecraft 194.107.115.65 port 18294 [preauth]
Sep 30 17:49:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:35.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:36 compute-1 systemd-coredump[143489]: Process 106563 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 68:
                                                    #0  0x00007f22b204932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Sep 30 17:49:36 compute-1 ceph-mon[75484]: pgmap v314: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:49:36 compute-1 systemd[1]: systemd-coredump@3-143482-0.service: Deactivated successfully.
Sep 30 17:49:36 compute-1 systemd[1]: systemd-coredump@3-143482-0.service: Consumed 1.277s CPU time.
Sep 30 17:49:36 compute-1 podman[143533]: 2025-09-30 17:49:36.892459727 +0000 UTC m=+0.065156381 container died ff723cf9a994b3e1e633bcda921303e92c07ac8160ec8b96b24db0c124c4213a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Sep 30 17:49:37 compute-1 podman[143532]: 2025-09-30 17:49:37.143551933 +0000 UTC m=+0.300553138 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 17:49:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:49:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:37.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:49:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:37 compute-1 ceph-mon[75484]: pgmap v315: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:49:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:37.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:49:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:49:38 compute-1 ceph-mon[75484]: pgmap v316: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:39.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:39.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:49:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174940 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:49:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:41.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:49:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:41.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:49:41 compute-1 sudo[143595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:49:41 compute-1 sudo[143595]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:49:41 compute-1 sudo[143595]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:49:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:43.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:49:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:49:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:43.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:49:43 compute-1 ceph-mon[75484]: pgmap v317: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:49:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:45 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:49:45 compute-1 ceph-mon[75484]: pgmap v318: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:45 compute-1 ceph-mon[75484]: pgmap v319: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:49:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:45.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:45 compute-1 systemd[1]: var-lib-containers-storage-overlay-e58da4cbfa33f58f7a87c79b79d8a99e7ca4096017057b21bd84b5f97fbd5a16-merged.mount: Deactivated successfully.
Sep 30 17:49:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:49:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:45.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:49:45 compute-1 podman[143533]: 2025-09-30 17:49:45.847879848 +0000 UTC m=+9.020576502 container remove ff723cf9a994b3e1e633bcda921303e92c07ac8160ec8b96b24db0c124c4213a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Sep 30 17:49:45 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 17:49:46 compute-1 podman[143469]: 2025-09-30 17:49:46.198229864 +0000 UTC m=+13.427057215 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 17:49:46 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 17:49:46 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 2.282s CPU time.
Sep 30 17:49:46 compute-1 ceph-mon[75484]: pgmap v320: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:49:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:46 compute-1 podman[143715]: 2025-09-30 17:49:46.450865551 +0000 UTC m=+0.085278638 container create 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930)
Sep 30 17:49:46 compute-1 podman[143715]: 2025-09-30 17:49:46.407281947 +0000 UTC m=+0.041695084 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 17:49:46 compute-1 python3[143455]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 17:49:46 compute-1 sudo[143453]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:47 compute-1 sudo[143903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkwcqkykwyhxtklyqziyrfxylfxyiwfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254586.8400493-855-172433928643116/AnsiballZ_stat.py'
Sep 30 17:49:47 compute-1 sudo[143903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:49:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:47.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:49:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:47.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:47 compute-1 python3.9[143905]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:49:47 compute-1 sudo[143903]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/174948 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:49:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [ALERT] 272/174948 (4) : backend 'backend' has no server available!
Sep 30 17:49:48 compute-1 sudo[144058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlccuujqivqvrdqfdtgggtfmyojfpfsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254587.8235211-873-232062859321761/AnsiballZ_file.py'
Sep 30 17:49:48 compute-1 sudo[144058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:48 compute-1 python3.9[144060]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:49:48 compute-1 sudo[144058]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:48 compute-1 ceph-mon[75484]: pgmap v321: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:49:48 compute-1 sudo[144135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahpewzkqdfqakenizzkyfldwlmojzcgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254587.8235211-873-232062859321761/AnsiballZ_stat.py'
Sep 30 17:49:48 compute-1 sudo[144135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:48 compute-1 python3.9[144137]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:49:48 compute-1 sudo[144135]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:49:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:49.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:49:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:49 compute-1 sudo[144289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbacexjnsaytrtiaggetpaqncmcvwkpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254588.9264922-873-186332709734/AnsiballZ_copy.py'
Sep 30 17:49:49 compute-1 sudo[144289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:49:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:49.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:49:49 compute-1 python3.9[144291]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759254588.9264922-873-186332709734/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:49:49 compute-1 sudo[144289]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:49 compute-1 sudo[144366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjhjdoklwianswsnzzuqgzafhrmgtzto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254588.9264922-873-186332709734/AnsiballZ_systemd.py'
Sep 30 17:49:49 compute-1 sudo[144366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:50 compute-1 sshd-session[144138]: Invalid user integral from 84.51.43.58 port 60702
Sep 30 17:49:50 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:49:50 compute-1 python3.9[144368]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 17:49:50 compute-1 sshd-session[144138]: Received disconnect from 84.51.43.58 port 60702:11: Bye Bye [preauth]
Sep 30 17:49:50 compute-1 sshd-session[144138]: Disconnected from invalid user integral 84.51.43.58 port 60702 [preauth]
Sep 30 17:49:50 compute-1 systemd[1]: Reloading.
Sep 30 17:49:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:50 compute-1 systemd-rc-local-generator[144397]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:49:50 compute-1 systemd-sysv-generator[144400]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:49:50 compute-1 ceph-mon[75484]: pgmap v322: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:49:50 compute-1 sudo[144366]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:50 compute-1 sshd-session[144223]: Invalid user debian from 192.210.160.141 port 60274
Sep 30 17:49:50 compute-1 sudo[144479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdwndeiqjckqoplxdyrccxnkgfnjtjlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254588.9264922-873-186332709734/AnsiballZ_systemd.py'
Sep 30 17:49:50 compute-1 sudo[144479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:49:50 compute-1 sshd-session[144223]: Connection closed by invalid user debian 192.210.160.141 port 60274 [preauth]
Sep 30 17:49:51 compute-1 python3.9[144481]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:49:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:49:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:51.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:49:51 compute-1 systemd[1]: Reloading.
Sep 30 17:49:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:51 compute-1 systemd-sysv-generator[144515]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:49:51 compute-1 systemd-rc-local-generator[144510]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:49:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:49:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:51.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:49:51 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Sep 30 17:49:52 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:49:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f94e2d0bffd7b4ce70493aa70abfb91ba4a9cf58a10ff3e786240dacba10611a/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Sep 30 17:49:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f94e2d0bffd7b4ce70493aa70abfb91ba4a9cf58a10ff3e786240dacba10611a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 17:49:52 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09.
Sep 30 17:49:52 compute-1 podman[144521]: 2025-09-30 17:49:52.24405898 +0000 UTC m=+0.552852803 container init 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: + sudo -E kolla_set_configs
Sep 30 17:49:52 compute-1 podman[144521]: 2025-09-30 17:49:52.284406181 +0000 UTC m=+0.593199994 container start 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Sep 30 17:49:52 compute-1 edpm-start-podman-container[144521]: ovn_metadata_agent
Sep 30 17:49:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Validating config file
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Copying service configuration files
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Writing out command to execute
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Setting permission for /var/lib/neutron
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Setting permission for /var/lib/neutron/external
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: ++ cat /run_command
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: + CMD=neutron-ovn-metadata-agent
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: + ARGS=
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: + sudo kolla_copy_cacerts
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: + [[ ! -n '' ]]
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: + . kolla_extend_start
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: Running command: 'neutron-ovn-metadata-agent'
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: + umask 0022
Sep 30 17:49:52 compute-1 podman[144545]: 2025-09-30 17:49:52.395191787 +0000 UTC m=+0.088483311 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 17:49:52 compute-1 ovn_metadata_agent[144538]: + exec neutron-ovn-metadata-agent
Sep 30 17:49:52 compute-1 edpm-start-podman-container[144520]: Creating additional drop-in dependency for "ovn_metadata_agent" (64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09)
Sep 30 17:49:52 compute-1 systemd[1]: Reloading.
Sep 30 17:49:52 compute-1 systemd-sysv-generator[144617]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:49:52 compute-1 systemd-rc-local-generator[144611]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:49:52 compute-1 ceph-mon[75484]: pgmap v323: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:49:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:49:52 compute-1 sshd-session[144626]: Invalid user admin from 167.71.248.239 port 50330
Sep 30 17:49:52 compute-1 systemd[1]: Started ovn_metadata_agent container.
Sep 30 17:49:52 compute-1 sshd-session[144626]: Connection closed by invalid user admin 167.71.248.239 port 50330 [preauth]
Sep 30 17:49:52 compute-1 sudo[144479]: pam_unix(sudo:session): session closed for user root
Sep 30 17:49:53 compute-1 sshd-session[135845]: Connection closed by 192.168.122.30 port 44518
Sep 30 17:49:53 compute-1 sshd-session[135842]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:49:53 compute-1 systemd[1]: session-52.scope: Deactivated successfully.
Sep 30 17:49:53 compute-1 systemd[1]: session-52.scope: Consumed 1min 4.500s CPU time.
Sep 30 17:49:53 compute-1 systemd-logind[789]: Session 52 logged out. Waiting for processes to exit.
Sep 30 17:49:53 compute-1 systemd-logind[789]: Removed session 52.
Sep 30 17:49:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:53.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:49:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:53.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.260 144543 INFO neutron.common.config [-] Logging enabled!
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.260 144543 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.260 144543 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.261 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.261 144543 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.261 144543 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.261 144543 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.261 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.261 144543 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.261 144543 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.261 144543 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.262 144543 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.263 144543 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.263 144543 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.263 144543 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.263 144543 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.263 144543 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.263 144543 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.263 144543 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.263 144543 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.263 144543 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.263 144543 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.263 144543 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.263 144543 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.263 144543 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.264 144543 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.265 144543 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.102 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.266 144543 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.267 144543 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.268 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.269 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.270 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.271 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.272 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.273 144543 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.274 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.275 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.276 144543 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.277 144543 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.278 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.279 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.280 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.281 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.281 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.281 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.281 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.281 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.281 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.281 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.281 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.281 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.281 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.281 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.281 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.282 144543 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.283 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.284 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.285 144543 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Sep 30 17:49:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.357 144543 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.357 144543 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.357 144543 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.357 144543 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.358 144543 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.371 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 81ab3fff-d6d4-4262-9f24-1b212876e52c (UUID: 81ab3fff-d6d4-4262-9f24-1b212876e52c) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.398 144543 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.398 144543 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.398 144543 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.398 144543 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.398 144543 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.402 144543 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.407 144543 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.413 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '81ab3fff-d6d4-4262-9f24-1b212876e52c'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], external_ids={}, name=81ab3fff-d6d4-4262-9f24-1b212876e52c, nb_cfg_timestamp=1759254525984, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 17:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:54.416 144543 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp2cbgcvl_/privsep.sock']
Sep 30 17:49:54 compute-1 ceph-mon[75484]: pgmap v324: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:49:55 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Sep 30 17:49:55 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:49:55 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:55.195 144543 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 17:49:55 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:55.195 144543 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp2cbgcvl_/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Sep 30 17:49:55 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:55.033 144666 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 17:49:55 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:55.037 144666 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 17:49:55 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:55.038 144666 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Sep 30 17:49:55 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:55.039 144666 INFO oslo.privsep.daemon [-] privsep daemon running as pid 144666
Sep 30 17:49:55 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:55.198 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f8a73b-5414-4f1a-867e-af3fe8584ea7]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 17:49:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:55.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:49:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:55.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:49:55 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:55.656 144666 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 17:49:55 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:55.656 144666 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 17:49:55 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:55.656 144666 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 17:49:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:56.165 144666 INFO oslo_service.backend [-] Loading backend: eventlet
Sep 30 17:49:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:56.169 144666 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Sep 30 17:49:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:56.206 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[e5f15ce3-b178-4340-8551-fc5c859b0159]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 17:49:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:56.207 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, column=external_ids, values=({'neutron:ovn-metadata-id': 'eb3bcc57-5d92-5eae-a481-1f99a3f3f1be'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 17:49:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:56.217 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 17:49:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:49:56.224 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 17:49:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:56 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 4.
Sep 30 17:49:56 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:49:56 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 2.282s CPU time.
Sep 30 17:49:56 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:49:56 compute-1 ceph-mon[75484]: pgmap v325: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:49:56 compute-1 podman[144720]: 2025-09-30 17:49:56.827732313 +0000 UTC m=+0.101777389 container create b3c793c64d186704effe09e0fd5329a0e13b83e69e37da05c2ddcc90a1bb22d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:49:56 compute-1 podman[144720]: 2025-09-30 17:49:56.748930693 +0000 UTC m=+0.022975759 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:49:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/530c700a22afaf6f519c2d3535124a430755c8667caedefee4f3d1ebd45c4e0f/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 17:49:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/530c700a22afaf6f519c2d3535124a430755c8667caedefee4f3d1ebd45c4e0f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:49:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/530c700a22afaf6f519c2d3535124a430755c8667caedefee4f3d1ebd45c4e0f/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:49:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/530c700a22afaf6f519c2d3535124a430755c8667caedefee4f3d1ebd45c4e0f/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:49:56 compute-1 podman[144720]: 2025-09-30 17:49:56.982015213 +0000 UTC m=+0.256060359 container init b3c793c64d186704effe09e0fd5329a0e13b83e69e37da05c2ddcc90a1bb22d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Sep 30 17:49:56 compute-1 podman[144720]: 2025-09-30 17:49:56.991483476 +0000 UTC m=+0.265528562 container start b3c793c64d186704effe09e0fd5329a0e13b83e69e37da05c2ddcc90a1bb22d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 17:49:56 compute-1 bash[144720]: b3c793c64d186704effe09e0fd5329a0e13b83e69e37da05c2ddcc90a1bb22d6
Sep 30 17:49:57 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:49:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:49:57 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 17:49:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:49:57 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 17:49:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:49:57 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 17:49:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:49:57 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 17:49:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:49:57 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 17:49:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:49:57 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 17:49:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:49:57 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 17:49:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:49:57 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:49:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:49:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:57.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:49:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:57.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:58 compute-1 sshd-session[144779]: Accepted publickey for zuul from 192.168.122.30 port 37688 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:49:58 compute-1 systemd-logind[789]: New session 53 of user zuul.
Sep 30 17:49:58 compute-1 ceph-mon[75484]: pgmap v326: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:49:58 compute-1 systemd[1]: Started Session 53 of User zuul.
Sep 30 17:49:58 compute-1 sshd-session[144779]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:49:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:49:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:49:59.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:49:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:49:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:49:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:49:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:49:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:49:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:49:59.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:49:59 compute-1 python3.9[144932]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:50:00 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:50:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:00 compute-1 sudo[145088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkyqynftbxejstglrxutfrdnttyygpfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254600.2051675-49-17931195287785/AnsiballZ_command.py'
Sep 30 17:50:00 compute-1 sudo[145088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:00 compute-1 ceph-mon[75484]: pgmap v327: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 341 B/s wr, 1 op/s
Sep 30 17:50:00 compute-1 ceph-mon[75484]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Sep 30 17:50:00 compute-1 ceph-mon[75484]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Sep 30 17:50:00 compute-1 ceph-mon[75484]:      osd.1 observed slow operation indications in BlueStore
Sep 30 17:50:00 compute-1 python3.9[145090]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:50:01 compute-1 sudo[145088]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:01.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:01.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:01 compute-1 sudo[145194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:50:01 compute-1 sudo[145194]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:50:01 compute-1 sudo[145194]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:02 compute-1 sudo[145280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyacbjlsgjgnatadgwzidrliwhfbxato ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254601.4110103-71-274334753667677/AnsiballZ_systemd_service.py'
Sep 30 17:50:02 compute-1 sudo[145280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:02 compute-1 python3.9[145282]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 17:50:02 compute-1 systemd[1]: Reloading.
Sep 30 17:50:02 compute-1 systemd-rc-local-generator[145304]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:50:02 compute-1 systemd-sysv-generator[145310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:50:02 compute-1 ceph-mon[75484]: pgmap v328: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 341 B/s wr, 1 op/s
Sep 30 17:50:02 compute-1 sudo[145280]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:03 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:50:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:03 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:50:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:03.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:03.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:03 compute-1 python3.9[145468]: ansible-ansible.builtin.service_facts Invoked
Sep 30 17:50:03 compute-1 network[145485]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 17:50:03 compute-1 network[145486]: 'network-scripts' will be removed from distribution in near future.
Sep 30 17:50:03 compute-1 network[145487]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 17:50:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:04 compute-1 ceph-mon[75484]: pgmap v329: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.7 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:50:05 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:50:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:05.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:05.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175006 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:50:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:06 compute-1 ceph-mon[75484]: pgmap v330: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.7 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:50:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:07.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:07.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:50:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:08 compute-1 ceph-mon[75484]: pgmap v331: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.7 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:50:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:09.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:09 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:09.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:50:10 compute-1 ceph-mon[75484]: pgmap v332: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.9 KiB/s rd, 853 B/s wr, 2 op/s
Sep 30 17:50:10 compute-1 sudo[145772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdhlviwtsqbznghzedszkqdsnopkqiyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254609.8293202-109-128813119857717/AnsiballZ_systemd_service.py'
Sep 30 17:50:10 compute-1 sudo[145772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:10 compute-1 python3.9[145774]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:50:10 compute-1 sudo[145772]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:10 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f60f0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:10 compute-1 sudo[145929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjlxqwwpkfplhkwlzdpodqwkvuthmrfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254610.6814895-109-215946230164547/AnsiballZ_systemd_service.py'
Sep 30 17:50:10 compute-1 sudo[145929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:11 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f60e8001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:11.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:11 compute-1 python3.9[145931]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:50:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:11 compute-1 sudo[145929]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:11.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:11 compute-1 sshd-session[145618]: Connection closed by authenticating user root 192.210.160.141 port 43684 [preauth]
Sep 30 17:50:11 compute-1 sudo[146082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzsngklemdkvdhozguztuatzxunfllsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254611.4943273-109-56266729268088/AnsiballZ_systemd_service.py'
Sep 30 17:50:11 compute-1 sudo[146082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:12 compute-1 python3.9[146084]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:50:12 compute-1 sudo[146082]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:12 compute-1 sudo[146237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slxsyodgyhzcxfcyhbiopvkbvhehrkef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254612.4568446-109-255995221762553/AnsiballZ_systemd_service.py'
Sep 30 17:50:12 compute-1 sudo[146237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175012 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:50:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:12 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f60e8001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:13 compute-1 ceph-mon[75484]: pgmap v333: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Sep 30 17:50:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:13 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f60e8001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:13 compute-1 python3.9[146239]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:50:13 compute-1 sudo[146240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:50:13 compute-1 sudo[146240]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:50:13 compute-1 sudo[146240]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:13 compute-1 sudo[146237]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:13.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:13 compute-1 sudo[146267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:50:13 compute-1 sudo[146267]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:50:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:13 compute-1 podman[146264]: 2025-09-30 17:50:13.363439474 +0000 UTC m=+0.147257360 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 17:50:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:13.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:13 compute-1 sudo[146486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qasreprorkdtjthgwojbfediuraeijwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254613.414268-109-98828334611367/AnsiballZ_systemd_service.py'
Sep 30 17:50:13 compute-1 sudo[146486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:13 compute-1 sudo[146267]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:13 compute-1 ceph-mon[75484]: pgmap v334: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 70 KiB/s rd, 511 B/s wr, 116 op/s
Sep 30 17:50:13 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:50:13 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:50:14 compute-1 python3.9[146489]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:50:14 compute-1 sudo[146486]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:14 compute-1 sudo[146656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csuwrykdexknuyofvvephplpgdlhwaoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254614.3029888-109-144141258335724/AnsiballZ_systemd_service.py'
Sep 30 17:50:14 compute-1 sudo[146656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:14 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f60d0000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:14 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:50:14 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:50:14 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:50:14 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:50:14 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:50:14 compute-1 python3.9[146658]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:50:15 compute-1 sudo[146656]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:15 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f60c4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:15 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:50:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:15.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:15.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:15 compute-1 sudo[146809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noewtjqhhekhiapkngokjgfocnakyyuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254615.1703212-109-36247448280319/AnsiballZ_systemd_service.py'
Sep 30 17:50:15 compute-1 sudo[146809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:15 compute-1 python3.9[146811]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:50:15 compute-1 sudo[146809]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:15 compute-1 ceph-mon[75484]: pgmap v335: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 69 KiB/s rd, 85 B/s wr, 115 op/s
Sep 30 17:50:16 compute-1 sshd-session[146648]: Received disconnect from 216.10.242.161 port 42288:11: Bye Bye [preauth]
Sep 30 17:50:16 compute-1 sshd-session[146648]: Disconnected from authenticating user root 216.10.242.161 port 42288 [preauth]
Sep 30 17:50:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:16 compute-1 sudo[146964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfjkatlzhzdaovktunugyaqceopckpqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254616.2217526-213-116939591980519/AnsiballZ_file.py'
Sep 30 17:50:16 compute-1 sudo[146964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:16 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f60c4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:16 compute-1 python3.9[146966]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:16 compute-1 sudo[146964]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:17 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f60e8001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:17.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:17 compute-1 sudo[147116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjavwzrvdosruxfzhkbekhmxruheaqqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254617.0827794-213-10990924283090/AnsiballZ_file.py'
Sep 30 17:50:17 compute-1 sudo[147116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:17.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:17 compute-1 python3.9[147118]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:17 compute-1 sudo[147116]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:18 compute-1 sudo[147269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjkzvsekxgbqgfgwkgemgujizojehuep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254617.963698-213-133628590349661/AnsiballZ_file.py'
Sep 30 17:50:18 compute-1 sudo[147269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:18 compute-1 python3.9[147271]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:18 compute-1 sudo[147269]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:18 compute-1 ceph-mon[75484]: pgmap v336: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 69 KiB/s rd, 85 B/s wr, 115 op/s
Sep 30 17:50:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[144735]: 30/09/2025 17:50:18 : epoch 68dc1845 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f60d0001ac0 fd 38 proxy ignored for local
Sep 30 17:50:18 compute-1 kernel: ganesha.nfsd[145607]: segfault at 50 ip 00007f619a2e332e sp 00007f615b7fd210 error 4 in libntirpc.so.5.8[7f619a2c8000+2c000] likely on CPU 4 (core 0, socket 4)
Sep 30 17:50:18 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 17:50:18 compute-1 systemd[1]: Started Process Core Dump (PID 147326/UID 0).
Sep 30 17:50:19 compute-1 sudo[147424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daeaeovizuztyfcgepxoqwhlisrhvfdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254618.8389945-213-56007145706711/AnsiballZ_file.py'
Sep 30 17:50:19 compute-1 sudo[147424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:19.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:19.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:19 compute-1 python3.9[147426]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:19 compute-1 sudo[147424]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:20 compute-1 sudo[147577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szkywvyazaphpdezznxleuirdgsngszm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254619.964267-213-158245382373139/AnsiballZ_file.py'
Sep 30 17:50:20 compute-1 sudo[147577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:21 compute-1 python3.9[147579]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:21 compute-1 sudo[147577]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:21.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:21.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:21 compute-1 systemd-coredump[147341]: Process 144739 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 45:
                                                    #0  0x00007f619a2e332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Sep 30 17:50:21 compute-1 sudo[147731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bullsjyunrenznqfcdnazchduggytiju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254621.3499362-213-69863085539003/AnsiballZ_file.py'
Sep 30 17:50:21 compute-1 sudo[147731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:21 compute-1 systemd[1]: systemd-coredump@4-147326-0.service: Deactivated successfully.
Sep 30 17:50:21 compute-1 systemd[1]: systemd-coredump@4-147326-0.service: Consumed 1.322s CPU time.
Sep 30 17:50:21 compute-1 podman[147737]: 2025-09-30 17:50:21.941157804 +0000 UTC m=+0.059771363 container died b3c793c64d186704effe09e0fd5329a0e13b83e69e37da05c2ddcc90a1bb22d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:50:22 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:50:22 compute-1 python3.9[147733]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:22 compute-1 sudo[147731]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:22 compute-1 ceph-mon[75484]: pgmap v337: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 69 KiB/s rd, 85 B/s wr, 115 op/s
Sep 30 17:50:22 compute-1 systemd[1]: var-lib-containers-storage-overlay-530c700a22afaf6f519c2d3535124a430755c8667caedefee4f3d1ebd45c4e0f-merged.mount: Deactivated successfully.
Sep 30 17:50:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:22 compute-1 sudo[147829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:50:22 compute-1 sudo[147829]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:50:22 compute-1 sudo[147829]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:22 compute-1 podman[147737]: 2025-09-30 17:50:22.529378953 +0000 UTC m=+0.647992502 container remove b3c793c64d186704effe09e0fd5329a0e13b83e69e37da05c2ddcc90a1bb22d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:50:22 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 17:50:22 compute-1 sudo[147885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:50:22 compute-1 sudo[147885]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:50:22 compute-1 sudo[147885]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:22 compute-1 podman[147845]: 2025-09-30 17:50:22.613102724 +0000 UTC m=+0.158868732 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 17:50:22 compute-1 sudo[147986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxboubvhugzaoojvrxyqmvsplbvzcrqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254622.3290033-213-7603067915250/AnsiballZ_file.py'
Sep 30 17:50:22 compute-1 sudo[147986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:22 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 17:50:22 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.802s CPU time.
Sep 30 17:50:22 compute-1 python3.9[147996]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:22 compute-1 sudo[147986]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:50:23 compute-1 ceph-mon[75484]: pgmap v338: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 69 KiB/s rd, 0 B/s wr, 115 op/s
Sep 30 17:50:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:50:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:50:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:23.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:23 compute-1 sudo[148153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxybswzhdbpdkjcgqfaevyrxdlfywcyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254623.1063192-313-248276305427966/AnsiballZ_file.py'
Sep 30 17:50:23 compute-1 sudo[148153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:23.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:23 compute-1 python3.9[148155]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:23 compute-1 sudo[148153]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:24 compute-1 sudo[148306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pilseololdtodossactfdaajsguxoiao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254623.8110123-313-80782590998524/AnsiballZ_file.py'
Sep 30 17:50:24 compute-1 sudo[148306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:24 compute-1 ceph-mon[75484]: pgmap v339: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 69 KiB/s rd, 0 B/s wr, 115 op/s
Sep 30 17:50:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:24 compute-1 python3.9[148308]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:24 compute-1 sudo[148306]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:24 compute-1 sudo[148461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vihgwmwqjiwuwkmnjvbidrxifbkpawzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254624.5366135-313-7091603330757/AnsiballZ_file.py'
Sep 30 17:50:24 compute-1 sudo[148461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:25 compute-1 python3.9[148463]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:25 compute-1 sudo[148461]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:25.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:25.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:25 compute-1 sudo[148613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iixuuthdzvaeyspbsiuwbrvvtvsreicg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254625.1634102-313-36443526810217/AnsiballZ_file.py'
Sep 30 17:50:25 compute-1 sudo[148613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:25 compute-1 python3.9[148615]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:25 compute-1 sudo[148613]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:26 compute-1 sudo[148766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qivbtdadxlxdwoyfuosyufdbqnslzvth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254625.9106524-313-76868123837534/AnsiballZ_file.py'
Sep 30 17:50:26 compute-1 sudo[148766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:26 compute-1 python3.9[148768]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:26 compute-1 sudo[148766]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:26 compute-1 ceph-mon[75484]: pgmap v340: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:50:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175026 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:50:26 compute-1 sudo[148919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnufafyneigiplniahwvskvhdaqmyibs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254626.5865414-313-74303218872507/AnsiballZ_file.py'
Sep 30 17:50:26 compute-1 sudo[148919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:26 compute-1 sshd-session[148441]: Received disconnect from 113.249.93.94 port 51814:11: Bye Bye [preauth]
Sep 30 17:50:26 compute-1 sshd-session[148441]: Disconnected from authenticating user root 113.249.93.94 port 51814 [preauth]
Sep 30 17:50:27 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:50:27 compute-1 python3.9[148921]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:27 compute-1 sudo[148919]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:27.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:27.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:27 compute-1 sudo[149071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcqkbcumulxmrkanetzdittgvxkwmpgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254627.2552016-313-101067154451409/AnsiballZ_file.py'
Sep 30 17:50:27 compute-1 sudo[149071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:27 compute-1 python3.9[149073]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:50:27 compute-1 sudo[149071]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:28 compute-1 sudo[149224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noorgnyhqllhiiqhkgyredmiuxblficj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254628.1000755-415-76978543938851/AnsiballZ_command.py'
Sep 30 17:50:28 compute-1 sudo[149224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:28 compute-1 ceph-mon[75484]: pgmap v341: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:50:28 compute-1 python3.9[149226]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:50:28 compute-1 sudo[149224]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:29.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:29.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:29 compute-1 python3.9[149381]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 17:50:30 compute-1 sudo[149533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rblrayimiqbllfuowuzzheshrfvwbpsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254629.9072852-451-45822806028392/AnsiballZ_systemd_service.py'
Sep 30 17:50:30 compute-1 sudo[149533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:30 compute-1 ceph-mon[75484]: pgmap v342: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:50:30 compute-1 python3.9[149535]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 17:50:30 compute-1 sshd-session[149329]: Received disconnect from 175.126.165.170 port 60300:11: Bye Bye [preauth]
Sep 30 17:50:30 compute-1 sshd-session[149329]: Disconnected from authenticating user root 175.126.165.170 port 60300 [preauth]
Sep 30 17:50:30 compute-1 systemd[1]: Reloading.
Sep 30 17:50:30 compute-1 systemd-rc-local-generator[149569]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:50:30 compute-1 systemd-sysv-generator[149573]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:50:30 compute-1 sshd-session[149539]: Invalid user test from 107.172.146.104 port 41796
Sep 30 17:50:30 compute-1 sudo[149533]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:30 compute-1 sshd-session[149539]: Received disconnect from 107.172.146.104 port 41796:11: Bye Bye [preauth]
Sep 30 17:50:30 compute-1 sshd-session[149539]: Disconnected from invalid user test 107.172.146.104 port 41796 [preauth]
Sep 30 17:50:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:31.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:31 compute-1 sudo[149726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwtjytjijrgxijrpwdiqjgrtyercwdef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254631.0873165-467-133512021465463/AnsiballZ_command.py'
Sep 30 17:50:31 compute-1 sudo[149726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:31.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:31 compute-1 python3.9[149728]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:50:31 compute-1 sudo[149726]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:50:32 compute-1 sshd-session[149406]: Connection closed by authenticating user root 192.210.160.141 port 35218 [preauth]
Sep 30 17:50:32 compute-1 sudo[149880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uabybcuxojktlzwtpsjybzqozmydpggn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254631.8263211-467-99174538804580/AnsiballZ_command.py'
Sep 30 17:50:32 compute-1 sudo[149880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:32 compute-1 python3.9[149882]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:50:32 compute-1 sudo[149880]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:32 compute-1 ceph-mon[75484]: pgmap v343: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:50:32 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 5.
Sep 30 17:50:32 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:50:32 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.802s CPU time.
Sep 30 17:50:32 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:50:32 compute-1 sudo[150046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlnrcytjecyapopthywfqjtfneapvzln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254632.613816-467-60830241947096/AnsiballZ_command.py'
Sep 30 17:50:32 compute-1 sudo[150046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:33 compute-1 podman[150083]: 2025-09-30 17:50:33.164940683 +0000 UTC m=+0.050367243 container create f1bd5a4ecb15466736615b89c0d9425f916279f1fe7ab0004a2200b396263154 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:50:33 compute-1 python3.9[150051]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:50:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/278807b49319efd3aaf6934853ab45e872d95cedac1a1ac57cdf9717c82dad26/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 17:50:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/278807b49319efd3aaf6934853ab45e872d95cedac1a1ac57cdf9717c82dad26/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:50:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/278807b49319efd3aaf6934853ab45e872d95cedac1a1ac57cdf9717c82dad26/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:50:33 compute-1 sudo[150046]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/278807b49319efd3aaf6934853ab45e872d95cedac1a1ac57cdf9717c82dad26/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:50:33 compute-1 podman[150083]: 2025-09-30 17:50:33.225091638 +0000 UTC m=+0.110518228 container init f1bd5a4ecb15466736615b89c0d9425f916279f1fe7ab0004a2200b396263154 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Sep 30 17:50:33 compute-1 podman[150083]: 2025-09-30 17:50:33.13999941 +0000 UTC m=+0.025426010 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:50:33 compute-1 podman[150083]: 2025-09-30 17:50:33.238556932 +0000 UTC m=+0.123983492 container start f1bd5a4ecb15466736615b89c0d9425f916279f1fe7ab0004a2200b396263154 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:50:33 compute-1 bash[150083]: f1bd5a4ecb15466736615b89c0d9425f916279f1fe7ab0004a2200b396263154
Sep 30 17:50:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:33 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 17:50:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:33 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 17:50:33 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:50:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:33 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 17:50:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:33 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 17:50:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:33 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 17:50:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:33 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 17:50:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000034s ======
Sep 30 17:50:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:33.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Sep 30 17:50:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:33 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 17:50:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:33 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:50:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:33.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:33 compute-1 sudo[150290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeauzgtyhjwtnbkraxihyqkrqdfueisk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254633.372014-467-168445767824078/AnsiballZ_command.py'
Sep 30 17:50:33 compute-1 sudo[150290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:33 compute-1 python3.9[150292]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:50:33 compute-1 sudo[150290]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:34 compute-1 sudo[150444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufsfrxipjxqbvdsehkngxbpnjwzaogyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254634.1529124-467-222544650700810/AnsiballZ_command.py'
Sep 30 17:50:34 compute-1 sudo[150444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:34 compute-1 ceph-mon[75484]: pgmap v344: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:50:34 compute-1 python3.9[150446]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:50:34 compute-1 sudo[150444]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:35 compute-1 sudo[150598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuqgfouhthufdxxwynhezsfgevtvqyij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254634.8968492-467-196281826503028/AnsiballZ_command.py'
Sep 30 17:50:35 compute-1 sudo[150598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:35.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:35 compute-1 python3.9[150600]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:50:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:35.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:35 compute-1 sudo[150598]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:35 compute-1 sudo[150751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iezjmzxweeorjnbnxxzodzhdapudwbps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254635.662953-467-129002203553752/AnsiballZ_command.py'
Sep 30 17:50:35 compute-1 sudo[150751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:36 compute-1 python3.9[150754]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:50:36 compute-1 sudo[150751]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:36 compute-1 ceph-mon[75484]: pgmap v345: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:50:37 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:50:37 compute-1 sudo[150907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wehofekawekxsfgospailhwxvvxxttlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254636.6335723-575-148676199383332/AnsiballZ_getent.py'
Sep 30 17:50:37 compute-1 sudo[150907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:37.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:37 compute-1 python3.9[150909]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Sep 30 17:50:37 compute-1 sudo[150907]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:37.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:50:38 compute-1 sudo[151061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwrebdvimudnvhvmvzxbvtzqhwkbdcfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254637.6686609-591-1322976535347/AnsiballZ_group.py'
Sep 30 17:50:38 compute-1 sudo[151061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:38 compute-1 python3.9[151063]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 17:50:38 compute-1 groupadd[151064]: group added to /etc/group: name=libvirt, GID=42473
Sep 30 17:50:38 compute-1 groupadd[151064]: group added to /etc/gshadow: name=libvirt
Sep 30 17:50:38 compute-1 groupadd[151064]: new group: name=libvirt, GID=42473
Sep 30 17:50:38 compute-1 sudo[151061]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:38 compute-1 ceph-mon[75484]: pgmap v346: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:50:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000033s ======
Sep 30 17:50:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:39.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Sep 30 17:50:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:39 compute-1 sudo[151220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbatvidjfyyrhfrwnnhtnqyttqdwtbua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254638.8350058-607-138079462101537/AnsiballZ_user.py'
Sep 30 17:50:39 compute-1 sudo[151220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:39 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:50:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:39 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:50:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:39.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:39 compute-1 python3.9[151222]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 17:50:39 compute-1 useradd[151224]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 17:50:39 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 17:50:39 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 17:50:39 compute-1 sudo[151220]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:40 compute-1 sudo[151383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nflsuixrhfsedzxdjqujczokkdyvohbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254640.3046432-629-104308681496701/AnsiballZ_setup.py'
Sep 30 17:50:40 compute-1 sudo[151383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:40 compute-1 python3.9[151385]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:50:40 compute-1 ceph-mon[75484]: pgmap v347: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 17:50:41 compute-1 sudo[151383]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:41.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:50:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:41.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:50:41 compute-1 sudo[151467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgkulluugkmwqgdgsuaylyfeukurpjvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254640.3046432-629-104308681496701/AnsiballZ_dnf.py'
Sep 30 17:50:41 compute-1 sudo[151467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:50:41 compute-1 python3.9[151469]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:50:42 compute-1 ceph-mon[75484]: pgmap v348: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Sep 30 17:50:42 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:50:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:42 compute-1 sudo[151472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:50:42 compute-1 sudo[151472]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:50:42 compute-1 sudo[151472]: pam_unix(sudo:session): session closed for user root
Sep 30 17:50:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:43.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:43.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:43 compute-1 podman[151501]: 2025-09-30 17:50:43.588725857 +0000 UTC m=+0.126796016 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 17:50:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:44 compute-1 ceph-mon[75484]: pgmap v349: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:50:44 compute-1 sshd-session[151528]: Received disconnect from 194.107.115.65 port 42768:11: Bye Bye [preauth]
Sep 30 17:50:44 compute-1 sshd-session[151528]: Disconnected from authenticating user root 194.107.115.65 port 42768 [preauth]
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:50:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:45.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:50:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:45.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 17:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:50:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175046 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:50:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [ALERT] 272/175046 (4) : backend 'backend' has no server available!
Sep 30 17:50:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:46 compute-1 ceph-mon[75484]: pgmap v350: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 767 B/s wr, 1 op/s
Sep 30 17:50:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:46 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba30000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:47 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:50:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:47 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:50:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:47.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:50:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:50:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:47.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:50:47 compute-1 sshd-session[150886]: error: kex_exchange_identification: read: Connection timed out
Sep 30 17:50:47 compute-1 sshd-session[150886]: banner exchange: Connection from 14.103.129.43 port 33520: Connection timed out
Sep 30 17:50:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:48 compute-1 ceph-mon[75484]: pgmap v351: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 767 B/s wr, 1 op/s
Sep 30 17:50:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175048 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:50:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:48 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:49 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba30000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:49.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:50:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:49.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:50:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:50 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:50 compute-1 ceph-mon[75484]: pgmap v352: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.3 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:50:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:51 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:50:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:51.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:50:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:51.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:52 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:50:52 compute-1 ceph-mon[75484]: pgmap v353: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 17:50:52 compute-1 sshd-session[151561]: Connection closed by authenticating user root 192.210.160.141 port 59530 [preauth]
Sep 30 17:50:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:52 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:50:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:53 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba30000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:53.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:50:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:53.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:50:53 compute-1 podman[151565]: 2025-09-30 17:50:53.546002279 +0000 UTC m=+0.082407657 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 17:50:54 compute-1 ceph-mon[75484]: pgmap v354: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Sep 30 17:50:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:50:54.286 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 17:50:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:50:54.287 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 17:50:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:50:54.287 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 17:50:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:54 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:55 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:50:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:55.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:50:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:50:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:55.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:50:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:55 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:50:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:56 compute-1 ceph-mon[75484]: pgmap v355: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:50:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:56 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:50:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:57 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba30000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:57.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:57.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:58 compute-1 ceph-mon[75484]: pgmap v356: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:50:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:58 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:50:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:58 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:50:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:58 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:50:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:58 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:50:59 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:50:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:50:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:50:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:50:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:50:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:50:59.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:50:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:50:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:50:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:50:59.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:00 compute-1 ceph-mon[75484]: pgmap v357: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 853 B/s rd, 426 B/s wr, 1 op/s
Sep 30 17:51:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:00 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:01 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba300091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:01.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:01.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:01 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:51:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:51:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:02 compute-1 ceph-mon[75484]: pgmap v358: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Sep 30 17:51:02 compute-1 sudo[151762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:51:02 compute-1 sudo[151762]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:51:02 compute-1 sudo[151762]: pam_unix(sudo:session): session closed for user root
Sep 30 17:51:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:02 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba300091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:03 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:03.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:03 compute-1 sshd-session[151756]: Invalid user spooler from 84.51.43.58 port 54769
Sep 30 17:51:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:03.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:03 compute-1 sshd-session[151756]: Received disconnect from 84.51.43.58 port 54769:11: Bye Bye [preauth]
Sep 30 17:51:03 compute-1 sshd-session[151756]: Disconnected from invalid user spooler 84.51.43.58 port 54769 [preauth]
Sep 30 17:51:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:04 compute-1 ceph-mon[75484]: pgmap v359: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.7 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:51:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:04 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:05 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:05.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:05.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:06 compute-1 ceph-mon[75484]: pgmap v360: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.3 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:51:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:06 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba300091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:51:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:07 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:07.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:07.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:51:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175108 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:51:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:08 compute-1 ceph-mon[75484]: pgmap v361: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.3 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:51:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:08 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:09 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:09.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:09.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:10 compute-1 ceph-mon[75484]: pgmap v362: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:51:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:10 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba3000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:10 compute-1 sshd-session[151806]: Connection closed by authenticating user root 192.210.160.141 port 54144 [preauth]
Sep 30 17:51:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:11 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:11.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:11.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:12 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:51:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:12 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:12 compute-1 ceph-mon[75484]: pgmap v363: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1023 B/s rd, 341 B/s wr, 1 op/s
Sep 30 17:51:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:13 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:13.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:13.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:14 compute-1 ceph-mon[75484]: pgmap v364: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.1 KiB/s rd, 341 B/s wr, 1 op/s
Sep 30 17:51:14 compute-1 podman[151814]: 2025-09-30 17:51:14.255197767 +0000 UTC m=+0.114989008 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 17:51:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:14 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba3000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:15 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:15.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:15.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:16 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:16 compute-1 ceph-mon[75484]: pgmap v365: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:51:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:17 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:17 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:51:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:17.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:17.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:17 compute-1 kernel: SELinux:  Converting 2771 SID table entries...
Sep 30 17:51:17 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 17:51:17 compute-1 kernel: SELinux:  policy capability open_perms=1
Sep 30 17:51:17 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 17:51:17 compute-1 kernel: SELinux:  policy capability always_check_network=0
Sep 30 17:51:17 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 17:51:17 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 17:51:17 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 17:51:18 compute-1 ceph-mon[75484]: pgmap v366: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:51:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:18 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba3000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:19 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:19.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:19.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:20 compute-1 ceph-mon[75484]: pgmap v367: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:51:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:20 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9fc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:21 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:21.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:21.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:22 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:51:22 compute-1 sshd-session[151857]: Invalid user sanjay from 107.172.146.104 port 45526
Sep 30 17:51:22 compute-1 sshd-session[151857]: Received disconnect from 107.172.146.104 port 45526:11: Bye Bye [preauth]
Sep 30 17:51:22 compute-1 sshd-session[151857]: Disconnected from invalid user sanjay 107.172.146.104 port 45526 [preauth]
Sep 30 17:51:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:22 compute-1 ceph-mon[75484]: pgmap v368: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:51:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:51:22 compute-1 sudo[151860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:51:22 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Sep 30 17:51:22 compute-1 sudo[151860]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:51:22 compute-1 sudo[151860]: pam_unix(sudo:session): session closed for user root
Sep 30 17:51:22 compute-1 sudo[151883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:51:22 compute-1 sudo[151883]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:51:22 compute-1 sudo[151883]: pam_unix(sudo:session): session closed for user root
Sep 30 17:51:22 compute-1 sudo[151892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 17:51:22 compute-1 sudo[151892]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:51:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:22 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba3000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:23 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:23.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Sep 30 17:51:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:23.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:23 compute-1 podman[152009]: 2025-09-30 17:51:23.568572607 +0000 UTC m=+0.094912925 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Sep 30 17:51:23 compute-1 podman[152009]: 2025-09-30 17:51:23.691123909 +0000 UTC m=+0.217464237 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 17:51:23 compute-1 podman[152043]: 2025-09-30 17:51:23.860062863 +0000 UTC m=+0.087867515 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 17:51:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175124 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:51:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:24 compute-1 podman[152146]: 2025-09-30 17:51:24.401584145 +0000 UTC m=+0.085921872 container exec 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 17:51:24 compute-1 podman[152146]: 2025-09-30 17:51:24.416142329 +0000 UTC m=+0.100480076 container exec_died 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 17:51:24 compute-1 ceph-mon[75484]: pgmap v369: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:51:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:24 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:24 compute-1 podman[152236]: 2025-09-30 17:51:24.937145936 +0000 UTC m=+0.069835348 container exec f1bd5a4ecb15466736615b89c0d9425f916279f1fe7ab0004a2200b396263154 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:51:24 compute-1 podman[152236]: 2025-09-30 17:51:24.951936835 +0000 UTC m=+0.084626207 container exec_died f1bd5a4ecb15466736615b89c0d9425f916279f1fe7ab0004a2200b396263154 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 17:51:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:25 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:25 compute-1 podman[152300]: 2025-09-30 17:51:25.23967827 +0000 UTC m=+0.095398599 container exec 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 17:51:25 compute-1 podman[152300]: 2025-09-30 17:51:25.257233954 +0000 UTC m=+0.112954253 container exec_died 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 17:51:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:25.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:25.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:25 compute-1 podman[152365]: 2025-09-30 17:51:25.607979221 +0000 UTC m=+0.089785907 container exec 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, version=2.2.4, vcs-type=git, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, release=1793, build-date=2023-02-22T09:23:20, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Sep 30 17:51:25 compute-1 podman[152365]: 2025-09-30 17:51:25.629110422 +0000 UTC m=+0.110917058 container exec_died 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, architecture=x86_64, description=keepalived for Ceph, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, version=2.2.4, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Sep 30 17:51:25 compute-1 sudo[151892]: pam_unix(sudo:session): session closed for user root
Sep 30 17:51:26 compute-1 sudo[152432]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:51:26 compute-1 sudo[152432]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:51:26 compute-1 sudo[152432]: pam_unix(sudo:session): session closed for user root
Sep 30 17:51:26 compute-1 sudo[152457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:51:26 compute-1 sudo[152457]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:51:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:26 compute-1 ceph-mon[75484]: pgmap v370: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:51:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:51:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:51:26 compute-1 sudo[152457]: pam_unix(sudo:session): session closed for user root
Sep 30 17:51:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:26 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba3000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:26 compute-1 kernel: SELinux:  Converting 2771 SID table entries...
Sep 30 17:51:26 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 17:51:26 compute-1 kernel: SELinux:  policy capability open_perms=1
Sep 30 17:51:26 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 17:51:26 compute-1 kernel: SELinux:  policy capability always_check_network=0
Sep 30 17:51:26 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 17:51:26 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 17:51:26 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 17:51:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:27 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:27 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:51:27 compute-1 sshd-session[152098]: Received disconnect from 113.249.93.94 port 1694:11: Bye Bye [preauth]
Sep 30 17:51:27 compute-1 sshd-session[152098]: Disconnected from authenticating user root 113.249.93.94 port 1694 [preauth]
Sep 30 17:51:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:27.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:51:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:27.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:51:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:28 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Sep 30 17:51:28 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:51:28 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:51:28 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:51:28 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:51:28 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:51:28 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:51:28 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:51:28 compute-1 sshd-session[152522]: Invalid user consulta1 from 14.225.167.110 port 54238
Sep 30 17:51:28 compute-1 sshd-session[152522]: Received disconnect from 14.225.167.110 port 54238:11: Bye Bye [preauth]
Sep 30 17:51:28 compute-1 sshd-session[152522]: Disconnected from invalid user consulta1 14.225.167.110 port 54238 [preauth]
Sep 30 17:51:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:28 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:29 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:29.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:29.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:29 compute-1 ceph-mon[75484]: pgmap v371: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:51:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:30 compute-1 ceph-mon[75484]: pgmap v372: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:51:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:30 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba3000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:31 compute-1 sshd-session[152526]: Connection closed by authenticating user root 192.210.160.141 port 37700 [preauth]
Sep 30 17:51:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:31 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:31.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:31.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:51:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:32 compute-1 ceph-mon[75484]: pgmap v373: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:51:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:32 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:33 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:33.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:33.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:33 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:51:33 compute-1 sudo[152532]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:51:33 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Sep 30 17:51:33 compute-1 sudo[152532]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:51:33 compute-1 sudo[152532]: pam_unix(sudo:session): session closed for user root
Sep 30 17:51:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:34 compute-1 ceph-mon[75484]: pgmap v374: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:51:34 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:51:34 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:51:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:34 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba3000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:35 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:35.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:35.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:36 compute-1 sshd-session[152559]: Invalid user user5 from 103.153.190.105 port 55196
Sep 30 17:51:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:36 compute-1 sshd[1007]: Timeout before authentication for connection from 110.42.70.108 to 38.102.83.102, pid = 143505
Sep 30 17:51:36 compute-1 sshd-session[152559]: Received disconnect from 103.153.190.105 port 55196:11: Bye Bye [preauth]
Sep 30 17:51:36 compute-1 sshd-session[152559]: Disconnected from invalid user user5 103.153.190.105 port 55196 [preauth]
Sep 30 17:51:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:36 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:51:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:36 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:51:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:36 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:51:36 compute-1 ceph-mon[75484]: pgmap v375: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:51:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:36 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9fc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:37 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:51:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:37 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:37.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:37.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:51:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:38 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba3000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:39 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:39 compute-1 ceph-mon[75484]: pgmap v376: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:51:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:39.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:39.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:40 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:51:40 compute-1 sshd-session[152846]: Invalid user user from 175.126.165.170 port 48176
Sep 30 17:51:40 compute-1 ceph-mon[75484]: pgmap v377: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:51:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:40 compute-1 sshd-session[152846]: Received disconnect from 175.126.165.170 port 48176:11: Bye Bye [preauth]
Sep 30 17:51:40 compute-1 sshd-session[152846]: Disconnected from invalid user user 175.126.165.170 port 48176 [preauth]
Sep 30 17:51:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:40 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9fc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:41 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:41.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:41.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:42 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:51:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:42 compute-1 ceph-mon[75484]: pgmap v378: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:51:42 compute-1 sudo[154486]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:51:42 compute-1 sudo[154486]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:51:42 compute-1 sudo[154486]: pam_unix(sudo:session): session closed for user root
Sep 30 17:51:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:42 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba3000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:43 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:43.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:43.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:44 compute-1 podman[155176]: 2025-09-30 17:51:44.622751608 +0000 UTC m=+0.140065256 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true)
Sep 30 17:51:44 compute-1 ceph-mon[75484]: pgmap v379: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.7 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:51:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:44 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9fc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:45 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:45.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:45.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175146 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:51:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:46 compute-1 ceph-mon[75484]: pgmap v380: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:51:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:46 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba3000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:47 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:51:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:47 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba3000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:47.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:47.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:47 compute-1 sshd-session[156225]: Invalid user work from 194.107.115.65 port 10730
Sep 30 17:51:47 compute-1 sshd-session[156225]: Received disconnect from 194.107.115.65 port 10730:11: Bye Bye [preauth]
Sep 30 17:51:47 compute-1 sshd-session[156225]: Disconnected from invalid user work 194.107.115.65 port 10730 [preauth]
Sep 30 17:51:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:48 compute-1 ceph-mon[75484]: pgmap v381: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:51:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:48 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:49 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:49.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:49.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:49 compute-1 sshd-session[156779]: Connection closed by authenticating user root 192.210.160.141 port 58784 [preauth]
Sep 30 17:51:50 compute-1 ceph-mon[75484]: pgmap v382: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:51:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:50 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:51 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:51.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:51.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:52 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:51:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:52 compute-1 ceph-mon[75484]: pgmap v383: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:51:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:51:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:52 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9f4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:53 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:53.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:53.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:51:54.289 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 17:51:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:51:54.289 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 17:51:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:51:54.289 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 17:51:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:54 compute-1 ceph-mon[75484]: pgmap v384: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:51:54 compute-1 podman[159920]: 2025-09-30 17:51:54.57312162 +0000 UTC m=+0.111710610 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 17:51:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:54 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:55 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:55.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:55.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:56 compute-1 ceph-mon[75484]: pgmap v385: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:51:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:56 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9f40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:51:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:57 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:51:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:57.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:51:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:57.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:51:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:58 compute-1 ceph-mon[75484]: pgmap v386: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:51:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:58 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:51:59 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:51:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:51:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:51:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:51:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:51:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:51:59.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:51:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:51:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:51:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:51:59.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:00 compute-1 ceph-mon[75484]: pgmap v387: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:52:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:52:00 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9f40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:52:01 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:01.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:52:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:01.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:52:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:52:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:02 compute-1 ceph-mon[75484]: pgmap v388: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:52:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:52:02 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:02 compute-1 sudo[163746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:52:02 compute-1 sudo[163746]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:52:02 compute-1 sudo[163746]: pam_unix(sudo:session): session closed for user root
Sep 30 17:52:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:52:03 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000055s ======
Sep 30 17:52:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:03.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Sep 30 17:52:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:03.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:52:04 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9f40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:04 compute-1 ceph-mon[75484]: pgmap v389: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:52:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:52:05 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba20004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:05.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:05.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:05 compute-1 ceph-mon[75484]: pgmap v390: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:52:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:52:06 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:52:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:52:07 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:52:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:07.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:07.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:08 compute-1 ceph-mon[75484]: pgmap v391: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:52:08 compute-1 sshd-session[165518]: Invalid user deploy from 192.210.160.141 port 58358
Sep 30 17:52:08 compute-1 sshd-session[165518]: Connection closed by invalid user deploy 192.210.160.141 port 58358 [preauth]
Sep 30 17:52:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:52:08 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9f4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:52:09 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba200040b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:09.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:09.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:10 compute-1 ceph-mon[75484]: pgmap v392: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:52:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:52:10 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba0c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:52:11 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fba2c004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:11.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:11.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:12 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:52:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:12 compute-1 ceph-mon[75484]: pgmap v393: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:52:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[150099]: 30/09/2025 17:52:12 : epoch 68dc1869 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9f4002b10 fd 38 proxy ignored for local
Sep 30 17:52:12 compute-1 kernel: ganesha.nfsd[158183]: segfault at 50 ip 00007fbadf17032e sp 00007fba97ffe210 error 4 in libntirpc.so.5.8[7fbadf155000+2c000] likely on CPU 2 (core 0, socket 2)
Sep 30 17:52:12 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 17:52:12 compute-1 systemd[1]: Started Process Core Dump (PID 168417/UID 0).
Sep 30 17:52:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:13.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:13.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:14 compute-1 systemd-coredump[168430]: Process 150104 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 60:
                                                    #0  0x00007fbadf17032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Sep 30 17:52:14 compute-1 systemd[1]: systemd-coredump@5-168417-0.service: Deactivated successfully.
Sep 30 17:52:14 compute-1 systemd[1]: systemd-coredump@5-168417-0.service: Consumed 1.571s CPU time.
Sep 30 17:52:14 compute-1 podman[169357]: 2025-09-30 17:52:14.680173078 +0000 UTC m=+0.032229639 container died f1bd5a4ecb15466736615b89c0d9425f916279f1fe7ab0004a2200b396263154 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Sep 30 17:52:14 compute-1 systemd[1]: var-lib-containers-storage-overlay-278807b49319efd3aaf6934853ab45e872d95cedac1a1ac57cdf9717c82dad26-merged.mount: Deactivated successfully.
Sep 30 17:52:14 compute-1 podman[169357]: 2025-09-30 17:52:14.896986848 +0000 UTC m=+0.249043369 container remove f1bd5a4ecb15466736615b89c0d9425f916279f1fe7ab0004a2200b396263154 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Sep 30 17:52:14 compute-1 ceph-mon[75484]: pgmap v394: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:52:14 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 17:52:15 compute-1 podman[169455]: 2025-09-30 17:52:15.036073569 +0000 UTC m=+0.115299713 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Sep 30 17:52:15 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 17:52:15 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.759s CPU time.
Sep 30 17:52:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:15.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:15.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:16 compute-1 sshd-session[169409]: Invalid user embedded from 84.51.43.58 port 56625
Sep 30 17:52:16 compute-1 sshd-session[169409]: Received disconnect from 84.51.43.58 port 56625:11: Bye Bye [preauth]
Sep 30 17:52:16 compute-1 sshd-session[169409]: Disconnected from invalid user embedded 84.51.43.58 port 56625 [preauth]
Sep 30 17:52:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:16 compute-1 sshd-session[169511]: Received disconnect from 107.172.146.104 port 55134:11: Bye Bye [preauth]
Sep 30 17:52:16 compute-1 sshd-session[169511]: Disconnected from authenticating user root 107.172.146.104 port 55134 [preauth]
Sep 30 17:52:16 compute-1 ceph-mon[75484]: pgmap v395: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:52:17 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:52:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:17.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:17.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:18 compute-1 ceph-mon[75484]: pgmap v396: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:52:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175218 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:52:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:52:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:19.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:52:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:19.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:20 compute-1 ceph-mon[75484]: pgmap v397: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:52:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:21.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:21.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:22 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:52:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:22 compute-1 ceph-mon[75484]: pgmap v398: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:52:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:52:23 compute-1 sudo[169532]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:52:23 compute-1 sudo[169532]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:52:23 compute-1 sudo[169532]: pam_unix(sudo:session): session closed for user root
Sep 30 17:52:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:52:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:23.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:52:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:23.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:24 compute-1 ceph-mon[75484]: pgmap v399: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:52:25 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 6.
Sep 30 17:52:25 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:52:25 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.759s CPU time.
Sep 30 17:52:25 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:52:25 compute-1 podman[169559]: 2025-09-30 17:52:25.317674533 +0000 UTC m=+0.103695167 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 17:52:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:25.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:25 compute-1 podman[169620]: 2025-09-30 17:52:25.436173483 +0000 UTC m=+0.034427499 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:52:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:25.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:25 compute-1 podman[169620]: 2025-09-30 17:52:25.663307424 +0000 UTC m=+0.261561380 container create a2d70f43c6feac4adc415a27cb9930ea9dd1ce238522a7abdba041dd435d826c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:52:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edcdb108f17847ac15e2b2053673846ab0f8770411cf45373d92d5f2034fede5/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 17:52:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edcdb108f17847ac15e2b2053673846ab0f8770411cf45373d92d5f2034fede5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:52:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edcdb108f17847ac15e2b2053673846ab0f8770411cf45373d92d5f2034fede5/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:52:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edcdb108f17847ac15e2b2053673846ab0f8770411cf45373d92d5f2034fede5/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:52:25 compute-1 podman[169620]: 2025-09-30 17:52:25.762179059 +0000 UTC m=+0.360433065 container init a2d70f43c6feac4adc415a27cb9930ea9dd1ce238522a7abdba041dd435d826c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:52:25 compute-1 podman[169620]: 2025-09-30 17:52:25.770220189 +0000 UTC m=+0.368474155 container start a2d70f43c6feac4adc415a27cb9930ea9dd1ce238522a7abdba041dd435d826c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Sep 30 17:52:25 compute-1 bash[169620]: a2d70f43c6feac4adc415a27cb9930ea9dd1ce238522a7abdba041dd435d826c
Sep 30 17:52:25 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:52:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:25 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 17:52:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:25 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 17:52:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:25 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 17:52:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:25 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 17:52:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:25 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 17:52:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:25 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 17:52:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:25 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 17:52:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:25 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:52:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:26 compute-1 ceph-mon[75484]: pgmap v400: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:52:27 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:52:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:27.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:27.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:28 compute-1 sshd-session[169680]: Connection closed by authenticating user root 192.210.160.141 port 56952 [preauth]
Sep 30 17:52:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:29.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:29.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:31.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:31.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:31 compute-1 sshd-session[169691]: Invalid user vas from 167.172.43.167 port 40750
Sep 30 17:52:32 compute-1 sshd-session[169691]: Received disconnect from 167.172.43.167 port 40750:11: Bye Bye [preauth]
Sep 30 17:52:32 compute-1 sshd-session[169691]: Disconnected from invalid user vas 167.172.43.167 port 40750 [preauth]
Sep 30 17:52:32 compute-1 ceph-mon[75484]: pgmap v401: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:52:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:52:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:33.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:33 compute-1 kernel: SELinux:  Converting 2772 SID table entries...
Sep 30 17:52:33 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 17:52:33 compute-1 kernel: SELinux:  policy capability open_perms=1
Sep 30 17:52:33 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 17:52:33 compute-1 kernel: SELinux:  policy capability always_check_network=0
Sep 30 17:52:33 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 17:52:33 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 17:52:33 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 17:52:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:33.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:34 compute-1 sudo[169699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:52:34 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Sep 30 17:52:34 compute-1 sudo[169699]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:52:34 compute-1 sudo[169699]: pam_unix(sudo:session): session closed for user root
Sep 30 17:52:34 compute-1 ceph-mon[75484]: pgmap v402: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:52:34 compute-1 sudo[169724]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:52:34 compute-1 sudo[169724]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:52:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:34 compute-1 sudo[169724]: pam_unix(sudo:session): session closed for user root
Sep 30 17:52:35 compute-1 ceph-mon[75484]: pgmap v403: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:52:35 compute-1 ceph-mon[75484]: pgmap v404: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 17:52:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:52:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:35.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:52:35 compute-1 sshd[1007]: drop connection #0 from [110.42.70.108]:34356 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 17:52:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:35 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:52:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:35 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:52:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:35.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:36 compute-1 ceph-mon[75484]: pgmap v405: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Sep 30 17:52:36 compute-1 groupadd[169790]: group added to /etc/group: name=dnsmasq, GID=992
Sep 30 17:52:36 compute-1 groupadd[169790]: group added to /etc/gshadow: name=dnsmasq
Sep 30 17:52:36 compute-1 groupadd[169790]: new group: name=dnsmasq, GID=992
Sep 30 17:52:36 compute-1 useradd[169797]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Sep 30 17:52:36 compute-1 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Sep 30 17:52:36 compute-1 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Sep 30 17:52:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:37 compute-1 groupadd[169811]: group added to /etc/group: name=clevis, GID=991
Sep 30 17:52:37 compute-1 groupadd[169811]: group added to /etc/gshadow: name=clevis
Sep 30 17:52:37 compute-1 groupadd[169811]: new group: name=clevis, GID=991
Sep 30 17:52:37 compute-1 useradd[169818]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Sep 30 17:52:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:37.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:37.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:37 compute-1 usermod[169828]: add 'clevis' to group 'tss'
Sep 30 17:52:37 compute-1 usermod[169828]: add 'clevis' to shadow group 'tss'
Sep 30 17:52:37 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:52:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:52:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:52:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:52:38 compute-1 ceph-mon[75484]: pgmap v406: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Sep 30 17:52:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:52:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:52:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:52:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:52:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:52:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:52:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:52:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:39.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:39.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:40 compute-1 ceph-mon[75484]: pgmap v407: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:52:41 compute-1 polkitd[6874]: Reloading rules
Sep 30 17:52:41 compute-1 polkitd[6874]: Collecting garbage unconditionally...
Sep 30 17:52:41 compute-1 polkitd[6874]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 17:52:41 compute-1 polkitd[6874]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 17:52:41 compute-1 polkitd[6874]: Finished loading, compiling and executing 4 rules
Sep 30 17:52:41 compute-1 polkitd[6874]: Reloading rules
Sep 30 17:52:41 compute-1 polkitd[6874]: Collecting garbage unconditionally...
Sep 30 17:52:41 compute-1 polkitd[6874]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 17:52:41 compute-1 polkitd[6874]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 17:52:41 compute-1 polkitd[6874]: Finished loading, compiling and executing 4 rules
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 17:52:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:41.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:41.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:42 compute-1 ceph-mon[75484]: pgmap v408: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Sep 30 17:52:42 compute-1 sudo[169992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:52:42 compute-1 sudo[169992]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:52:42 compute-1 sudo[169992]: pam_unix(sudo:session): session closed for user root
Sep 30 17:52:42 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:52:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:42 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ca0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:43 compute-1 sudo[170042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:52:43 compute-1 sudo[170042]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:52:43 compute-1 sudo[170042]: pam_unix(sudo:session): session closed for user root
Sep 30 17:52:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:43 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:43 compute-1 groupadd[170085]: group added to /etc/group: name=ceph, GID=167
Sep 30 17:52:43 compute-1 groupadd[170085]: group added to /etc/gshadow: name=ceph
Sep 30 17:52:43 compute-1 groupadd[170085]: new group: name=ceph, GID=167
Sep 30 17:52:43 compute-1 useradd[170091]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Sep 30 17:52:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:52:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:43.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:52:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:43.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:52:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:52:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:44 compute-1 ceph-mon[75484]: pgmap v409: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:52:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175244 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:52:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:44 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c7c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:45 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:45.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:45 compute-1 podman[170104]: 2025-09-30 17:52:45.523730396 +0000 UTC m=+0.139350359 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 17:52:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:52:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:45.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:52:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:46 compute-1 sshd-session[170100]: Invalid user gits from 175.126.165.170 port 39750
Sep 30 17:52:46 compute-1 sshd-session[170100]: Received disconnect from 175.126.165.170 port 39750:11: Bye Bye [preauth]
Sep 30 17:52:46 compute-1 sshd-session[170100]: Disconnected from invalid user gits 175.126.165.170 port 39750 [preauth]
Sep 30 17:52:46 compute-1 ceph-mon[75484]: pgmap v410: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 853 B/s rd, 341 B/s wr, 1 op/s
Sep 30 17:52:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:46 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:47 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:47 compute-1 sshd[1007]: Received signal 15; terminating.
Sep 30 17:52:47 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Sep 30 17:52:47 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Sep 30 17:52:47 compute-1 systemd[1]: sshd.service: Unit process 170616 (sshd-session) remains running after unit stopped.
Sep 30 17:52:47 compute-1 systemd[1]: sshd.service: Unit process 170784 (sshd-session) remains running after unit stopped.
Sep 30 17:52:47 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Sep 30 17:52:47 compute-1 systemd[1]: sshd.service: Consumed 12.226s CPU time, 35.8M memory peak, read 0B from disk, written 132.0K to disk.
Sep 30 17:52:47 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Sep 30 17:52:47 compute-1 systemd[1]: Stopping sshd-keygen.target...
Sep 30 17:52:47 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 17:52:47 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 17:52:47 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 17:52:47 compute-1 systemd[1]: Reached target sshd-keygen.target.
Sep 30 17:52:47 compute-1 systemd[1]: Starting OpenSSH server daemon...
Sep 30 17:52:47 compute-1 sshd[170789]: Server listening on 0.0.0.0 port 22.
Sep 30 17:52:47 compute-1 sshd[170789]: Server listening on :: port 22.
Sep 30 17:52:47 compute-1 systemd[1]: Started OpenSSH server daemon.
Sep 30 17:52:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:47.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:47.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:52:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:48 compute-1 sshd-session[170616]: Connection closed by authenticating user root 192.210.160.141 port 60178 [preauth]
Sep 30 17:52:48 compute-1 ceph-mon[75484]: pgmap v411: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 853 B/s rd, 341 B/s wr, 1 op/s
Sep 30 17:52:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:48 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c7c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:49 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c980023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:49.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:49.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:49 compute-1 ceph-mon[75484]: pgmap v412: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 17:52:49 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 17:52:50 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 17:52:50 compute-1 systemd[1]: Reloading.
Sep 30 17:52:50 compute-1 systemd-sysv-generator[171055]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:52:50 compute-1 systemd-rc-local-generator[171052]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:52:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:50 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 17:52:50 compute-1 sshd-session[170989]: Invalid user usr from 194.107.115.65 port 35196
Sep 30 17:52:50 compute-1 sshd-session[170989]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:52:50 compute-1 sshd-session[170989]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 17:52:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:50 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:51 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:51.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:51.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:52 compute-1 ceph-mon[75484]: pgmap v413: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:52:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:52 compute-1 sshd-session[170989]: Failed password for invalid user usr from 194.107.115.65 port 35196 ssh2
Sep 30 17:52:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:52 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c7c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:53 compute-1 systemd[1]: Starting PackageKit Daemon...
Sep 30 17:52:53 compute-1 PackageKit[173819]: daemon start
Sep 30 17:52:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:52:53 compute-1 systemd[1]: Started PackageKit Daemon.
Sep 30 17:52:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:53 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c980023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:52:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:53.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:53.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:53 compute-1 sudo[151467]: pam_unix(sudo:session): session closed for user root
Sep 30 17:52:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:52:54.290 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 17:52:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:52:54.291 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 17:52:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:52:54.291 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 17:52:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:54 compute-1 ceph-mon[75484]: pgmap v414: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:52:54 compute-1 sshd-session[170989]: Received disconnect from 194.107.115.65 port 35196:11: Bye Bye [preauth]
Sep 30 17:52:54 compute-1 sshd-session[170989]: Disconnected from invalid user usr 194.107.115.65 port 35196 [preauth]
Sep 30 17:52:54 compute-1 sudo[175349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odcjaazwizczbfjdyhfavoqbncrhoamu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254774.1573641-653-248214087458770/AnsiballZ_systemd.py'
Sep 30 17:52:54 compute-1 sudo[175349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:52:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:54 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:55 compute-1 python3.9[175373]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 17:52:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:55 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:55.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:55 compute-1 podman[175745]: 2025-09-30 17:52:55.557437013 +0000 UTC m=+0.092291787 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 17:52:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:52:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:55.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:52:56 compute-1 systemd[1]: Reloading.
Sep 30 17:52:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:56 compute-1 systemd-rc-local-generator[176411]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:52:56 compute-1 systemd-sysv-generator[176426]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:52:56 compute-1 sudo[175349]: pam_unix(sudo:session): session closed for user root
Sep 30 17:52:56 compute-1 ceph-mon[75484]: pgmap v415: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:52:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:56 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c7c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:57 compute-1 sudo[177125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfwtrdsmqxmgihtjpnzproemrearziff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254776.8136535-653-18670410702492/AnsiballZ_systemd.py'
Sep 30 17:52:57 compute-1 sudo[177125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:52:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:57 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c980023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:57 compute-1 python3.9[177148]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 17:52:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:57.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:57 compute-1 systemd[1]: Reloading.
Sep 30 17:52:57 compute-1 systemd-rc-local-generator[177500]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:52:57 compute-1 systemd-sysv-generator[177503]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:52:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:57.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:57 compute-1 sudo[177125]: pam_unix(sudo:session): session closed for user root
Sep 30 17:52:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:52:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:58 compute-1 sudo[178272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htknfgbunlgfrgduvfursmcehvgzjbft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254778.0563927-653-17932013815647/AnsiballZ_systemd.py'
Sep 30 17:52:58 compute-1 sudo[178272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:52:58 compute-1 python3.9[178297]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 17:52:58 compute-1 systemd[1]: Reloading.
Sep 30 17:52:58 compute-1 ceph-mon[75484]: pgmap v416: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:52:58 compute-1 systemd-rc-local-generator[178668]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:52:58 compute-1 systemd-sysv-generator[178671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:52:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:58 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:59 compute-1 sudo[178272]: pam_unix(sudo:session): session closed for user root
Sep 30 17:52:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:52:59 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:52:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:52:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:52:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:52:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:52:59.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:59 compute-1 sudo[179371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wofivrhecqpaxxfltjqoopokrsbnhqjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254779.3128433-653-186938608581512/AnsiballZ_systemd.py'
Sep 30 17:52:59 compute-1 sudo[179371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:52:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:52:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:52:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:52:59.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:52:59 compute-1 python3.9[179400]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 17:53:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:00 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 17:53:00 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 17:53:00 compute-1 systemd[1]: man-db-cache-update.service: Consumed 13.754s CPU time.
Sep 30 17:53:00 compute-1 systemd[1]: run-r46238e27c08b4378adee69d2b0d02bef.service: Deactivated successfully.
Sep 30 17:53:00 compute-1 ceph-mon[75484]: pgmap v417: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:00 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:01 compute-1 systemd[1]: Reloading.
Sep 30 17:53:01 compute-1 systemd-rc-local-generator[180083]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:53:01 compute-1 systemd-sysv-generator[180086]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:53:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:01 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:01 compute-1 sudo[179371]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:01.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:01.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:01 compute-1 sudo[180242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zouvzzgqyzfpalxvviwhgdpwtpenqhoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254781.6183424-711-151452052636604/AnsiballZ_systemd.py'
Sep 30 17:53:01 compute-1 sudo[180242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:02 compute-1 python3.9[180244]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:02 compute-1 systemd[1]: Reloading.
Sep 30 17:53:02 compute-1 systemd-sysv-generator[180276]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:53:02 compute-1 systemd-rc-local-generator[180272]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:53:02 compute-1 sudo[180242]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:02 compute-1 ceph-mon[75484]: pgmap v418: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:02 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:53:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:03 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c980034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:03 compute-1 sudo[180454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjalehrzdkyqwebitlbxcfhwhkhxfrtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254782.9188585-711-68175068341764/AnsiballZ_systemd.py'
Sep 30 17:53:03 compute-1 sudo[180454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:03 compute-1 sudo[180416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:53:03 compute-1 sudo[180416]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:53:03 compute-1 sudo[180416]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:03.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:03 compute-1 python3.9[180459]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:03 compute-1 systemd[1]: Reloading.
Sep 30 17:53:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:03.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:03 compute-1 systemd-sysv-generator[180495]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:53:03 compute-1 systemd-rc-local-generator[180492]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:53:03 compute-1 ceph-mon[75484]: pgmap v419: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:53:04 compute-1 sudo[180454]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:04 compute-1 auditd[704]: Audit daemon rotating log files
Sep 30 17:53:04 compute-1 sudo[180650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okjekltugyqpkyuvimyhlqioqsmphfpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254784.242415-711-99860358416563/AnsiballZ_systemd.py'
Sep 30 17:53:04 compute-1 sudo[180650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:04 compute-1 python3.9[180653]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:04 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:05 compute-1 systemd[1]: Reloading.
Sep 30 17:53:05 compute-1 systemd-rc-local-generator[180679]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:53:05 compute-1 systemd-sysv-generator[180685]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:53:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:05 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800030d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:05 compute-1 sudo[180650]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:05.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:05.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:05 compute-1 sudo[180842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trtuodtxewllynsxnqjmxifcpzxzvfpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254785.6023781-711-236670891804252/AnsiballZ_systemd.py'
Sep 30 17:53:05 compute-1 sudo[180842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:06 compute-1 python3.9[180844]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:06 compute-1 sudo[180842]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:06 compute-1 ceph-mon[75484]: pgmap v420: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:06 compute-1 sudo[180999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxxsuwdpuccgrvnlvhvdclqaxfkxycbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254786.563999-711-217945856861530/AnsiballZ_systemd.py'
Sep 30 17:53:06 compute-1 sudo[180999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:06 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:07 compute-1 python3.9[181001]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:07 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:07 compute-1 sshd-session[175842]: ssh_dispatch_run_fatal: Connection from 101.126.25.120 port 48748: Connection timed out [preauth]
Sep 30 17:53:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:07.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:53:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:07.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:53:08 compute-1 unix_chkpwd[181007]: password check failed for user (root)
Sep 30 17:53:08 compute-1 sshd-session[180902]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:53:08 compute-1 systemd[1]: Reloading.
Sep 30 17:53:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:08 compute-1 systemd-sysv-generator[181036]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:53:08 compute-1 systemd-rc-local-generator[181030]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:53:08 compute-1 ceph-mon[75484]: pgmap v421: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:08 compute-1 sudo[180999]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:08 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:09 compute-1 sudo[181193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evyahflxhgvxrydqqhkbidpjlvffqefb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254788.8947184-783-28903598280246/AnsiballZ_systemd.py'
Sep 30 17:53:09 compute-1 sudo[181193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:09 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800039f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:09.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:09 compute-1 python3.9[181195]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 17:53:09 compute-1 systemd[1]: Reloading.
Sep 30 17:53:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:09.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:09 compute-1 systemd-sysv-generator[181229]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:53:09 compute-1 systemd-rc-local-generator[181226]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:53:10 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Sep 30 17:53:10 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Sep 30 17:53:10 compute-1 sudo[181193]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:10 compute-1 sshd-session[180902]: Failed password for root from 192.210.160.141 port 56884 ssh2
Sep 30 17:53:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:10 compute-1 sshd-session[181262]: Invalid user 24online from 107.172.146.104 port 47886
Sep 30 17:53:10 compute-1 sshd-session[181262]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:53:10 compute-1 sshd-session[181262]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 17:53:10 compute-1 ceph-mon[75484]: pgmap v422: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:10 compute-1 sudo[181390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbghgasgnxswjrsnzrlvhxozenaeovmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254790.334379-799-116744674129998/AnsiballZ_systemd.py'
Sep 30 17:53:10 compute-1 sudo[181390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:10 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:11 compute-1 python3.9[181392]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:11 compute-1 sudo[181390]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:11 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:11 compute-1 sshd-session[180902]: Connection closed by authenticating user root 192.210.160.141 port 56884 [preauth]
Sep 30 17:53:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:11.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:11 compute-1 sudo[181545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cudwdomlrejjmsgipgvrwfjemcmoovko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254791.2634726-799-248304590747638/AnsiballZ_systemd.py'
Sep 30 17:53:11 compute-1 sudo[181545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:11.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:11 compute-1 python3.9[181547]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:12 compute-1 sudo[181545]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:12 compute-1 sshd-session[181262]: Failed password for invalid user 24online from 107.172.146.104 port 47886 ssh2
Sep 30 17:53:12 compute-1 sudo[181702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbmdxoyebihhoqemczulywjmfddlvtnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254792.1477337-799-197909536140378/AnsiballZ_systemd.py'
Sep 30 17:53:12 compute-1 sudo[181702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:12 compute-1 ceph-mon[75484]: pgmap v423: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:12 compute-1 python3.9[181704]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:12 compute-1 sudo[181702]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:12 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:12 compute-1 sshd-session[181262]: Received disconnect from 107.172.146.104 port 47886:11: Bye Bye [preauth]
Sep 30 17:53:12 compute-1 sshd-session[181262]: Disconnected from invalid user 24online 107.172.146.104 port 47886 [preauth]
Sep 30 17:53:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:53:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:13 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800039f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:13 compute-1 sudo[181858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzrabiyibfuqeuiwumlppmwgkfhzraao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254792.9861703-799-165336233002089/AnsiballZ_systemd.py'
Sep 30 17:53:13 compute-1 sudo[181858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:13.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:13 compute-1 python3.9[181860]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:13 compute-1 sudo[181858]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:13.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:14 compute-1 sudo[182014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbooesdlcxecklpvtkzckkopzecukxnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254793.8534684-799-162229433238835/AnsiballZ_systemd.py'
Sep 30 17:53:14 compute-1 sudo[182014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:14 compute-1 python3.9[182016]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:14 compute-1 sudo[182014]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:14 compute-1 ceph-mon[75484]: pgmap v424: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:53:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:14 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:15 compute-1 sudo[182170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbovlsmjawuxnhthmkdhzuogmzxxxrfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254794.7125664-799-258321201359394/AnsiballZ_systemd.py'
Sep 30 17:53:15 compute-1 sudo[182170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:15 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:15 compute-1 python3.9[182172]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:15 compute-1 sudo[182170]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:15.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:15.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:15 compute-1 sudo[182337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqtfybnicrblacjaiinrkdvjebdnptai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254795.5738175-799-131729296086884/AnsiballZ_systemd.py'
Sep 30 17:53:15 compute-1 sudo[182337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:16 compute-1 podman[182299]: 2025-09-30 17:53:16.05087741 +0000 UTC m=+0.133755137 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 17:53:16 compute-1 python3.9[182342]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:16 compute-1 sudo[182337]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:16 compute-1 ceph-mon[75484]: pgmap v425: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:16 compute-1 sudo[182506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwjubihmehlgetskstmzbmvnlezrbdem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254796.5374415-799-229074723923748/AnsiballZ_systemd.py'
Sep 30 17:53:16 compute-1 sudo[182506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:16 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:17 compute-1 python3.9[182508]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:17 compute-1 sudo[182506]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:17 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800039f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:17.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:17.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:17 compute-1 sudo[182661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkbcfkmcyisuebznmckwaetecfqmvyhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254797.3720884-799-45022615257355/AnsiballZ_systemd.py'
Sep 30 17:53:17 compute-1 sudo[182661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:53:18 compute-1 python3.9[182663]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:18 compute-1 sudo[182661]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:18 compute-1 sudo[182818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlwkmqwhhfhzavzjvbapufpfleifqtjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254798.3512568-799-45350437086382/AnsiballZ_systemd.py'
Sep 30 17:53:18 compute-1 sudo[182818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:18 compute-1 ceph-mon[75484]: pgmap v426: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:18 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:19 compute-1 python3.9[182820]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:19 compute-1 sudo[182818]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:19 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:19.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:19 compute-1 sudo[182973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kljlrukgzrvitnqnbwwvxiqsmqfjsbaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254799.2639287-799-132499613332527/AnsiballZ_systemd.py'
Sep 30 17:53:19 compute-1 sudo[182973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:19.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:19 compute-1 python3.9[182975]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:20 compute-1 sudo[182973]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:20 compute-1 sudo[183129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beeqcidptfglagcpopsnsxmyvafqffiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254800.1616912-799-86637258309230/AnsiballZ_systemd.py'
Sep 30 17:53:20 compute-1 sudo[183129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:20 compute-1 ceph-mon[75484]: pgmap v427: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:20 compute-1 python3.9[183131]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:20 compute-1 sudo[183129]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:20 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:21 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800039f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:21 compute-1 sudo[183285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzkuyitrvdxohjjpwskvtweqiocqedzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254800.9867969-799-192478658248595/AnsiballZ_systemd.py'
Sep 30 17:53:21 compute-1 sudo[183285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:21.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:21 compute-1 python3.9[183287]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:21.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:21 compute-1 sudo[183285]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:22 compute-1 sudo[183441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-libdxuxsyrlywcsayynspcuowvlwpugb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254801.8814063-799-239857607684507/AnsiballZ_systemd.py'
Sep 30 17:53:22 compute-1 sudo[183441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:22 compute-1 python3.9[183443]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 17:53:22 compute-1 sudo[183441]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:22 compute-1 sshd-session[181675]: error: kex_exchange_identification: read: Connection timed out
Sep 30 17:53:22 compute-1 sshd-session[181675]: banner exchange: Connection from 14.103.129.43 port 38864: Connection timed out
Sep 30 17:53:22 compute-1 ceph-mon[75484]: pgmap v428: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:53:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:22 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:53:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:23 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:23 compute-1 sudo[183575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:53:23 compute-1 sudo[183575]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:53:23 compute-1 sudo[183620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggkpuukbtumjdzeyvrordmohwkafbemo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254803.0359223-1003-207813231090126/AnsiballZ_file.py'
Sep 30 17:53:23 compute-1 sudo[183575]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:23 compute-1 sudo[183620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:23 compute-1 python3.9[183624]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:53:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:23.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:23 compute-1 sudo[183620]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:23.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:24 compute-1 sudo[183775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maeakquunpcmkswnwpxtnmthguuwuljo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254803.7595794-1003-244254301422213/AnsiballZ_file.py'
Sep 30 17:53:24 compute-1 sudo[183775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:24 compute-1 python3.9[183777]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:53:24 compute-1 sudo[183775]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:24 compute-1 ceph-mon[75484]: pgmap v429: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:53:24 compute-1 sudo[183928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbreaqotipajzhtwjnqmtojwyhvcptvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254804.4555256-1003-138761519892374/AnsiballZ_file.py'
Sep 30 17:53:24 compute-1 sudo[183928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:24 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:25 compute-1 python3.9[183930]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:53:25 compute-1 sudo[183928]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:25 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800039f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:25 compute-1 sudo[184080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkshrffaofkixsrgicdlvtrldovzxsio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254805.203854-1003-106467152845687/AnsiballZ_file.py'
Sep 30 17:53:25 compute-1 sudo[184080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:25.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:25.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:25 compute-1 python3.9[184082]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:53:25 compute-1 sudo[184080]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:26 compute-1 sudo[184249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjppkdmvpguicwwzcrbgjwdglleqpkem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254805.9414923-1003-267653734386210/AnsiballZ_file.py'
Sep 30 17:53:26 compute-1 sudo[184249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:26 compute-1 podman[184207]: 2025-09-30 17:53:26.288509365 +0000 UTC m=+0.092723789 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 17:53:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:26 compute-1 python3.9[184253]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:53:26 compute-1 sudo[184249]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:26 compute-1 ceph-mon[75484]: pgmap v430: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:26 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:27 compute-1 sudo[184406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dejndczczrwfdbmxuzdjgoruzseguwld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254806.6462317-1003-33626377552999/AnsiballZ_file.py'
Sep 30 17:53:27 compute-1 sudo[184406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:27 compute-1 python3.9[184408]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:53:27 compute-1 sudo[184406]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:27 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:27 compute-1 unix_chkpwd[184433]: password check failed for user (root)
Sep 30 17:53:27 compute-1 sshd-session[184254]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58  user=root
Sep 30 17:53:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:27.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:27.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:27 compute-1 sudo[184560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acgntmbgzpnteajsergdowvwrxjaoozm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254807.4725022-1089-94206552732261/AnsiballZ_stat.py'
Sep 30 17:53:27 compute-1 sudo[184560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:53:28 compute-1 python3.9[184562]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:28 compute-1 sudo[184560]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:28 compute-1 ceph-mon[75484]: pgmap v431: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:28 compute-1 sudo[184686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvygpgjvkytnsaniejjkvbswufgcyxez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254807.4725022-1089-94206552732261/AnsiballZ_copy.py'
Sep 30 17:53:28 compute-1 sudo[184686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:28 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:29 compute-1 python3.9[184688]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759254807.4725022-1089-94206552732261/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:29 compute-1 sudo[184686]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:29 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800039f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:29 compute-1 sudo[184839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaorvdsurtpzfaylsgwqylsnktqpnbby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254809.2009661-1089-136907332955428/AnsiballZ_stat.py'
Sep 30 17:53:29 compute-1 sudo[184839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:29.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:29 compute-1 sshd-session[184254]: Failed password for root from 84.51.43.58 port 38432 ssh2
Sep 30 17:53:29 compute-1 python3.9[184841]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:29.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:29 compute-1 sudo[184839]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:30 compute-1 sudo[184966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulifkctwjrmmgnzgbxkwqiuidlhmzbwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254809.2009661-1089-136907332955428/AnsiballZ_copy.py'
Sep 30 17:53:30 compute-1 sudo[184966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:30 compute-1 sshd-session[184254]: Received disconnect from 84.51.43.58 port 38432:11: Bye Bye [preauth]
Sep 30 17:53:30 compute-1 sshd-session[184254]: Disconnected from authenticating user root 84.51.43.58 port 38432 [preauth]
Sep 30 17:53:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:30 compute-1 python3.9[184968]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759254809.2009661-1089-136907332955428/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:30 compute-1 sudo[184966]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:30 compute-1 ceph-mon[75484]: pgmap v432: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:30 compute-1 sudo[185119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxekmkjzzrmomjwqmvxfdxtvzanyljpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254810.6005335-1089-202232128836563/AnsiballZ_stat.py'
Sep 30 17:53:30 compute-1 sudo[185119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:30 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:31 compute-1 python3.9[185121]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:31 compute-1 sudo[185119]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:31 compute-1 unix_chkpwd[185126]: password check failed for user (root)
Sep 30 17:53:31 compute-1 sshd-session[184762]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:53:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:31 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:31.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:31 compute-1 sudo[185247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-powgjuvcnnsdkddilauupqanmzqsnbvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254810.6005335-1089-202232128836563/AnsiballZ_copy.py'
Sep 30 17:53:31 compute-1 sudo[185247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:31.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:31 compute-1 python3.9[185249]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759254810.6005335-1089-202232128836563/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:31 compute-1 sudo[185247]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:32 compute-1 sudo[185401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrmyofyhothkwmmehilxohdiqoujzmgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254812.0717816-1089-133793503774527/AnsiballZ_stat.py'
Sep 30 17:53:32 compute-1 sudo[185401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:32 compute-1 python3.9[185403]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:32 compute-1 sudo[185401]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:32 compute-1 ceph-mon[75484]: pgmap v433: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:32 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:53:33 compute-1 sshd-session[184762]: Failed password for root from 192.210.160.141 port 35344 ssh2
Sep 30 17:53:33 compute-1 sudo[185527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbseyhukizftiewvtwwpkrvaaavmdmbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254812.0717816-1089-133793503774527/AnsiballZ_copy.py'
Sep 30 17:53:33 compute-1 sudo[185527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:33 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800039f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:33 compute-1 python3.9[185529]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759254812.0717816-1089-133793503774527/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:33 compute-1 sudo[185527]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:33.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:33.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:33 compute-1 sudo[185679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjcsmxgbknbhifvkvyfseutmqsqjcvsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254813.488664-1089-36756214502081/AnsiballZ_stat.py'
Sep 30 17:53:33 compute-1 sudo[185679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:34 compute-1 python3.9[185681]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:34 compute-1 sudo[185679]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:34 compute-1 sshd-session[184762]: Connection closed by authenticating user root 192.210.160.141 port 35344 [preauth]
Sep 30 17:53:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:34 compute-1 sudo[185805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwwtxeijbocqivovybxwyrofdnwsldew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254813.488664-1089-36756214502081/AnsiballZ_copy.py'
Sep 30 17:53:34 compute-1 sudo[185805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:34 compute-1 python3.9[185807]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759254813.488664-1089-36756214502081/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:34 compute-1 sudo[185805]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:34 compute-1 ceph-mon[75484]: pgmap v434: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:53:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:34 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c7c000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:35 compute-1 sudo[185958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stkrabaubiqacifabbssmrlxhrzaqzyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254814.901519-1089-81293288700091/AnsiballZ_stat.py'
Sep 30 17:53:35 compute-1 sudo[185958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:35 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:35 compute-1 python3.9[185960]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:35 compute-1 sudo[185958]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:35.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:53:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:35.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:53:35 compute-1 sudo[186083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zulncsodtdggnstntrsvtudmxhbveqak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254814.901519-1089-81293288700091/AnsiballZ_copy.py'
Sep 30 17:53:35 compute-1 sudo[186083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:36 compute-1 python3.9[186085]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759254814.901519-1089-81293288700091/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:36 compute-1 sudo[186083]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:36 compute-1 sudo[186236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmicwjvccdfvypbirdhzgmzttbeiuyak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254816.278581-1089-13856343757672/AnsiballZ_stat.py'
Sep 30 17:53:36 compute-1 sudo[186236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:36 compute-1 python3.9[186238]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:36 compute-1 sudo[186236]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:36 compute-1 ceph-mon[75484]: pgmap v435: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:36 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:37 compute-1 sudo[186360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldfrbpfijvkfnhgpjhynwfgrjrseuiav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254816.278581-1089-13856343757672/AnsiballZ_copy.py'
Sep 30 17:53:37 compute-1 sudo[186360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:37 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800039f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:37 compute-1 python3.9[186362]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759254816.278581-1089-13856343757672/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:37 compute-1 sudo[186360]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:37.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:37.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:53:38 compute-1 sudo[186513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlnekyigydiomndgrhbyxrnmkbsqstof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254817.6472862-1089-223051530348483/AnsiballZ_stat.py'
Sep 30 17:53:38 compute-1 sudo[186513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:53:38 compute-1 python3.9[186515]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:38 compute-1 sudo[186513]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:38 compute-1 sudo[186639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llbjsgbxfmwozflrhzciyczqsgqxafqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254817.6472862-1089-223051530348483/AnsiballZ_copy.py'
Sep 30 17:53:38 compute-1 sudo[186639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:38 compute-1 ceph-mon[75484]: pgmap v436: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:38 compute-1 python3.9[186641]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759254817.6472862-1089-223051530348483/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:38 compute-1 sudo[186639]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:38 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c7c0010b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:39 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c700016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:39 compute-1 sudo[186791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdvdmtkkiydaprsowsvhznqklphbdtvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254819.125768-1315-94829410335691/AnsiballZ_command.py'
Sep 30 17:53:39 compute-1 sudo[186791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:39.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:39 compute-1 python3.9[186793]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Sep 30 17:53:39 compute-1 sudo[186791]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:39.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:40 compute-1 sudo[186945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcsjirrrmahevnruaywuwukhkpscgxuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254819.883242-1333-216616705143136/AnsiballZ_file.py'
Sep 30 17:53:40 compute-1 sudo[186945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:40 compute-1 python3.9[186947]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:40 compute-1 sudo[186945]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:40 compute-1 ceph-mon[75484]: pgmap v437: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:40 compute-1 sudo[187098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgvwibciteacftrhswwyxuhqayjnlxra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254820.6343894-1333-35788679857320/AnsiballZ_file.py'
Sep 30 17:53:40 compute-1 sudo[187098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:41 compute-1 python3.9[187100]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:41 compute-1 sudo[187098]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:41 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800039f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:41.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:41 compute-1 sudo[187250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zptkemxhicvduzwaumauaeuapyhokiuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254821.3743155-1333-27483102161243/AnsiballZ_file.py'
Sep 30 17:53:41 compute-1 sudo[187250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:41.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:41 compute-1 python3.9[187252]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:41 compute-1 sudo[187250]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:42 compute-1 sudo[187403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klpkbthcfkduetxowjwukypnujsxypob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254822.1043005-1333-27213050240513/AnsiballZ_file.py'
Sep 30 17:53:42 compute-1 sudo[187403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:42 compute-1 python3.9[187405]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:42 compute-1 sudo[187403]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:42 compute-1 sshd-session[185327]: error: kex_exchange_identification: read: Connection timed out
Sep 30 17:53:42 compute-1 sshd-session[185327]: banner exchange: Connection from 101.126.25.120 port 42940: Connection timed out
Sep 30 17:53:42 compute-1 ceph-mon[75484]: pgmap v438: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:43 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c7c0010b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:43 compute-1 sudo[187501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:53:43 compute-1 sudo[187501]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:53:43 compute-1 sudo[187501]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:53:43 compute-1 sudo[187539]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:53:43 compute-1 sudo[187539]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:53:43 compute-1 sudo[187606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvlrezlvsguksbngoddneekkjdotawbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254822.8178782-1333-254315833857294/AnsiballZ_file.py'
Sep 30 17:53:43 compute-1 sudo[187606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:43 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c7c0010b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:43 compute-1 python3.9[187608]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:43 compute-1 sudo[187606]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:43 compute-1 sudo[187625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:53:43 compute-1 sudo[187625]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:53:43 compute-1 sudo[187625]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:43.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:43 compute-1 sudo[187539]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:43.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:43 compute-1 sudo[187816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpkljjyskqrwjfyrhtabshmjoxasnsgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254823.519261-1333-78396536578379/AnsiballZ_file.py'
Sep 30 17:53:43 compute-1 sudo[187816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:53:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:53:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:53:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:53:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:53:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:53:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:53:44 compute-1 python3.9[187818]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:44 compute-1 sudo[187816]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:44 compute-1 sudo[187969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puvidvszflfvncipxkwbymqhvgozhldn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254824.2459352-1333-231978239566136/AnsiballZ_file.py'
Sep 30 17:53:44 compute-1 sudo[187969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:44 compute-1 python3.9[187971]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:44 compute-1 sudo[187969]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:44 compute-1 ceph-mon[75484]: pgmap v439: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:53:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:45 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:45 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800039f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:45 compute-1 sudo[188122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pawucqyqrmyhljmgttpmawhiszjnrrtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254824.9958944-1333-224132542012801/AnsiballZ_file.py'
Sep 30 17:53:45 compute-1 sudo[188122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:45 compute-1 python3.9[188124]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:45 compute-1 sudo[188122]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:45.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:45.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:46 compute-1 ceph-mon[75484]: pgmap v440: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:46 compute-1 sudo[188275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swjndsmszjvoogzxcwxdvhkeuwvvcjxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254825.7523687-1333-145006689404427/AnsiballZ_file.py'
Sep 30 17:53:46 compute-1 sudo[188275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:46 compute-1 podman[188277]: 2025-09-30 17:53:46.301173116 +0000 UTC m=+0.144564632 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 17:53:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:46 compute-1 python3.9[188278]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:46 compute-1 sudo[188275]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:46 compute-1 sudo[188455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trsctedayzrjcpikobsvqwwpkffnztss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254826.5531683-1333-49240016611990/AnsiballZ_file.py'
Sep 30 17:53:46 compute-1 sudo[188455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:47 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c7c0010b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:47 compute-1 python3.9[188457]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:47 compute-1 sudo[188455]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:47 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c7c0010b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:53:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:47 compute-1 sudo[188607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjdxcybuiaujoihlxhhbvlgvxsxblrcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254827.247573-1333-223914883973656/AnsiballZ_file.py'
Sep 30 17:53:47 compute-1 sudo[188607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:47.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:47.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:47 compute-1 python3.9[188609]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:47 compute-1 sudo[188607]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:53:48 compute-1 sudo[188762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdomufigdkmpctjirsfbpcguqonzflxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254827.9525952-1333-177879524308967/AnsiballZ_file.py'
Sep 30 17:53:48 compute-1 sudo[188762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:48 compute-1 python3.9[188764]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:48 compute-1 sudo[188762]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:48 compute-1 ceph-mon[75484]: pgmap v441: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:48 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:53:48 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:53:48 compute-1 sudo[188765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:53:48 compute-1 sudo[188765]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:53:48 compute-1 sudo[188765]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:48 compute-1 sshd-session[188610]: Invalid user scsadmin from 175.126.165.170 port 38062
Sep 30 17:53:48 compute-1 sshd-session[188610]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:53:48 compute-1 sshd-session[188610]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 17:53:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[169636]: 30/09/2025 17:53:49 : epoch 68dc18d9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c98003e00 fd 38 proxy ignored for local
Sep 30 17:53:49 compute-1 kernel: ganesha.nfsd[169859]: segfault at 50 ip 00007f5d4c12432e sp 00007f5d1affc210 error 4 in libntirpc.so.5.8[7f5d4c109000+2c000] likely on CPU 2 (core 0, socket 2)
Sep 30 17:53:49 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 17:53:49 compute-1 systemd[1]: Started Process Core Dump (PID 188941/UID 0).
Sep 30 17:53:49 compute-1 sudo[188940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyssqvwausktokjskihfqlqcyqwwwqlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254828.752943-1333-126651065635623/AnsiballZ_file.py'
Sep 30 17:53:49 compute-1 sudo[188940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:49 compute-1 python3.9[188944]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:49 compute-1 sudo[188940]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:49.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:49.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:49 compute-1 sudo[189094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsernhxkudwviecpqgxobqjcwdggmuqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254829.452249-1333-45508207536208/AnsiballZ_file.py'
Sep 30 17:53:49 compute-1 sudo[189094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:49 compute-1 python3.9[189096]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:49 compute-1 sudo[189094]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:49 compute-1 systemd-coredump[188943]: Process 169640 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 42:
                                                    #0  0x00007f5d4c12432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Sep 30 17:53:50 compute-1 systemd[1]: systemd-coredump@6-188941-0.service: Deactivated successfully.
Sep 30 17:53:50 compute-1 podman[189126]: 2025-09-30 17:53:50.160322698 +0000 UTC m=+0.048440102 container died a2d70f43c6feac4adc415a27cb9930ea9dd1ce238522a7abdba041dd435d826c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Sep 30 17:53:50 compute-1 systemd[1]: var-lib-containers-storage-overlay-edcdb108f17847ac15e2b2053673846ab0f8770411cf45373d92d5f2034fede5-merged.mount: Deactivated successfully.
Sep 30 17:53:50 compute-1 podman[189126]: 2025-09-30 17:53:50.226053698 +0000 UTC m=+0.114171112 container remove a2d70f43c6feac4adc415a27cb9930ea9dd1ce238522a7abdba041dd435d826c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 17:53:50 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 17:53:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:50 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 17:53:50 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.582s CPU time.
Sep 30 17:53:50 compute-1 sudo[189293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgpnwnmpnsvybcnrszupckkywvtzdusi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254830.1829996-1531-231133579662725/AnsiballZ_stat.py'
Sep 30 17:53:50 compute-1 sudo[189293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:50 compute-1 ceph-mon[75484]: pgmap v442: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:50 compute-1 python3.9[189295]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:50 compute-1 sudo[189293]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:51 compute-1 sudo[189417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esdgnldpblsmpskhaqlbngaihrhyiaoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254830.1829996-1531-231133579662725/AnsiballZ_copy.py'
Sep 30 17:53:51 compute-1 sudo[189417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:51 compute-1 sshd-session[188610]: Failed password for invalid user scsadmin from 175.126.165.170 port 38062 ssh2
Sep 30 17:53:51 compute-1 python3.9[189419]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254830.1829996-1531-231133579662725/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:51 compute-1 sudo[189417]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:51.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:51.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:51 compute-1 sudo[189569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjandjbuvelpeexdwrnjplftldvopqrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254831.5738463-1531-92033046583601/AnsiballZ_stat.py'
Sep 30 17:53:51 compute-1 sudo[189569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:51 compute-1 sshd-session[188610]: Received disconnect from 175.126.165.170 port 38062:11: Bye Bye [preauth]
Sep 30 17:53:51 compute-1 sshd-session[188610]: Disconnected from invalid user scsadmin 175.126.165.170 port 38062 [preauth]
Sep 30 17:53:52 compute-1 python3.9[189571]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:52 compute-1 sudo[189569]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:52 compute-1 sudo[189693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbkluheyvhlijznrgjjmmoscjszrbthe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254831.5738463-1531-92033046583601/AnsiballZ_copy.py'
Sep 30 17:53:52 compute-1 sudo[189693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:52 compute-1 ceph-mon[75484]: pgmap v443: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:53:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:53:52 compute-1 python3.9[189695]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254831.5738463-1531-92033046583601/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:52 compute-1 sudo[189693]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:53:53 compute-1 sudo[189848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjdzxpsacxcnhhgnjfvklshandurfakp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254832.9275055-1531-79031981953378/AnsiballZ_stat.py'
Sep 30 17:53:53 compute-1 sudo[189848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:53 compute-1 python3.9[189850]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:53 compute-1 sudo[189848]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:53.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:53.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:53 compute-1 sudo[189972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfqjrhlkebykpwbqtcwnhcynjjrwwvix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254832.9275055-1531-79031981953378/AnsiballZ_copy.py'
Sep 30 17:53:53 compute-1 sudo[189972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:54 compute-1 sshd-session[189777]: Invalid user salma from 194.107.115.65 port 59664
Sep 30 17:53:54 compute-1 sshd-session[189777]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:53:54 compute-1 sshd-session[189777]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 17:53:54 compute-1 python3.9[189974]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254832.9275055-1531-79031981953378/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:54 compute-1 sudo[189972]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:53:54.292 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 17:53:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:53:54.293 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 17:53:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:53:54.293 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 17:53:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:54 compute-1 sudo[190127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuyjfbqwiketsmcsocyztgdpnftzyczh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254834.240824-1531-169083637507524/AnsiballZ_stat.py'
Sep 30 17:53:54 compute-1 sudo[190127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:54 compute-1 ceph-mon[75484]: pgmap v444: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:53:54 compute-1 python3.9[190130]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:54 compute-1 sudo[190127]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175355 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:53:55 compute-1 sudo[190251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drupcjuyxxjaupcnmojniqmtyevjriph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254834.240824-1531-169083637507524/AnsiballZ_copy.py'
Sep 30 17:53:55 compute-1 sudo[190251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:55 compute-1 python3.9[190253]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254834.240824-1531-169083637507524/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:55 compute-1 sudo[190251]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:55.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:55.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:55 compute-1 sudo[190403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqbgvatitzyifidclquziyqywolhtxky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254835.597317-1531-212443085298666/AnsiballZ_stat.py'
Sep 30 17:53:55 compute-1 sudo[190403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:56 compute-1 sshd-session[189777]: Failed password for invalid user salma from 194.107.115.65 port 59664 ssh2
Sep 30 17:53:56 compute-1 unix_chkpwd[190407]: password check failed for user (root)
Sep 30 17:53:56 compute-1 sshd-session[189947]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:53:56 compute-1 python3.9[190405]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:56 compute-1 sudo[190403]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:56 compute-1 podman[190502]: 2025-09-30 17:53:56.509944084 +0000 UTC m=+0.072678562 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 17:53:56 compute-1 sudo[190547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bphihszrdmsjxjovusbiamvvtontvqwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254835.597317-1531-212443085298666/AnsiballZ_copy.py'
Sep 30 17:53:56 compute-1 sudo[190547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:56 compute-1 ceph-mon[75484]: pgmap v445: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:53:56 compute-1 python3.9[190549]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254835.597317-1531-212443085298666/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:56 compute-1 sudo[190547]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:57 compute-1 sudo[190700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuwiaahvqoddydltnubrrurzfsaqyooi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254836.9036314-1531-40198338064391/AnsiballZ_stat.py'
Sep 30 17:53:57 compute-1 sudo[190700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:57 compute-1 python3.9[190702]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:57 compute-1 sudo[190700]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:57.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:57 compute-1 sshd-session[189947]: Failed password for root from 192.210.160.141 port 34292 ssh2
Sep 30 17:53:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:57.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:57 compute-1 sudo[190823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdfjsxiilqkhlqnosbhlizagmoukhbzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254836.9036314-1531-40198338064391/AnsiballZ_copy.py'
Sep 30 17:53:57 compute-1 sudo[190823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:58 compute-1 python3.9[190825]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254836.9036314-1531-40198338064391/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:53:58 compute-1 sudo[190823]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:58 compute-1 sshd-session[189777]: Received disconnect from 194.107.115.65 port 59664:11: Bye Bye [preauth]
Sep 30 17:53:58 compute-1 sshd-session[189777]: Disconnected from invalid user salma 194.107.115.65 port 59664 [preauth]
Sep 30 17:53:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:58 compute-1 sudo[190978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lopskjtqngjjzcsqkxdplarzznrvlfty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254838.2319152-1531-56477630726546/AnsiballZ_stat.py'
Sep 30 17:53:58 compute-1 sudo[190978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:58 compute-1 ceph-mon[75484]: pgmap v446: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:53:58 compute-1 python3.9[190980]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:53:58 compute-1 sudo[190978]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:59 compute-1 sudo[191102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wopvtjbxuolqiqilpncvtoukizvkfmid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254838.2319152-1531-56477630726546/AnsiballZ_copy.py'
Sep 30 17:53:59 compute-1 sudo[191102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:59 compute-1 sshd-session[189947]: Connection closed by authenticating user root 192.210.160.141 port 34292 [preauth]
Sep 30 17:53:59 compute-1 python3.9[191104]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254838.2319152-1531-56477630726546/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:53:59 compute-1 sudo[191102]: pam_unix(sudo:session): session closed for user root
Sep 30 17:53:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:53:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:53:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:53:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:53:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:53:59.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:53:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:53:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:53:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:53:59.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:53:59 compute-1 sudo[191254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klbbwzscobdxralnjacugharnmdsrdcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254839.4904945-1531-180027628427447/AnsiballZ_stat.py'
Sep 30 17:53:59 compute-1 sudo[191254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:53:59 compute-1 sshd-session[190926]: Invalid user sanjay from 103.153.190.105 port 37695
Sep 30 17:53:59 compute-1 sshd-session[190926]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:53:59 compute-1 sshd-session[190926]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 17:54:00 compute-1 python3.9[191256]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:00 compute-1 sudo[191254]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:00 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 7.
Sep 30 17:54:00 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:54:00 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.582s CPU time.
Sep 30 17:54:00 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:54:00 compute-1 sudo[191380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imkflwannijusvqtuogztohuaffybvvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254839.4904945-1531-180027628427447/AnsiballZ_copy.py'
Sep 30 17:54:00 compute-1 sudo[191380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:00 compute-1 ceph-mon[75484]: pgmap v447: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:54:00 compute-1 python3.9[191383]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254839.4904945-1531-180027628427447/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:00 compute-1 sudo[191380]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:00 compute-1 podman[191452]: 2025-09-30 17:54:00.844309209 +0000 UTC m=+0.073946056 container create 8f035cb777f1dfa05376f5fbe22850cd2e594eb7df94147695e50142fcfd140e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Sep 30 17:54:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75d59e3bcfc7f85ccd733016d271858d475cdb0d7b514acc93a815c10ac35de8/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 17:54:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75d59e3bcfc7f85ccd733016d271858d475cdb0d7b514acc93a815c10ac35de8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:54:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75d59e3bcfc7f85ccd733016d271858d475cdb0d7b514acc93a815c10ac35de8/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:54:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75d59e3bcfc7f85ccd733016d271858d475cdb0d7b514acc93a815c10ac35de8/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:54:00 compute-1 podman[191452]: 2025-09-30 17:54:00.814402324 +0000 UTC m=+0.044039241 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:54:00 compute-1 podman[191452]: 2025-09-30 17:54:00.910350939 +0000 UTC m=+0.139987866 container init 8f035cb777f1dfa05376f5fbe22850cd2e594eb7df94147695e50142fcfd140e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Sep 30 17:54:00 compute-1 podman[191452]: 2025-09-30 17:54:00.921913934 +0000 UTC m=+0.151550801 container start 8f035cb777f1dfa05376f5fbe22850cd2e594eb7df94147695e50142fcfd140e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Sep 30 17:54:00 compute-1 bash[191452]: 8f035cb777f1dfa05376f5fbe22850cd2e594eb7df94147695e50142fcfd140e
Sep 30 17:54:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:00 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 17:54:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:00 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 17:54:00 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:54:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:00 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 17:54:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:00 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 17:54:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:00 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 17:54:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:00 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 17:54:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:00 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 17:54:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:01 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:54:01 compute-1 sudo[191634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjtokgtmzunikckurnmisratmnpysxsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254840.8598363-1531-187775455071831/AnsiballZ_stat.py'
Sep 30 17:54:01 compute-1 sudo[191634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:01 compute-1 python3.9[191636]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:01 compute-1 sudo[191634]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:01.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:01.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:01 compute-1 sudo[191757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqdpfvqzovvtqzoyisbxvfncgugnwvmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254840.8598363-1531-187775455071831/AnsiballZ_copy.py'
Sep 30 17:54:01 compute-1 sudo[191757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:02 compute-1 python3.9[191759]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254840.8598363-1531-187775455071831/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:02 compute-1 sudo[191757]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:02 compute-1 sshd-session[190926]: Failed password for invalid user sanjay from 103.153.190.105 port 37695 ssh2
Sep 30 17:54:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:02 compute-1 sudo[191910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-madlqkujrlkzeqkriycbutsavmpheqcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254842.207132-1531-145979166137582/AnsiballZ_stat.py'
Sep 30 17:54:02 compute-1 sudo[191910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:02 compute-1 ceph-mon[75484]: pgmap v448: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:54:02 compute-1 python3.9[191912]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:02 compute-1 sudo[191910]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:54:03 compute-1 sudo[192034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgjksirblvfpmmphmrcojxbkygdhzbwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254842.207132-1531-145979166137582/AnsiballZ_copy.py'
Sep 30 17:54:03 compute-1 sudo[192034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:03 compute-1 python3.9[192036]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254842.207132-1531-145979166137582/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:03 compute-1 sudo[192034]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:03 compute-1 sudo[192063]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:54:03 compute-1 sudo[192063]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:54:03 compute-1 sudo[192063]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:03.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:03.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:03 compute-1 sudo[192213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yavisoqzolprimoqcavhzfsuaqtamwwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254843.5563462-1531-205472843994661/AnsiballZ_stat.py'
Sep 30 17:54:03 compute-1 sudo[192213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:03 compute-1 unix_chkpwd[192217]: password check failed for user (root)
Sep 30 17:54:03 compute-1 sshd-session[192037]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167  user=root
Sep 30 17:54:04 compute-1 python3.9[192215]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:04 compute-1 sudo[192213]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:04 compute-1 sshd-session[192241]: Invalid user lruiz from 107.172.146.104 port 54922
Sep 30 17:54:04 compute-1 sshd-session[192241]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:54:04 compute-1 sshd-session[192241]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 17:54:04 compute-1 sudo[192340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuxzzzybwkquzhhuopkysfrefacisfwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254843.5563462-1531-205472843994661/AnsiballZ_copy.py'
Sep 30 17:54:04 compute-1 sshd-session[190926]: Received disconnect from 103.153.190.105 port 37695:11: Bye Bye [preauth]
Sep 30 17:54:04 compute-1 sshd-session[190926]: Disconnected from invalid user sanjay 103.153.190.105 port 37695 [preauth]
Sep 30 17:54:04 compute-1 sudo[192340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:04 compute-1 ceph-mon[75484]: pgmap v449: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:54:04 compute-1 python3.9[192343]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254843.5563462-1531-205472843994661/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:04 compute-1 sudo[192340]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:05 compute-1 sudo[192493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpdbrnjyogccyaavdrgqknumtahosorg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254844.9820774-1531-280208574262001/AnsiballZ_stat.py'
Sep 30 17:54:05 compute-1 sudo[192493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:05 compute-1 python3.9[192495]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:05 compute-1 sudo[192493]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:05.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:05.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:05 compute-1 sudo[192616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aggflfiemyxqvwaatfrxlbwixzysghtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254844.9820774-1531-280208574262001/AnsiballZ_copy.py'
Sep 30 17:54:05 compute-1 sudo[192616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:06 compute-1 python3.9[192619]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254844.9820774-1531-280208574262001/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:06 compute-1 sudo[192616]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:06 compute-1 sshd-session[192037]: Failed password for root from 167.172.43.167 port 41124 ssh2
Sep 30 17:54:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:06 compute-1 sshd-session[192241]: Failed password for invalid user lruiz from 107.172.146.104 port 54922 ssh2
Sep 30 17:54:06 compute-1 sudo[192770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiaizprcwxltflvpozoiozkibamnvazi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254846.3435738-1531-255512733675620/AnsiballZ_stat.py'
Sep 30 17:54:06 compute-1 sudo[192770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:06 compute-1 ceph-mon[75484]: pgmap v450: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:54:06 compute-1 sshd-session[192037]: Received disconnect from 167.172.43.167 port 41124:11: Bye Bye [preauth]
Sep 30 17:54:06 compute-1 sshd-session[192037]: Disconnected from authenticating user root 167.172.43.167 port 41124 [preauth]
Sep 30 17:54:06 compute-1 python3.9[192772]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:06 compute-1 sudo[192770]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:07 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:54:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:07 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:54:07 compute-1 sshd-session[192241]: Received disconnect from 107.172.146.104 port 54922:11: Bye Bye [preauth]
Sep 30 17:54:07 compute-1 sshd-session[192241]: Disconnected from invalid user lruiz 107.172.146.104 port 54922 [preauth]
Sep 30 17:54:07 compute-1 sudo[192893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpttgsfgnozjycychxrknvnxkrbhjarq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254846.3435738-1531-255512733675620/AnsiballZ_copy.py'
Sep 30 17:54:07 compute-1 sudo[192893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:07 compute-1 python3.9[192895]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254846.3435738-1531-255512733675620/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:07 compute-1 sudo[192893]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 17:54:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:07.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 17:54:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:54:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:07.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:54:08 compute-1 sudo[193046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erquaglbqovtrwpkssnxcxoltpaftcfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254847.7265248-1531-101081745713502/AnsiballZ_stat.py'
Sep 30 17:54:08 compute-1 sudo[193046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:08 compute-1 python3.9[193048]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:08 compute-1 sudo[193046]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:08 compute-1 sudo[193170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbllqmtcsfqsousmidvfrvcvfujqtxrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254847.7265248-1531-101081745713502/AnsiballZ_copy.py'
Sep 30 17:54:08 compute-1 sudo[193170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:08 compute-1 ceph-mon[75484]: pgmap v451: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:54:08 compute-1 python3.9[193172]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254847.7265248-1531-101081745713502/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:08 compute-1 sudo[193170]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:09.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:09 compute-1 python3.9[193322]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:54:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:09.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:10 compute-1 sudo[193476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aovpvkkwxobawiphcghajyiygxauniob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254849.9133902-1943-89840943355529/AnsiballZ_seboolean.py'
Sep 30 17:54:10 compute-1 sudo[193476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:10 compute-1 python3.9[193478]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Sep 30 17:54:10 compute-1 ceph-mon[75484]: pgmap v452: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:54:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:11.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:11.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:11 compute-1 sudo[193476]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:12 compute-1 sudo[193634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycyzpwzlvdukvarmajfezotflrfpnhpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254852.0641763-1959-141836157257110/AnsiballZ_copy.py'
Sep 30 17:54:12 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Sep 30 17:54:12 compute-1 sudo[193634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:12 compute-1 python3.9[193636]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:12 compute-1 sudo[193634]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:12 compute-1 ceph-mon[75484]: pgmap v453: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:54:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:54:13 compute-1 sudo[193799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azajplduodmykqofwumgjipafpvmmywp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254852.7776918-1959-202479908360156/AnsiballZ_copy.py'
Sep 30 17:54:13 compute-1 sudo[193799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d0000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:13 compute-1 python3.9[193801]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:13 compute-1 sudo[193799]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:13.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:13.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:13 compute-1 sudo[193955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeluaaggepluxachfqgzlqxbxbiwmkbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254853.5466673-1959-94751659085465/AnsiballZ_copy.py'
Sep 30 17:54:13 compute-1 sudo[193955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:14 compute-1 python3.9[193957]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:14 compute-1 sudo[193955]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:14 compute-1 sudo[194110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbpvfwpaatjjmvickkmnoeaidizqzfjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254854.2171082-1959-230242765247916/AnsiballZ_copy.py'
Sep 30 17:54:14 compute-1 sudo[194110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:14 compute-1 python3.9[194112]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:14 compute-1 sudo[194110]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:14 compute-1 ceph-mon[75484]: pgmap v454: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.7 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:54:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:15 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c4001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:15 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:15 compute-1 sudo[194263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbluoymjgunpgnlnvdsnzkhgzrgpczlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254854.9867241-1959-229658289550679/AnsiballZ_copy.py'
Sep 30 17:54:15 compute-1 sudo[194263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:15 compute-1 sshd-session[194007]: Invalid user notes from 14.225.167.110 port 32780
Sep 30 17:54:15 compute-1 sshd-session[194007]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:54:15 compute-1 sshd-session[194007]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 17:54:15 compute-1 python3.9[194265]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:15 compute-1 sudo[194263]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:15.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:15.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:16 compute-1 sudo[194416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbwmlalmhrglfjqmnjabwgsbaxddjsee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254855.8416293-2031-242133923694790/AnsiballZ_copy.py'
Sep 30 17:54:16 compute-1 sudo[194416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:16 compute-1 python3.9[194418]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:16 compute-1 sudo[194416]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:16 compute-1 podman[194419]: 2025-09-30 17:54:16.584897044 +0000 UTC m=+0.143576569 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Sep 30 17:54:16 compute-1 ceph-mon[75484]: pgmap v455: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:54:16 compute-1 sudo[194593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwmscellkwqqilejdqohzganfjlptabw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254856.6366127-2031-19191401832836/AnsiballZ_copy.py'
Sep 30 17:54:16 compute-1 sudo[194593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175417 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:54:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:17 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a4000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:17 compute-1 python3.9[194595]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:17 compute-1 sudo[194593]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:17 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c8001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:17 compute-1 sshd-session[194007]: Failed password for invalid user notes from 14.225.167.110 port 32780 ssh2
Sep 30 17:54:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:17.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:17 compute-1 sudo[194745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaoafshtarywpcxnkesnfkynslwnoejy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254857.3386877-2031-152166090387105/AnsiballZ_copy.py'
Sep 30 17:54:17 compute-1 sudo[194745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:17 compute-1 sshd-session[194007]: Received disconnect from 14.225.167.110 port 32780:11: Bye Bye [preauth]
Sep 30 17:54:17 compute-1 sshd-session[194007]: Disconnected from invalid user notes 14.225.167.110 port 32780 [preauth]
Sep 30 17:54:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:17.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:17 compute-1 python3.9[194748]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:17 compute-1 sudo[194745]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:54:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:18 compute-1 sudo[194900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evpibymmtdnmtycnssvfvoppuptslcge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254858.0919812-2031-96247731301599/AnsiballZ_copy.py'
Sep 30 17:54:18 compute-1 sudo[194900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:18 compute-1 python3.9[194902]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:18 compute-1 sudo[194900]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:18 compute-1 ceph-mon[75484]: pgmap v456: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:54:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:19 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c4001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:19 compute-1 sudo[195053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nywbjcpbiiaojvplftwxapouotqizcxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254858.8743296-2031-238178472497072/AnsiballZ_copy.py'
Sep 30 17:54:19 compute-1 sudo[195053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:19 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:19 compute-1 python3.9[195055]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:19 compute-1 sudo[195053]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:19 compute-1 unix_chkpwd[195057]: password check failed for user (root)
Sep 30 17:54:19 compute-1 sshd-session[194746]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:54:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:19.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:19.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:20 compute-1 sudo[195207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjzjjcfyvopgfkixtyrdpaabreyuplka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254859.6442657-2103-215967385994589/AnsiballZ_systemd.py'
Sep 30 17:54:20 compute-1 sudo[195207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:20 compute-1 python3.9[195209]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:54:20 compute-1 systemd[1]: Reloading.
Sep 30 17:54:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:20 compute-1 systemd-rc-local-generator[195236]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:54:20 compute-1 systemd-sysv-generator[195239]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:54:20 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Sep 30 17:54:20 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Sep 30 17:54:20 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Sep 30 17:54:20 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Sep 30 17:54:20 compute-1 systemd[1]: Starting libvirt logging daemon...
Sep 30 17:54:20 compute-1 systemd[1]: Started libvirt logging daemon.
Sep 30 17:54:20 compute-1 sudo[195207]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:20 compute-1 ceph-mon[75484]: pgmap v457: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:54:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:21 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:21 compute-1 sshd-session[194746]: Failed password for root from 192.210.160.141 port 54552 ssh2
Sep 30 17:54:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:21 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:21 compute-1 sudo[195402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urvxgdggosbidzlmeezmtxwjgaommnus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254861.0937018-2103-210977058624331/AnsiballZ_systemd.py'
Sep 30 17:54:21 compute-1 sudo[195402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:21.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:21 compute-1 python3.9[195404]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:54:21 compute-1 systemd[1]: Reloading.
Sep 30 17:54:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:21.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:21 compute-1 systemd-rc-local-generator[195433]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:54:21 compute-1 systemd-sysv-generator[195437]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:54:22 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Sep 30 17:54:22 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Sep 30 17:54:22 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Sep 30 17:54:22 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Sep 30 17:54:22 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Sep 30 17:54:22 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Sep 30 17:54:22 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Sep 30 17:54:22 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Sep 30 17:54:22 compute-1 systemd[1]: Started libvirt nodedev daemon.
Sep 30 17:54:22 compute-1 sudo[195402]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:22 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Sep 30 17:54:22 compute-1 sshd-session[195353]: Invalid user open from 216.10.242.161 port 57666
Sep 30 17:54:22 compute-1 sshd-session[195353]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:54:22 compute-1 sshd-session[195353]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 17:54:22 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Sep 30 17:54:22 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Sep 30 17:54:22 compute-1 sshd-session[194746]: Connection closed by authenticating user root 192.210.160.141 port 54552 [preauth]
Sep 30 17:54:22 compute-1 sudo[195629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfogiwrxarohixcuyntsubtcsvtaookm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254862.3715851-2103-270296987084856/AnsiballZ_systemd.py'
Sep 30 17:54:22 compute-1 sudo[195629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:22 compute-1 ceph-mon[75484]: pgmap v458: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:54:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:54:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:23 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c4001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:54:23 compute-1 python3.9[195631]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:54:23 compute-1 systemd[1]: Reloading.
Sep 30 17:54:23 compute-1 systemd-sysv-generator[195661]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:54:23 compute-1 systemd-rc-local-generator[195655]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:54:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:23 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:23 compute-1 setroubleshoot[195444]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 23839666-5c85-49b5-95dc-a7e8745990d5
Sep 30 17:54:23 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Sep 30 17:54:23 compute-1 setroubleshoot[195444]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Sep 30 17:54:23 compute-1 setroubleshoot[195444]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 23839666-5c85-49b5-95dc-a7e8745990d5
Sep 30 17:54:23 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Sep 30 17:54:23 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Sep 30 17:54:23 compute-1 setroubleshoot[195444]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Sep 30 17:54:23 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Sep 30 17:54:23 compute-1 systemd[1]: Starting libvirt proxy daemon...
Sep 30 17:54:23 compute-1 systemd[1]: Started libvirt proxy daemon.
Sep 30 17:54:23 compute-1 sudo[195629]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:23 compute-1 sudo[195692]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:54:23 compute-1 sudo[195692]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:54:23 compute-1 sudo[195692]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:23.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:23.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:23 compute-1 ceph-mon[75484]: pgmap v459: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:54:24 compute-1 sudo[195867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsyanwbomlcqttmozmklpulzyeydwupr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254863.8908594-2103-123562430500672/AnsiballZ_systemd.py'
Sep 30 17:54:24 compute-1 sudo[195867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:24 compute-1 sshd-session[195353]: Failed password for invalid user open from 216.10.242.161 port 57666 ssh2
Sep 30 17:54:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:24 compute-1 sshd-session[195353]: Received disconnect from 216.10.242.161 port 57666:11: Bye Bye [preauth]
Sep 30 17:54:24 compute-1 sshd-session[195353]: Disconnected from invalid user open 216.10.242.161 port 57666 [preauth]
Sep 30 17:54:24 compute-1 python3.9[195869]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:54:24 compute-1 systemd[1]: Reloading.
Sep 30 17:54:24 compute-1 systemd-rc-local-generator[195895]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:54:24 compute-1 systemd-sysv-generator[195899]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:54:25 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Sep 30 17:54:25 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Sep 30 17:54:25 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 30 17:54:25 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Sep 30 17:54:25 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Sep 30 17:54:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:25 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:25 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Sep 30 17:54:25 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Sep 30 17:54:25 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Sep 30 17:54:25 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Sep 30 17:54:25 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Sep 30 17:54:25 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Sep 30 17:54:25 compute-1 systemd[1]: Started libvirt QEMU daemon.
Sep 30 17:54:25 compute-1 sudo[195867]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:25 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:25.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:25 compute-1 sudo[196082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zichcizvlvxnamovkofbnxpkomtqtsfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254865.363641-2103-67366428001748/AnsiballZ_systemd.py'
Sep 30 17:54:25 compute-1 sudo[196082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:25.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:26 compute-1 python3.9[196084]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:54:26 compute-1 systemd[1]: Reloading.
Sep 30 17:54:26 compute-1 systemd-rc-local-generator[196114]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:54:26 compute-1 systemd-sysv-generator[196119]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:54:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:26 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Sep 30 17:54:26 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Sep 30 17:54:26 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Sep 30 17:54:26 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Sep 30 17:54:26 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Sep 30 17:54:26 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Sep 30 17:54:26 compute-1 systemd[1]: Starting libvirt secret daemon...
Sep 30 17:54:26 compute-1 systemd[1]: Started libvirt secret daemon.
Sep 30 17:54:26 compute-1 ceph-mon[75484]: pgmap v460: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:54:26 compute-1 sudo[196082]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:26 compute-1 podman[196144]: 2025-09-30 17:54:26.667142179 +0000 UTC m=+0.089485985 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 17:54:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:27 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c4001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:27 compute-1 sudo[196313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqzynwfobeswzoefpthdpnsfuwbpjfuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254866.8952107-2177-280756334213654/AnsiballZ_file.py'
Sep 30 17:54:27 compute-1 sudo[196313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:27 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:27 compute-1 python3.9[196315]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:27 compute-1 sudo[196313]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:27.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:27.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:54:28 compute-1 sudo[196466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpnioejwzkrjvmuluadqkbbhmeqnpddr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254867.7432568-2193-90990662722289/AnsiballZ_find.py'
Sep 30 17:54:28 compute-1 sudo[196466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175428 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:54:28 compute-1 python3.9[196468]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 17:54:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:28 compute-1 sudo[196466]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:28 compute-1 ceph-mon[75484]: pgmap v461: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:54:29 compute-1 sudo[196620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdafttqiqlihuhkhwuiozkofmrpiodro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254868.6711478-2209-124585110298460/AnsiballZ_command.py'
Sep 30 17:54:29 compute-1 sudo[196620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:29 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:29 compute-1 python3.9[196622]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                             echo ceph
                                             awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:54:29 compute-1 sudo[196620]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:29 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:29.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:29.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:30 compute-1 python3.9[196776]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 17:54:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:30 compute-1 ceph-mon[75484]: pgmap v462: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:54:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:31 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c4001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:31 compute-1 python3.9[196928]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:31 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:31.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:31 compute-1 python3.9[197049]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254870.5167267-2247-230197514851920/.source.xml follow=False _original_basename=secret.xml.j2 checksum=5cde69df6d2b570990e604ddf8058f3ae944d5fb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:31.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:32 compute-1 sudo[197200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txybduueccyovbemjjfqmzbldmbhytql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254871.9295294-2277-224804035498865/AnsiballZ_command.py'
Sep 30 17:54:32 compute-1 sudo[197200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:32 compute-1 python3.9[197202]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 63d32c6a-fa18-54ed-8711-9a3915cc367b
                                             virsh secret-define --file /tmp/secret.xml
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:54:32 compute-1 polkitd[6874]: Registered Authentication Agent for unix-process:197204:1271415 (system bus name :1.2033 [/usr/bin/pkttyagent --process 197204 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Sep 30 17:54:32 compute-1 polkitd[6874]: Unregistered Authentication Agent for unix-process:197204:1271415 (system bus name :1.2033, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Sep 30 17:54:32 compute-1 polkitd[6874]: Registered Authentication Agent for unix-process:197203:1271415 (system bus name :1.2034 [/usr/bin/pkttyagent --process 197203 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Sep 30 17:54:32 compute-1 polkitd[6874]: Unregistered Authentication Agent for unix-process:197203:1271415 (system bus name :1.2034, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Sep 30 17:54:32 compute-1 sudo[197200]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:32 compute-1 ceph-mon[75484]: pgmap v463: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:54:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:33 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a4002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:54:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:33 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:33 compute-1 python3.9[197365]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:33 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Sep 30 17:54:33 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Sep 30 17:54:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:33.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:33.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:34 compute-1 sudo[197516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnryurpidhdnbcusolhfqghcttpsrsnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254873.8192348-2309-192844076219420/AnsiballZ_command.py'
Sep 30 17:54:34 compute-1 sudo[197516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:34 compute-1 sudo[197516]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:34 compute-1 ceph-mon[75484]: pgmap v464: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:54:35 compute-1 sudo[197670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlnafxuqzlmqrlqnkbjjxjncylrsmior ; FSID=63d32c6a-fa18-54ed-8711-9a3915cc367b KEY=AQDxFNxoAAAAABAAAVqrvevrN1uM+kO3r0Scwg== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254874.6545224-2325-160436601670016/AnsiballZ_command.py'
Sep 30 17:54:35 compute-1 sudo[197670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:35 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c4001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:35 compute-1 polkitd[6874]: Registered Authentication Agent for unix-process:197673:1271685 (system bus name :1.2037 [/usr/bin/pkttyagent --process 197673 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Sep 30 17:54:35 compute-1 polkitd[6874]: Unregistered Authentication Agent for unix-process:197673:1271685 (system bus name :1.2037, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Sep 30 17:54:35 compute-1 sudo[197670]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:35 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:35.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:35.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:35 compute-1 sudo[197828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxjrzqixysvzvbkdxtmbfzxdjfcmoiav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254875.5421534-2341-140420438007464/AnsiballZ_copy.py'
Sep 30 17:54:35 compute-1 sudo[197828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:36 compute-1 python3.9[197830]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:36 compute-1 sudo[197828]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:36 compute-1 sudo[197982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwxcnqjdgpvyefaxredoonwtlklttifb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254876.316599-2357-140099689500557/AnsiballZ_stat.py'
Sep 30 17:54:36 compute-1 sudo[197982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:36 compute-1 ceph-mon[75484]: pgmap v465: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:54:36 compute-1 python3.9[197984]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:36 compute-1 sudo[197982]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:37 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:37 compute-1 sudo[198105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocvqhayjkkzeuaaxspkjniibmdjqkpgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254876.316599-2357-140099689500557/AnsiballZ_copy.py'
Sep 30 17:54:37 compute-1 sudo[198105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:37 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:37 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:54:37 compute-1 python3.9[198107]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254876.316599-2357-140099689500557/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:37 compute-1 sudo[198105]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:37.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:54:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:37.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:54:38 compute-1 sudo[198258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fekknktbvnysdmpppjrnpysifgpgnqec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254877.8141773-2389-94544162050784/AnsiballZ_file.py'
Sep 30 17:54:38 compute-1 sudo[198258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:38 compute-1 python3.9[198260]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:38 compute-1 sudo[198258]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:38 compute-1 ceph-mon[75484]: pgmap v466: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:54:39 compute-1 sudo[198411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlmvxlozvhruqhdzumahejlxokwsflzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254878.6886368-2405-167801124384451/AnsiballZ_stat.py'
Sep 30 17:54:39 compute-1 sudo[198411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:39 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c4001c00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:39 compute-1 python3.9[198413]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:39 compute-1 sudo[198411]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:39 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:39 compute-1 sudo[198489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-envzgwnincjwzmdahmgcjgjprmwjlryo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254878.6886368-2405-167801124384451/AnsiballZ_file.py'
Sep 30 17:54:39 compute-1 sudo[198489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:39 compute-1 sshd-session[196595]: error: kex_exchange_identification: read: Connection timed out
Sep 30 17:54:39 compute-1 sshd-session[196595]: banner exchange: Connection from 14.103.129.43 port 55632: Connection timed out
Sep 30 17:54:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:39.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:39 compute-1 python3.9[198491]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:39 compute-1 sudo[198489]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:39.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:40 compute-1 sudo[198644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuiykpuadimelbhxfkzhanxvuzxmdcup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254879.9786842-2429-131433436871714/AnsiballZ_stat.py'
Sep 30 17:54:40 compute-1 sudo[198644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:40 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:54:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:40 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:54:40 compute-1 python3.9[198646]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:40 compute-1 sudo[198644]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:40 compute-1 ceph-mon[75484]: pgmap v467: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 17:54:40 compute-1 sudo[198723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmjzxvjdqckirrhnqnrtcfyjoqktfjjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254879.9786842-2429-131433436871714/AnsiballZ_file.py'
Sep 30 17:54:40 compute-1 sudo[198723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:41 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:41 compute-1 python3.9[198725]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.oz2wv_t9 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:41 compute-1 sudo[198723]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:41 compute-1 sshd-session[198592]: Invalid user tx from 84.51.43.58 port 64023
Sep 30 17:54:41 compute-1 sshd-session[198592]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:54:41 compute-1 sshd-session[198592]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 17:54:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:41 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:41.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:41 compute-1 sudo[198877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmekogqrsqnmlzdynjdyctcpzxvaxtjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254881.3792682-2453-272208278430913/AnsiballZ_stat.py'
Sep 30 17:54:41 compute-1 sudo[198877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:41.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:41 compute-1 python3.9[198879]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:42 compute-1 sudo[198877]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:42 compute-1 sudo[198956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uytnnktxtjtfsocxacmzrcsmwhjrlxsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254881.3792682-2453-272208278430913/AnsiballZ_file.py'
Sep 30 17:54:42 compute-1 sudo[198956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:42 compute-1 python3.9[198958]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:42 compute-1 sudo[198956]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:42 compute-1 ceph-mon[75484]: pgmap v468: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 17:54:42 compute-1 sshd-session[198592]: Failed password for invalid user tx from 84.51.43.58 port 64023 ssh2
Sep 30 17:54:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:43 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c4001c00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:54:43 compute-1 sudo[199109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xskgialxfuxqsdotitpfaxjxkbbtepaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254882.7764935-2479-42912978078827/AnsiballZ_command.py'
Sep 30 17:54:43 compute-1 sudo[199109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:43 compute-1 sshd-session[198726]: Invalid user dev from 192.210.160.141 port 39746
Sep 30 17:54:43 compute-1 sshd-session[198726]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:54:43 compute-1 sshd-session[198726]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 17:54:43 compute-1 sshd-session[198592]: Received disconnect from 84.51.43.58 port 64023:11: Bye Bye [preauth]
Sep 30 17:54:43 compute-1 sshd-session[198592]: Disconnected from invalid user tx 84.51.43.58 port 64023 [preauth]
Sep 30 17:54:43 compute-1 python3.9[199111]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:54:43 compute-1 sudo[199109]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:43 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:43 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:54:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:43.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:43 compute-1 sudo[199189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:54:43 compute-1 sudo[199189]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:54:43 compute-1 sudo[199189]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:43.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:44 compute-1 sudo[199288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htopcaapzsryitwgzbqtjlwnnzxwedhd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759254883.5592463-2495-77561720109102/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 17:54:44 compute-1 sudo[199288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:44 compute-1 python3[199290]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 17:54:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:44 compute-1 sudo[199288]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:44 compute-1 ceph-mon[75484]: pgmap v469: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.6 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:54:45 compute-1 sshd-session[198726]: Failed password for invalid user dev from 192.210.160.141 port 39746 ssh2
Sep 30 17:54:45 compute-1 sudo[199441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prikrmhayrsmamiagqtavitzaqnrllqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254884.6108763-2511-193648350270751/AnsiballZ_stat.py'
Sep 30 17:54:45 compute-1 sudo[199441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:45 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:45 compute-1 python3.9[199443]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:45 compute-1 sudo[199441]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:45 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:45 compute-1 sudo[199521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzipcubumpcmzsbeoaubpbelznvrinnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254884.6108763-2511-193648350270751/AnsiballZ_file.py'
Sep 30 17:54:45 compute-1 sudo[199521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:45.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:45 compute-1 sshd-session[198726]: Connection closed by invalid user dev 192.210.160.141 port 39746 [preauth]
Sep 30 17:54:45 compute-1 python3.9[199523]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:45 compute-1 sudo[199521]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:45.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:46 compute-1 sudo[199674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utvjnikpyuuessxoeszsbzquheygjltm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254886.0414484-2536-183595757346427/AnsiballZ_stat.py'
Sep 30 17:54:46 compute-1 sudo[199674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:46 compute-1 python3.9[199676]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:46 compute-1 sudo[199674]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:46 compute-1 ceph-mon[75484]: pgmap v470: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:54:46 compute-1 sudo[199766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyagqxjqbbtyizcwsvacafgjmzluvrvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254886.0414484-2536-183595757346427/AnsiballZ_file.py'
Sep 30 17:54:46 compute-1 sudo[199766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:47 compute-1 podman[199727]: 2025-09-30 17:54:47.026080914 +0000 UTC m=+0.134180296 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller)
Sep 30 17:54:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:47 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:47 compute-1 python3.9[199772]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:47 compute-1 sudo[199766]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:47 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a4003430 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:47.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:47 compute-1 sudo[199930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ernwbknmxhawyvocgcsswicbpqocurfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254887.40915-2560-58039283601335/AnsiballZ_stat.py'
Sep 30 17:54:47 compute-1 sudo[199930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:47.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:54:48 compute-1 python3.9[199932]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:48 compute-1 sudo[199930]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:48 compute-1 sudo[200009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqyhrkpeuiowojbhxqkxmerobimorxru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254887.40915-2560-58039283601335/AnsiballZ_file.py'
Sep 30 17:54:48 compute-1 sudo[200009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:48 compute-1 python3.9[200011]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:48 compute-1 sudo[200009]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:48 compute-1 sudo[200055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:54:48 compute-1 sudo[200055]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:54:48 compute-1 sudo[200055]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:48 compute-1 ceph-mon[75484]: pgmap v471: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:54:48 compute-1 sudo[200108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:54:48 compute-1 sudo[200108]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:54:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:49 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d0000df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:49 compute-1 sudo[200224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnmrcjsluwbmvdgkudauomvaikgkoprm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254888.775838-2583-229560080161585/AnsiballZ_stat.py'
Sep 30 17:54:49 compute-1 sudo[200224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:49 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a0000d00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:49 compute-1 python3.9[200226]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:49 compute-1 sudo[200224]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:49 compute-1 sudo[200108]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:49.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:49 compute-1 sudo[200321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgvlytjbdaznkevayusfmaniqfnyisxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254888.775838-2583-229560080161585/AnsiballZ_file.py'
Sep 30 17:54:49 compute-1 sudo[200321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:49.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:49 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:54:49 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:54:49 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:54:49 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:54:49 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:54:49 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:54:49 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:54:50 compute-1 python3.9[200323]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:50 compute-1 sudo[200321]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.294791) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254890294845, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4224, "num_deletes": 502, "total_data_size": 10899072, "memory_usage": 11036984, "flush_reason": "Manual Compaction"}
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Sep 30 17:54:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175450 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254890324736, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 4078032, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13395, "largest_seqno": 17614, "table_properties": {"data_size": 4067031, "index_size": 6087, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3845, "raw_key_size": 30205, "raw_average_key_size": 19, "raw_value_size": 4040626, "raw_average_value_size": 2656, "num_data_blocks": 273, "num_entries": 1521, "num_filter_entries": 1521, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759254484, "oldest_key_time": 1759254484, "file_creation_time": 1759254890, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 30007 microseconds, and 15616 cpu microseconds.
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.324799) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 4078032 bytes OK
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.324826) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.326584) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.326650) EVENT_LOG_v1 {"time_micros": 1759254890326602, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.326679) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 10880282, prev total WAL file size 10880282, number of live WAL files 2.
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.330419) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323531' seq:72057594037927935, type:22 .. '6D67727374617400353033' seq:0, type:0; will stop at (end)
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(3982KB)], [27(10080KB)]
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254890330491, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 14400300, "oldest_snapshot_seqno": -1}
Sep 30 17:54:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4728 keys, 11302960 bytes, temperature: kUnknown
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254890410797, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 11302960, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11270189, "index_size": 19861, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11845, "raw_key_size": 118009, "raw_average_key_size": 24, "raw_value_size": 11183206, "raw_average_value_size": 2365, "num_data_blocks": 840, "num_entries": 4728, "num_filter_entries": 4728, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759254890, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.411101) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 11302960 bytes
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.412677) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 179.2 rd, 140.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 9.8 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(6.3) write-amplify(2.8) OK, records in: 5554, records dropped: 826 output_compression: NoCompression
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.412703) EVENT_LOG_v1 {"time_micros": 1759254890412690, "job": 14, "event": "compaction_finished", "compaction_time_micros": 80381, "compaction_time_cpu_micros": 44454, "output_level": 6, "num_output_files": 1, "total_output_size": 11302960, "num_input_records": 5554, "num_output_records": 4728, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254890414090, "job": 14, "event": "table_file_deletion", "file_number": 29}
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254890416803, "job": 14, "event": "table_file_deletion", "file_number": 27}
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.330335) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.416931) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.416940) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.416944) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.416947) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:54:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:54:50.416949) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:54:50 compute-1 sudo[200475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnxnggqtkznzmfnbxxssfufuispblbrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254890.2759454-2607-32547172176746/AnsiballZ_stat.py'
Sep 30 17:54:50 compute-1 sudo[200475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:50 compute-1 ceph-mon[75484]: pgmap v472: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.6 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:54:50 compute-1 python3.9[200477]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:51 compute-1 sudo[200475]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:51 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:51 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a4003430 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:51 compute-1 sudo[200600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxilrsnyuqxgrklbdcfzdlldxqbtygmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254890.2759454-2607-32547172176746/AnsiballZ_copy.py'
Sep 30 17:54:51 compute-1 sudo[200600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:51 compute-1 python3.9[200602]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759254890.2759454-2607-32547172176746/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:51.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:51 compute-1 sudo[200600]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:51.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:52 compute-1 sudo[200753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldauqatiprikxxiyfxcoidfkpumixwiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254891.932361-2637-153343274434705/AnsiballZ_file.py'
Sep 30 17:54:52 compute-1 sudo[200753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:52 compute-1 python3.9[200755]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:52 compute-1 sudo[200753]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:52 compute-1 ceph-mon[75484]: pgmap v473: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 17:54:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:54:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:53 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d00020a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:54:53 compute-1 sudo[200906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diyoxbzruzlnizbpmvtcfktcrxeeutyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254892.7372074-2653-26995188477647/AnsiballZ_command.py'
Sep 30 17:54:53 compute-1 sudo[200906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:53 compute-1 python3.9[200908]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:54:53 compute-1 sudo[200906]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:53 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a0001840 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:53.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:53.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:53 compute-1 sudo[201064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taglwqtrasaxrvtasbzwlienfdxcsyom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254893.5339565-2669-155013302745707/AnsiballZ_blockinfile.py'
Sep 30 17:54:53 compute-1 sudo[201064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:54 compute-1 ceph-mon[75484]: pgmap v474: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 341 B/s wr, 1 op/s
Sep 30 17:54:54 compute-1 python3.9[201066]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:54 compute-1 sudo[201064]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:54 compute-1 sshd-session[201039]: Invalid user sol from 45.148.10.240 port 40066
Sep 30 17:54:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:54:54.294 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 17:54:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:54:54.295 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 17:54:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:54:54.295 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 17:54:54 compute-1 sshd-session[201039]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:54:54 compute-1 sshd-session[201039]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.148.10.240
Sep 30 17:54:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:55 compute-1 sudo[201218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idueydgbfhzrpsztvmgzzhatohaarfre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254894.5201283-2687-194754486176081/AnsiballZ_command.py'
Sep 30 17:54:55 compute-1 sudo[201218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:55 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:55 compute-1 sudo[201221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:54:55 compute-1 sudo[201221]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:54:55 compute-1 python3.9[201220]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:54:55 compute-1 sudo[201221]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:55 compute-1 sudo[201218]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:55 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a40041d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:55.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:55.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:55 compute-1 sudo[201398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejwnafziwrpipzogonzjqvgawcxrvicd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254895.505458-2703-111358885791061/AnsiballZ_stat.py'
Sep 30 17:54:55 compute-1 sudo[201398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:56 compute-1 python3.9[201400]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:54:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:54:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:54:56 compute-1 ceph-mon[75484]: pgmap v475: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:54:56 compute-1 sudo[201398]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:56 compute-1 sudo[201554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjmamdnqppeaxfuoryujuorfojvhoxuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254896.2942295-2719-156517138544049/AnsiballZ_command.py'
Sep 30 17:54:56 compute-1 sudo[201554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:56 compute-1 unix_chkpwd[201557]: password check failed for user (root)
Sep 30 17:54:56 compute-1 sshd-session[201292]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 17:54:56 compute-1 python3.9[201556]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:54:56 compute-1 sudo[201554]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:56 compute-1 sshd-session[201039]: Failed password for invalid user sol from 45.148.10.240 port 40066 ssh2
Sep 30 17:54:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:57 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d00020a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:57 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a0001840 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:57 compute-1 sudo[201722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjlukgbuyimkdemjwyrqnunffbsilpyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254897.108827-2735-46856708566901/AnsiballZ_file.py'
Sep 30 17:54:57 compute-1 sudo[201722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:57 compute-1 podman[201684]: 2025-09-30 17:54:57.527807558 +0000 UTC m=+0.113363391 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 17:54:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:57.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:54:57 compute-1 python3.9[201727]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:57 compute-1 sudo[201722]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:57.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:54:58 compute-1 sudo[201882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etakxdikbgmniklophkrutiyeydaburg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254897.9569132-2751-237888063935074/AnsiballZ_stat.py'
Sep 30 17:54:58 compute-1 sudo[201882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:58 compute-1 python3.9[201884]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:54:58 compute-1 sudo[201882]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:58 compute-1 ceph-mon[75484]: pgmap v476: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:54:58 compute-1 sshd-session[201039]: Connection closed by invalid user sol 45.148.10.240 port 40066 [preauth]
Sep 30 17:54:58 compute-1 sshd-session[201292]: Failed password for root from 175.126.165.170 port 58486 ssh2
Sep 30 17:54:58 compute-1 sudo[202008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mamhgluxorekdzcrlguzaojcvozdrrzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254897.9569132-2751-237888063935074/AnsiballZ_copy.py'
Sep 30 17:54:59 compute-1 sudo[202008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:59 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:59 compute-1 python3.9[202010]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254897.9569132-2751-237888063935074/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:54:59 compute-1 sudo[202008]: pam_unix(sudo:session): session closed for user root
Sep 30 17:54:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:54:59 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a40041d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:54:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:54:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:54:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:54:59 compute-1 sshd-session[201292]: Received disconnect from 175.126.165.170 port 58486:11: Bye Bye [preauth]
Sep 30 17:54:59 compute-1 sshd-session[201292]: Disconnected from authenticating user root 175.126.165.170 port 58486 [preauth]
Sep 30 17:54:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:54:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:54:59.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:54:59 compute-1 sudo[202162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmklkbllahjgixvfkxvniydnmchvkpzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254899.4425852-2781-260655312801349/AnsiballZ_stat.py'
Sep 30 17:54:59 compute-1 sudo[202162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:54:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:54:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:54:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:54:59.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:00 compute-1 python3.9[202164]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:55:00 compute-1 sudo[202162]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:00 compute-1 unix_chkpwd[202189]: password check failed for user (root)
Sep 30 17:55:00 compute-1 sshd-session[202011]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65  user=root
Sep 30 17:55:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:00 compute-1 sudo[202287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnncoxdhfdmfunxvrqvlorfoiwlsgdbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254899.4425852-2781-260655312801349/AnsiballZ_copy.py'
Sep 30 17:55:00 compute-1 sudo[202287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:00 compute-1 ceph-mon[75484]: pgmap v477: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:55:00 compute-1 python3.9[202289]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254899.4425852-2781-260655312801349/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:55:00 compute-1 sudo[202287]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:01 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d0002240 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:01 compute-1 sudo[202440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahljsdsjwrevsxlizstdcywpnengxngd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254900.9066577-2811-181331098394767/AnsiballZ_stat.py'
Sep 30 17:55:01 compute-1 sudo[202440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:01 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a0001840 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:01 compute-1 python3.9[202442]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:55:01 compute-1 sudo[202440]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:01.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:01.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:01 compute-1 sudo[202563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmvvuazcakfjuqdovowlbrxiinydoamp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254900.9066577-2811-181331098394767/AnsiballZ_copy.py'
Sep 30 17:55:01 compute-1 sudo[202563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:02 compute-1 python3.9[202565]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254900.9066577-2811-181331098394767/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:55:02 compute-1 sudo[202563]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:02 compute-1 unix_chkpwd[202637]: password check failed for user (root)
Sep 30 17:55:02 compute-1 sshd-session[201987]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=113.249.93.94  user=root
Sep 30 17:55:02 compute-1 sshd-session[202011]: Failed password for root from 194.107.115.65 port 27630 ssh2
Sep 30 17:55:02 compute-1 ceph-mon[75484]: pgmap v478: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:02 compute-1 sudo[202718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulvfpxkzfjnirnxaytkgazukstmfybpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254902.3994055-2841-252072014657346/AnsiballZ_systemd.py'
Sep 30 17:55:02 compute-1 sudo[202718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:03 compute-1 sshd-session[202011]: Received disconnect from 194.107.115.65 port 27630:11: Bye Bye [preauth]
Sep 30 17:55:03 compute-1 sshd-session[202011]: Disconnected from authenticating user root 194.107.115.65 port 27630 [preauth]
Sep 30 17:55:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:03 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:55:03 compute-1 python3.9[202720]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:55:03 compute-1 systemd[1]: Reloading.
Sep 30 17:55:03 compute-1 systemd-rc-local-generator[202748]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:55:03 compute-1 systemd-sysv-generator[202753]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:55:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:03 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a40041d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:03 compute-1 sshd-session[202757]: Invalid user colin from 107.172.146.104 port 45260
Sep 30 17:55:03 compute-1 sshd-session[202757]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:55:03 compute-1 sshd-session[202757]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 17:55:03 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Sep 30 17:55:03 compute-1 sudo[202718]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:03.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:03 compute-1 sudo[202810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:55:03 compute-1 sudo[202810]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:55:03 compute-1 sudo[202810]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:03.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:04 compute-1 sudo[202938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntcqbppctrijzjweuiwfsmimnvvocadt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254903.7680993-2857-152049231923229/AnsiballZ_systemd.py'
Sep 30 17:55:04 compute-1 sudo[202938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:04 compute-1 sshd-session[201987]: Failed password for root from 113.249.93.94 port 59280 ssh2
Sep 30 17:55:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:04 compute-1 python3.9[202940]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Sep 30 17:55:04 compute-1 systemd[1]: Reloading.
Sep 30 17:55:04 compute-1 systemd-rc-local-generator[202969]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:55:04 compute-1 systemd-sysv-generator[202974]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:55:04 compute-1 ceph-mon[75484]: pgmap v479: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:55:04 compute-1 systemd[1]: Reloading.
Sep 30 17:55:05 compute-1 systemd-rc-local-generator[203004]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:55:05 compute-1 systemd-sysv-generator[203010]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:55:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:05 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d00023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:05 compute-1 sudo[202938]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:05 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a0002cb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:05 compute-1 sshd-session[201987]: Received disconnect from 113.249.93.94 port 59280:11: Bye Bye [preauth]
Sep 30 17:55:05 compute-1 sshd-session[201987]: Disconnected from authenticating user root 113.249.93.94 port 59280 [preauth]
Sep 30 17:55:05 compute-1 sshd-session[202757]: Failed password for invalid user colin from 107.172.146.104 port 45260 ssh2
Sep 30 17:55:05 compute-1 sshd-session[144782]: Connection closed by 192.168.122.30 port 37688
Sep 30 17:55:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:05.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:05 compute-1 sshd-session[144779]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:55:05 compute-1 systemd[1]: session-53.scope: Deactivated successfully.
Sep 30 17:55:05 compute-1 systemd[1]: session-53.scope: Consumed 4min 5.288s CPU time.
Sep 30 17:55:05 compute-1 systemd-logind[789]: Session 53 logged out. Waiting for processes to exit.
Sep 30 17:55:05 compute-1 systemd-logind[789]: Removed session 53.
Sep 30 17:55:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:05.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:06 compute-1 ceph-mon[75484]: pgmap v480: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:06 compute-1 sshd-session[202757]: Received disconnect from 107.172.146.104 port 45260:11: Bye Bye [preauth]
Sep 30 17:55:06 compute-1 sshd-session[202757]: Disconnected from invalid user colin 107.172.146.104 port 45260 [preauth]
Sep 30 17:55:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:07 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:07 compute-1 unix_chkpwd[203042]: password check failed for user (root)
Sep 30 17:55:07 compute-1 sshd-session[202978]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:55:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:07 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a40041d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:55:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:07.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:07.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:55:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:08 compute-1 ceph-mon[75484]: pgmap v481: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:08 compute-1 sshd-session[202978]: Failed password for root from 192.210.160.141 port 56268 ssh2
Sep 30 17:55:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:09 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d0009990 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:09 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a0002cb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:09.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:09.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:10 compute-1 sshd-session[202978]: Connection closed by authenticating user root 192.210.160.141 port 56268 [preauth]
Sep 30 17:55:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:10 compute-1 ceph-mon[75484]: pgmap v482: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:55:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:11 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:11 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:11 compute-1 sshd-session[203047]: Accepted publickey for zuul from 192.168.122.30 port 44898 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:55:11 compute-1 systemd-logind[789]: New session 54 of user zuul.
Sep 30 17:55:11 compute-1 systemd[1]: Started Session 54 of User zuul.
Sep 30 17:55:11 compute-1 sshd-session[203047]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:55:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:11.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:11.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:12 compute-1 python3.9[203201]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:55:12 compute-1 ceph-mon[75484]: pgmap v483: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:55:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d0009990 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a0002cb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:13.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:13 compute-1 sudo[203356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eenjzhdqxuthmlemgpxbhijbyaofvpgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254913.3268805-49-122200040265399/AnsiballZ_file.py'
Sep 30 17:55:13 compute-1 sudo[203356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:13.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:14 compute-1 python3.9[203358]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:55:14 compute-1 sudo[203356]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:14 compute-1 sudo[203509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfumjyutxqfhckqmijgnxtbttfzitozc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254914.2774754-49-22272233055138/AnsiballZ_file.py'
Sep 30 17:55:14 compute-1 sudo[203509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:14 compute-1 ceph-mon[75484]: pgmap v484: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:55:14 compute-1 python3.9[203511]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:55:14 compute-1 sudo[203509]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:15 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a40041d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:15 compute-1 sudo[203663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnfnpyihqjozwezocbombclbpalfcada ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254914.9476302-49-234958000461909/AnsiballZ_file.py'
Sep 30 17:55:15 compute-1 sudo[203663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:15 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:15 compute-1 python3.9[203665]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:55:15 compute-1 sudo[203663]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:15.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:15.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:16 compute-1 sudo[203816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erggrdencjcmqsyarffrrjqfxmubfyas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254915.7229466-49-143523733902391/AnsiballZ_file.py'
Sep 30 17:55:16 compute-1 sudo[203816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:16 compute-1 python3.9[203818]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 17:55:16 compute-1 sudo[203816]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:16 compute-1 sudo[203969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onhuzdwvrujyeseynwtvbadzissfldmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254916.4307637-49-239181673848401/AnsiballZ_file.py'
Sep 30 17:55:16 compute-1 sudo[203969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:16 compute-1 ceph-mon[75484]: pgmap v485: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:16 compute-1 python3.9[203971]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:55:16 compute-1 sudo[203969]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:17 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d000a6a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:17 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d000a6a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:17 compute-1 podman[204053]: 2025-09-30 17:55:17.616384489 +0000 UTC m=+0.157286751 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 17:55:17 compute-1 sudo[204148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbpahwjsweuqpzymzkjggxyvgffojcsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254917.1895366-121-93948099425421/AnsiballZ_stat.py'
Sep 30 17:55:17 compute-1 sudo[204148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:17.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:17.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:17 compute-1 python3.9[204150]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:55:17 compute-1 sudo[204148]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:55:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:18 compute-1 ceph-mon[75484]: pgmap v486: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:18 compute-1 sudo[204304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txiucmckghkdsrpkablisuehjvullwdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254918.1625-137-226988109271870/AnsiballZ_systemd.py'
Sep 30 17:55:18 compute-1 sudo[204304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:19 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c4001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:19 compute-1 python3.9[204306]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:55:19 compute-1 systemd[1]: Reloading.
Sep 30 17:55:19 compute-1 systemd-rc-local-generator[204336]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:55:19 compute-1 systemd-sysv-generator[204339]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:55:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:19 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:19 compute-1 sudo[204304]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:19.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:19.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:20 compute-1 sudo[204494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxtqsyctleqhisybyaglyzoudgabxgjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254919.7961123-153-190712343032815/AnsiballZ_service_facts.py'
Sep 30 17:55:20 compute-1 sudo[204494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:20 compute-1 python3.9[204496]: ansible-ansible.builtin.service_facts Invoked
Sep 30 17:55:20 compute-1 ceph-mon[75484]: pgmap v487: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:55:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:21 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d000a6a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:21 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d000a6a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:21 compute-1 network[204514]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 17:55:21 compute-1 network[204515]: 'network-scripts' will be removed from distribution in near future.
Sep 30 17:55:21 compute-1 network[204516]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 17:55:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:21.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:21.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:22 compute-1 ceph-mon[75484]: pgmap v488: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:55:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:55:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:23 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c4001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:23 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:23.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:23.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:23 compute-1 sudo[204573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:55:23 compute-1 sudo[204573]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:55:23 compute-1 sudo[204573]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:24 compute-1 ceph-mon[75484]: pgmap v489: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:55:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:25 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:25 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d000a6a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:25.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:25.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:26 compute-1 ceph-mon[75484]: pgmap v490: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:26 compute-1 sudo[204494]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:27 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c4002340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:27 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:27 compute-1 sudo[204819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vppswmxvtqsbfjpprmajggumawkoatto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254927.150873-169-179824932821251/AnsiballZ_systemd.py'
Sep 30 17:55:27 compute-1 sudo[204819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:27.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:27 compute-1 python3.9[204821]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:55:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:27.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:27 compute-1 systemd[1]: Reloading.
Sep 30 17:55:28 compute-1 podman[204823]: 2025-09-30 17:55:28.006113731 +0000 UTC m=+0.120994048 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 17:55:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:55:28 compute-1 systemd-rc-local-generator[204872]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:55:28 compute-1 systemd-sysv-generator[204875]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:55:28 compute-1 sudo[204819]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:28 compute-1 ceph-mon[75484]: pgmap v491: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:29 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac001fb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:29 compute-1 python3.9[205034]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:55:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:29 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d000a6a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:29.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:29.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:29 compute-1 sudo[205187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmsneyvorujouawjwnxmuznuwhgobskb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254929.4170482-203-279246003541432/AnsiballZ_podman_container.py'
Sep 30 17:55:29 compute-1 unix_chkpwd[205189]: password check failed for user (root)
Sep 30 17:55:29 compute-1 sudo[205187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:29 compute-1 sshd-session[204930]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110  user=root
Sep 30 17:55:30 compute-1 sshd-session[205035]: Invalid user jerry from 216.10.242.161 port 57190
Sep 30 17:55:30 compute-1 sshd-session[205035]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:55:30 compute-1 sshd-session[205035]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 17:55:30 compute-1 python3.9[205191]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 17:55:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:30 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 17:55:30 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 17:55:30 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 17:55:30 compute-1 podman[205203]: 2025-09-30 17:55:30.784715938 +0000 UTC m=+0.490534516 image pull f8ff303843ab104c2f5f56920f311c0b22efd49dc54152d8e2ede3a7218e9091 38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Sep 30 17:55:30 compute-1 unix_chkpwd[205259]: password check failed for user (root)
Sep 30 17:55:30 compute-1 sshd-session[204905]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:55:30 compute-1 ceph-mon[75484]: pgmap v492: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:55:30 compute-1 podman[205265]: 2025-09-30 17:55:30.95646203 +0000 UTC m=+0.056339477 container create aa991cb0709a00d44ed133530055c4bc09227100b1865f97869c7108c9a02280 (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 17:55:30 compute-1 NetworkManager[45549]: <info>  [1759254930.9840] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/22)
Sep 30 17:55:31 compute-1 kernel: podman0: port 1(veth0) entered blocking state
Sep 30 17:55:31 compute-1 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 17:55:31 compute-1 kernel: veth0: entered allmulticast mode
Sep 30 17:55:31 compute-1 kernel: veth0: entered promiscuous mode
Sep 30 17:55:31 compute-1 kernel: podman0: port 1(veth0) entered blocking state
Sep 30 17:55:31 compute-1 kernel: podman0: port 1(veth0) entered forwarding state
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.0140] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.0165] device (veth0): carrier: link connected
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.0169] device (podman0): carrier: link connected
Sep 30 17:55:31 compute-1 podman[205265]: 2025-09-30 17:55:30.931548685 +0000 UTC m=+0.031426152 image pull f8ff303843ab104c2f5f56920f311c0b22efd49dc54152d8e2ede3a7218e9091 38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Sep 30 17:55:31 compute-1 systemd-udevd[205301]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 17:55:31 compute-1 systemd-udevd[205304]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.0609] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.0623] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.0636] device (podman0): Activation: starting connection 'podman0' (a91b2e9a-ff99-4548-97e2-9b677bdf35b3)
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.0637] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.0644] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.0648] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.0651] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 17:55:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:31 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c4002340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:31 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 17:55:31 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.1015] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.1018] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.1025] device (podman0): Activation: successful, device activated.
Sep 30 17:55:31 compute-1 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Sep 30 17:55:31 compute-1 systemd[1]: Started libpod-conmon-aa991cb0709a00d44ed133530055c4bc09227100b1865f97869c7108c9a02280.scope.
Sep 30 17:55:31 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:55:31 compute-1 podman[205265]: 2025-09-30 17:55:31.367431461 +0000 UTC m=+0.467308948 container init aa991cb0709a00d44ed133530055c4bc09227100b1865f97869c7108c9a02280 (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 17:55:31 compute-1 podman[205265]: 2025-09-30 17:55:31.380982668 +0000 UTC m=+0.480860145 container start aa991cb0709a00d44ed133530055c4bc09227100b1865f97869c7108c9a02280 (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 17:55:31 compute-1 iscsid_config[205422]: iqn.1994-05.com.redhat:d7bbbc2a579e
Sep 30 17:55:31 compute-1 podman[205265]: 2025-09-30 17:55:31.384907234 +0000 UTC m=+0.484784711 container attach aa991cb0709a00d44ed133530055c4bc09227100b1865f97869c7108c9a02280 (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 17:55:31 compute-1 systemd[1]: libpod-aa991cb0709a00d44ed133530055c4bc09227100b1865f97869c7108c9a02280.scope: Deactivated successfully.
Sep 30 17:55:31 compute-1 conmon[205422]: conmon aa991cb0709a00d44ed1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-aa991cb0709a00d44ed133530055c4bc09227100b1865f97869c7108c9a02280.scope/container/memory.events
Sep 30 17:55:31 compute-1 podman[205265]: 2025-09-30 17:55:31.387653319 +0000 UTC m=+0.487530826 container died aa991cb0709a00d44ed133530055c4bc09227100b1865f97869c7108c9a02280 (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 17:55:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:31 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c80034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:31 compute-1 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 17:55:31 compute-1 kernel: veth0 (unregistering): left allmulticast mode
Sep 30 17:55:31 compute-1 kernel: veth0 (unregistering): left promiscuous mode
Sep 30 17:55:31 compute-1 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 17:55:31 compute-1 NetworkManager[45549]: <info>  [1759254931.4614] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 17:55:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:31.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:31 compute-1 systemd[1]: run-netns-netns\x2db20cc3f1\x2d9ab8\x2d2dc7\x2da6e0\x2d72eeb02b3755.mount: Deactivated successfully.
Sep 30 17:55:31 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa991cb0709a00d44ed133530055c4bc09227100b1865f97869c7108c9a02280-userdata-shm.mount: Deactivated successfully.
Sep 30 17:55:31 compute-1 systemd[1]: var-lib-containers-storage-overlay-8457806b6b0150120b7c6539358cac1bd95b263ae8838d2cea51550a7b707c13-merged.mount: Deactivated successfully.
Sep 30 17:55:31 compute-1 podman[205265]: 2025-09-30 17:55:31.862090119 +0000 UTC m=+0.961967566 container remove aa991cb0709a00d44ed133530055c4bc09227100b1865f97869c7108c9a02280 (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 17:55:31 compute-1 python3.9[205191]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True 38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest /usr/sbin/iscsi-iname
Sep 30 17:55:31 compute-1 systemd[1]: libpod-conmon-aa991cb0709a00d44ed133530055c4bc09227100b1865f97869c7108c9a02280.scope: Deactivated successfully.
Sep 30 17:55:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:31.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:31 compute-1 python3.9[205191]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: 
                                             DEPRECATED command:
                                             It is recommended to use Quadlets for running containers and pods under systemd.
                                             
                                             Please refer to podman-systemd.unit(5) for details.
                                             Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Sep 30 17:55:32 compute-1 sudo[205187]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:32 compute-1 sshd-session[204930]: Failed password for root from 14.225.167.110 port 51496 ssh2
Sep 30 17:55:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:32 compute-1 sshd-session[205035]: Failed password for invalid user jerry from 216.10.242.161 port 57190 ssh2
Sep 30 17:55:32 compute-1 sudo[205663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhudcnujpyquymjgdsvezezrcjgzhuiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254932.1968002-219-37202607869261/AnsiballZ_stat.py'
Sep 30 17:55:32 compute-1 sudo[205663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:32 compute-1 python3.9[205665]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:55:32 compute-1 sudo[205663]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:32 compute-1 sshd-session[204930]: Received disconnect from 14.225.167.110 port 51496:11: Bye Bye [preauth]
Sep 30 17:55:32 compute-1 sshd-session[204930]: Disconnected from authenticating user root 14.225.167.110 port 51496 [preauth]
Sep 30 17:55:32 compute-1 ceph-mon[75484]: pgmap v493: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:33 compute-1 sshd-session[204905]: Failed password for root from 192.210.160.141 port 39466 ssh2
Sep 30 17:55:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:55:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:33 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac001fb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:33 compute-1 sudo[205787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dycmwunvhcmcqzubbvylxacqaogovzhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254932.1968002-219-37202607869261/AnsiballZ_copy.py'
Sep 30 17:55:33 compute-1 sudo[205787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:33 compute-1 python3.9[205789]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254932.1968002-219-37202607869261/.source.iscsi _original_basename=.c4e73giz follow=False checksum=1c9b84974c322224b0cd72c71eb9a20078fe609a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:55:33 compute-1 sudo[205787]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:33 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d000a6a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:33.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:33.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:33 compute-1 sshd-session[204905]: Connection closed by authenticating user root 192.210.160.141 port 39466 [preauth]
Sep 30 17:55:33 compute-1 sudo[205939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfvykbcdiclzrgheabmdmpaqwpzwzbvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254933.5959184-249-176281909779884/AnsiballZ_file.py'
Sep 30 17:55:33 compute-1 sudo[205939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:34 compute-1 python3.9[205941]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:55:34 compute-1 sudo[205939]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:34 compute-1 sshd-session[205035]: Received disconnect from 216.10.242.161 port 57190:11: Bye Bye [preauth]
Sep 30 17:55:34 compute-1 sshd-session[205035]: Disconnected from invalid user jerry 216.10.242.161 port 57190 [preauth]
Sep 30 17:55:34 compute-1 python3.9[206093]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:55:34 compute-1 ceph-mon[75484]: pgmap v494: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:55:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:35 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c4002340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:35 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c8004970 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:35 compute-1 sudo[206245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrepdsmuofhfpgcidvqtbwunztzxxnzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254935.080921-283-20487115660062/AnsiballZ_lineinfile.py'
Sep 30 17:55:35 compute-1 sudo[206245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:35 compute-1 python3.9[206247]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:55:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:35.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:35 compute-1 sudo[206245]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:35.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:36 compute-1 sudo[206398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neecgftfjgbhmuwtwmtrrxotghaujrmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254936.0717294-301-184868225175990/AnsiballZ_file.py'
Sep 30 17:55:36 compute-1 sudo[206398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:36 compute-1 python3.9[206400]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:55:36 compute-1 sudo[206398]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:36 compute-1 ceph-mon[75484]: pgmap v495: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:37 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac001fb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:37 compute-1 sudo[206551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jorwrrssyupofzazltlhrqyjqtorlxuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254936.904987-317-225993760782656/AnsiballZ_stat.py'
Sep 30 17:55:37 compute-1 sudo[206551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:37 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57d000a6a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:37 compute-1 python3.9[206553]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:55:37 compute-1 sudo[206551]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:37 compute-1 sudo[206629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlqekbzaebpoljdfvstepqckhyvsqhqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254936.904987-317-225993760782656/AnsiballZ_file.py'
Sep 30 17:55:37 compute-1 sudo[206629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 17:55:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:37.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 17:55:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:37.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:37 compute-1 python3.9[206631]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:55:38 compute-1 sudo[206629]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.024919) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254938024973, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 679, "num_deletes": 251, "total_data_size": 1316600, "memory_usage": 1338240, "flush_reason": "Manual Compaction"}
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254938031179, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 866515, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17619, "largest_seqno": 18293, "table_properties": {"data_size": 863146, "index_size": 1277, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7608, "raw_average_key_size": 19, "raw_value_size": 856457, "raw_average_value_size": 2151, "num_data_blocks": 57, "num_entries": 398, "num_filter_entries": 398, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759254890, "oldest_key_time": 1759254890, "file_creation_time": 1759254938, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 6294 microseconds, and 3622 cpu microseconds.
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.031222) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 866515 bytes OK
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.031241) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.033212) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.033230) EVENT_LOG_v1 {"time_micros": 1759254938033225, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.033250) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1312904, prev total WAL file size 1312904, number of live WAL files 2.
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.033945) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(846KB)], [30(10MB)]
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254938034041, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 12169475, "oldest_snapshot_seqno": -1}
Sep 30 17:55:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4612 keys, 10124734 bytes, temperature: kUnknown
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254938081335, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 10124734, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10093670, "index_size": 18410, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11589, "raw_key_size": 116206, "raw_average_key_size": 25, "raw_value_size": 10009666, "raw_average_value_size": 2170, "num_data_blocks": 773, "num_entries": 4612, "num_filter_entries": 4612, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759254938, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.081586) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 10124734 bytes
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.083182) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 256.9 rd, 213.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 10.8 +0.0 blob) out(9.7 +0.0 blob), read-write-amplify(25.7) write-amplify(11.7) OK, records in: 5126, records dropped: 514 output_compression: NoCompression
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.083204) EVENT_LOG_v1 {"time_micros": 1759254938083193, "job": 16, "event": "compaction_finished", "compaction_time_micros": 47379, "compaction_time_cpu_micros": 18641, "output_level": 6, "num_output_files": 1, "total_output_size": 10124734, "num_input_records": 5126, "num_output_records": 4612, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254938083484, "job": 16, "event": "table_file_deletion", "file_number": 32}
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254938085784, "job": 16, "event": "table_file_deletion", "file_number": 30}
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.033851) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.085922) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.085931) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.085933) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.085937) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:55:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:55:38.085940) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:55:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:38 compute-1 sudo[206782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkljevevslrhilejfyrrxmspqksqkfmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254938.1858797-317-92920504728450/AnsiballZ_stat.py'
Sep 30 17:55:38 compute-1 sudo[206782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:38 compute-1 python3.9[206784]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:55:38 compute-1 sudo[206782]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:39 compute-1 sudo[206861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atovlmsgwbyuwqqvkzhhowinzlljkodg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254938.1858797-317-92920504728450/AnsiballZ_file.py'
Sep 30 17:55:39 compute-1 sudo[206861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:39 compute-1 ceph-mon[75484]: pgmap v496: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:39 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c40037d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:39 compute-1 python3.9[206863]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:55:39 compute-1 sudo[206861]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:39 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c8004970 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:39.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:39 compute-1 sudo[207013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkyngdraxpkktlwhdsfnpncsnlzolvzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254939.442437-363-129892391885314/AnsiballZ_file.py'
Sep 30 17:55:39 compute-1 sudo[207013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:39.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:39 compute-1 python3.9[207015]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:55:40 compute-1 sudo[207013]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:40 compute-1 ceph-mon[75484]: pgmap v497: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:55:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:40 compute-1 sudo[207166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krsoqzvihdsxvwuqynxaohlfgxeotagf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254940.2088456-379-205860074212626/AnsiballZ_stat.py'
Sep 30 17:55:40 compute-1 sudo[207166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:40 compute-1 python3.9[207168]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:55:40 compute-1 sudo[207166]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:41 compute-1 sudo[207245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcivaahytgdlexavmployuvgqntdmnzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254940.2088456-379-205860074212626/AnsiballZ_file.py'
Sep 30 17:55:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:41 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac003260 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:41 compute-1 sudo[207245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:41 compute-1 python3.9[207247]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:55:41 compute-1 sudo[207245]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:41 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac003260 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:41 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 17:55:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:41.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:41.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:41 compute-1 sudo[207397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyvdigyqccmkyprarowrialadinyhrmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254941.5648086-403-243869014289755/AnsiballZ_stat.py'
Sep 30 17:55:41 compute-1 sudo[207397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:42 compute-1 python3.9[207400]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:55:42 compute-1 sudo[207397]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:42 compute-1 sudo[207476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdhyamvltuwuqzglddgpbdngehkhgcmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254941.5648086-403-243869014289755/AnsiballZ_file.py'
Sep 30 17:55:42 compute-1 sudo[207476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:42 compute-1 ceph-mon[75484]: pgmap v498: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:42 compute-1 python3.9[207478]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:55:42 compute-1 sudo[207476]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:55:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:43 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c40037d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:43 compute-1 sudo[207629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nztlasztvdmdxkzcfoafdfdwohdaywaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254942.950901-427-176539739117107/AnsiballZ_systemd.py'
Sep 30 17:55:43 compute-1 sudo[207629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:43 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c8004970 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:43 compute-1 python3.9[207631]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:55:43 compute-1 systemd[1]: Reloading.
Sep 30 17:55:43 compute-1 systemd-rc-local-generator[207653]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:55:43 compute-1 systemd-sysv-generator[207659]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:55:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:43.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:43.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:44 compute-1 sudo[207629]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:44 compute-1 sudo[207669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:55:44 compute-1 sudo[207669]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:55:44 compute-1 sudo[207669]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:44 compute-1 sudo[207843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pccamvhxdledrnjtlurntmxvumykgjcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254944.24667-443-17442514967461/AnsiballZ_stat.py'
Sep 30 17:55:44 compute-1 sudo[207843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:44 compute-1 ceph-mon[75484]: pgmap v499: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:55:44 compute-1 python3.9[207845]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:55:44 compute-1 sudo[207843]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:45 compute-1 sudo[207922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcarwusiyxtcxmhyxdhnankxutnnaqgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254944.24667-443-17442514967461/AnsiballZ_file.py'
Sep 30 17:55:45 compute-1 sudo[207922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:45 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac003260 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:45 compute-1 python3.9[207924]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:55:45 compute-1 sudo[207922]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:45 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac003260 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:45.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:45 compute-1 sudo[208074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utwqowgsnethtqyulxdonlsryyolpeex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254945.5078251-467-234032686566499/AnsiballZ_stat.py'
Sep 30 17:55:45 compute-1 sudo[208074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:45.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:46 compute-1 python3.9[208076]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:55:46 compute-1 sudo[208074]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:46 compute-1 sudo[208153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msfiktjramtagshrfnfkvamlpmuynahh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254945.5078251-467-234032686566499/AnsiballZ_file.py'
Sep 30 17:55:46 compute-1 sudo[208153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:46 compute-1 python3.9[208155]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:55:46 compute-1 sudo[208153]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:46 compute-1 ceph-mon[75484]: pgmap v500: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:47 compute-1 sudo[208306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmbdknchakzxyssexgfrfkpmsylrjydk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254946.7484858-492-118721468943222/AnsiballZ_systemd.py'
Sep 30 17:55:47 compute-1 sudo[208306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:47 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c40037d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:47 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c8004970 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:47 compute-1 python3.9[208308]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:55:47 compute-1 systemd[1]: Reloading.
Sep 30 17:55:47 compute-1 systemd-rc-local-generator[208333]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:55:47 compute-1 systemd-sysv-generator[208339]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:55:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:47.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:47 compute-1 systemd[1]: Starting Create netns directory...
Sep 30 17:55:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:47.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:47 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 17:55:47 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 17:55:47 compute-1 systemd[1]: Finished Create netns directory.
Sep 30 17:55:47 compute-1 podman[208346]: 2025-09-30 17:55:47.966695189 +0000 UTC m=+0.112753245 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 17:55:47 compute-1 sudo[208306]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:55:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:48 compute-1 sudo[208527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwvaojhaeuldkuqwhjizdzvuiccsndkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254948.289564-511-228885376452517/AnsiballZ_file.py'
Sep 30 17:55:48 compute-1 sudo[208527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:48 compute-1 python3.9[208530]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:55:48 compute-1 sudo[208527]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:48 compute-1 ceph-mon[75484]: pgmap v501: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:49 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a4003040 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:49 compute-1 sudo[208681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujnbxaemmfkabrqjziotsymcwvjfxzmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254949.0106719-527-274834370749770/AnsiballZ_stat.py'
Sep 30 17:55:49 compute-1 sudo[208681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:49 compute-1 python3.9[208683]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:55:49 compute-1 sudo[208681]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:49 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac003260 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.003000081s ======
Sep 30 17:55:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:49.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000081s
Sep 30 17:55:49 compute-1 sudo[208804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkoixxvnkvwpuwlkumqkpcioaztzxyad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254949.0106719-527-274834370749770/AnsiballZ_copy.py'
Sep 30 17:55:49 compute-1 sudo[208804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:49.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:50 compute-1 python3.9[208806]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759254949.0106719-527-274834370749770/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:55:50 compute-1 sudo[208804]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:50 compute-1 sudo[208958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peptrhhfyrtbsyrwmxhfguwtxuavxcot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254950.5013213-561-140480188847924/AnsiballZ_file.py'
Sep 30 17:55:50 compute-1 sudo[208958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:50 compute-1 ceph-mon[75484]: pgmap v502: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:55:51 compute-1 python3.9[208960]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:55:51 compute-1 sudo[208958]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:51 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a0002830 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:51 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c8004970 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:51 compute-1 sudo[209110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmsjeuobeknjknacgymekcjnhgczazik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254951.2946675-577-161090866714848/AnsiballZ_stat.py'
Sep 30 17:55:51 compute-1 sudo[209110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:51.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:51 compute-1 python3.9[209112]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:55:51 compute-1 sudo[209110]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:51.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:52 compute-1 ceph-mon[75484]: pgmap v503: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:52 compute-1 sudo[209235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shnnqxyomhaasixrdysivkwoeyihzmyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254951.2946675-577-161090866714848/AnsiballZ_copy.py'
Sep 30 17:55:52 compute-1 sudo[209235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:52 compute-1 python3.9[209237]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254951.2946675-577-161090866714848/.source.json _original_basename=.xgpjd9nh follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:55:52 compute-1 sudo[209235]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:52 compute-1 sudo[209389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqjtebdenncesewqihuxovftlzfqdejw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254952.6135993-607-119668513036497/AnsiballZ_file.py'
Sep 30 17:55:52 compute-1 sudo[209389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:55:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:53 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a4003040 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:53 compute-1 python3.9[209391]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:55:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:55:53 compute-1 sudo[209389]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:53 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac004360 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:53 compute-1 sudo[209541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctwtlciyoorvpzdfltqgrohpycwvwbut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254953.4025908-623-194824212105732/AnsiballZ_stat.py'
Sep 30 17:55:53 compute-1 sudo[209541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:53.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:53 compute-1 sudo[209541]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:53.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:54 compute-1 unix_chkpwd[209571]: password check failed for user (root)
Sep 30 17:55:54 compute-1 sshd-session[209132]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:55:54 compute-1 ceph-mon[75484]: pgmap v504: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:55:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:55:54.296 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 17:55:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:55:54.297 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 17:55:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:55:54.297 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 17:55:54 compute-1 sudo[209667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yovvludlczvbrxpjdxpzltnbslitugsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254953.4025908-623-194824212105732/AnsiballZ_copy.py'
Sep 30 17:55:54 compute-1 sudo[209667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:54 compute-1 sudo[209667]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:55 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a0002830 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:55 compute-1 sudo[209778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:55:55 compute-1 sudo[209778]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:55:55 compute-1 sudo[209778]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:55 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c8004970 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:55 compute-1 sudo[209858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqlobwssfbelmlstpjztzjxsylpqmzuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254954.9754813-657-166882483136958/AnsiballZ_container_config_data.py'
Sep 30 17:55:55 compute-1 sudo[209858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:55 compute-1 sudo[209828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:55:55 compute-1 sudo[209828]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:55:55 compute-1 python3.9[209870]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Sep 30 17:55:55 compute-1 sudo[209858]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:55.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:55.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:55:56 compute-1 sudo[209828]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:56 compute-1 sshd-session[209132]: Failed password for root from 192.210.160.141 port 43388 ssh2
Sep 30 17:55:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:56 compute-1 sudo[210059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jppkinrmhwnyklxwpluckvblaekuqcvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254955.9839637-675-112922258986235/AnsiballZ_container_config_hash.py'
Sep 30 17:55:56 compute-1 sudo[210059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:56 compute-1 ceph-mon[75484]: pgmap v505: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:55:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:55:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:55:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:55:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:55:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:55:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:55:56 compute-1 python3.9[210061]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 17:55:56 compute-1 sudo[210059]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:56 compute-1 sshd-session[209974]: Invalid user server from 84.51.43.58 port 41239
Sep 30 17:55:56 compute-1 sshd-session[209974]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:55:56 compute-1 sshd-session[209974]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 17:55:57 compute-1 sshd-session[209132]: Connection closed by authenticating user root 192.210.160.141 port 43388 [preauth]
Sep 30 17:55:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:57 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a4003040 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:57 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac004360 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:57 compute-1 sudo[210212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlinnxfvhtkscmhcuipjhwarfgaqwvou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254956.9916172-693-23025491222765/AnsiballZ_podman_container_info.py'
Sep 30 17:55:57 compute-1 sudo[210212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:57 compute-1 python3.9[210214]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 17:55:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:57.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:57 compute-1 sudo[210212]: pam_unix(sudo:session): session closed for user root
Sep 30 17:55:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:57.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:58 compute-1 unix_chkpwd[210266]: password check failed for user (root)
Sep 30 17:55:58 compute-1 sshd-session[209881]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=113.249.93.94  user=root
Sep 30 17:55:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:55:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:58 compute-1 podman[210267]: 2025-09-30 17:55:58.559735347 +0000 UTC m=+0.094750777 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Sep 30 17:55:58 compute-1 sshd-session[209974]: Failed password for invalid user server from 84.51.43.58 port 41239 ssh2
Sep 30 17:55:58 compute-1 ceph-mon[75484]: pgmap v506: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:55:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:59 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a0002830 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:59 compute-1 sudo[210413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbhbryxxccchrzetfvmabtptnyvwjqgc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759254958.820474-719-208439388863533/AnsiballZ_edpm_container_manage.py'
Sep 30 17:55:59 compute-1 sudo[210413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:55:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:55:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:55:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:55:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:55:59 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c8004970 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:55:59 compute-1 sshd-session[209974]: Received disconnect from 84.51.43.58 port 41239:11: Bye Bye [preauth]
Sep 30 17:55:59 compute-1 sshd-session[209974]: Disconnected from invalid user server 84.51.43.58 port 41239 [preauth]
Sep 30 17:55:59 compute-1 python3[210415]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 17:55:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:55:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:55:59.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:55:59 compute-1 podman[210452]: 2025-09-30 17:55:59.868238597 +0000 UTC m=+0.071029945 container create 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.license=GPLv2, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 17:55:59 compute-1 podman[210452]: 2025-09-30 17:55:59.824175473 +0000 UTC m=+0.026966881 image pull f8ff303843ab104c2f5f56920f311c0b22efd49dc54152d8e2ede3a7218e9091 38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Sep 30 17:55:59 compute-1 python3[210415]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z 38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Sep 30 17:55:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:55:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:55:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:55:59.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:00 compute-1 sudo[210413]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:00 compute-1 sshd-session[209881]: Failed password for root from 113.249.93.94 port 9174 ssh2
Sep 30 17:56:00 compute-1 ceph-mon[75484]: pgmap v507: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:56:01 compute-1 sudo[210642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aobwbgmqzxbolhvfjucijqiolunchjsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254960.6740417-735-250653541639220/AnsiballZ_stat.py'
Sep 30 17:56:01 compute-1 sudo[210642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:01 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a4003040 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:01 compute-1 python3.9[210644]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:56:01 compute-1 sudo[210642]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:01 compute-1 sshd[170789]: Timeout before authentication for connection from 110.42.70.108 to 38.102.83.102, pid = 191328
Sep 30 17:56:01 compute-1 sshd-session[210645]: Invalid user flavia from 107.172.146.104 port 41118
Sep 30 17:56:01 compute-1 sshd-session[210645]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:56:01 compute-1 sshd-session[210645]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 17:56:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:01 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a4003040 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:01 compute-1 sshd-session[209881]: Received disconnect from 113.249.93.94 port 9174:11: Bye Bye [preauth]
Sep 30 17:56:01 compute-1 sshd-session[209881]: Disconnected from authenticating user root 113.249.93.94 port 9174 [preauth]
Sep 30 17:56:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:01.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:01 compute-1 sudo[210800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehmlkssjtjxchdeqdjzydgigrpiqmkdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254961.526753-753-150720058120664/AnsiballZ_file.py'
Sep 30 17:56:01 compute-1 sudo[210800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:01.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:02 compute-1 python3.9[210802]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:02 compute-1 sudo[210800]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:02 compute-1 sudo[210877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niwfvfmuyfocqhmyupvokacjfjkvxsoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254961.526753-753-150720058120664/AnsiballZ_stat.py'
Sep 30 17:56:02 compute-1 sudo[210877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:02 compute-1 unix_chkpwd[210880]: password check failed for user (root)
Sep 30 17:56:02 compute-1 sshd-session[210673]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 17:56:02 compute-1 python3.9[210879]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:56:02 compute-1 sudo[210877]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:02 compute-1 sudo[210909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:56:02 compute-1 sudo[210909]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:56:02 compute-1 sudo[210909]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:02 compute-1 ceph-mon[75484]: pgmap v508: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:56:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:56:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:56:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:56:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:03 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a0002830 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:03 compute-1 sshd-session[210645]: Failed password for invalid user flavia from 107.172.146.104 port 41118 ssh2
Sep 30 17:56:03 compute-1 sudo[211055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpmncqoqjulfitqbzpnkdqfmbheoozcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254962.6966853-753-43247676665405/AnsiballZ_copy.py'
Sep 30 17:56:03 compute-1 sudo[211055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:03 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c8004990 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:03 compute-1 python3.9[211057]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759254962.6966853-753-43247676665405/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:03 compute-1 sudo[211055]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:03.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:03 compute-1 sudo[211131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slkdslcppglaaznidcdvwuwrrokosbbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254962.6966853-753-43247676665405/AnsiballZ_systemd.py'
Sep 30 17:56:03 compute-1 sudo[211131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:03.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:04 compute-1 sudo[211135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:56:04 compute-1 sudo[211135]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:56:04 compute-1 sudo[211135]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:04 compute-1 python3.9[211133]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 17:56:04 compute-1 systemd[1]: Reloading.
Sep 30 17:56:04 compute-1 systemd-rc-local-generator[211186]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:56:04 compute-1 systemd-sysv-generator[211189]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:56:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:04 compute-1 sshd-session[210645]: Received disconnect from 107.172.146.104 port 41118:11: Bye Bye [preauth]
Sep 30 17:56:04 compute-1 sshd-session[210645]: Disconnected from invalid user flavia 107.172.146.104 port 41118 [preauth]
Sep 30 17:56:04 compute-1 sudo[211131]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:04 compute-1 sshd-session[210673]: Failed password for root from 175.126.165.170 port 41040 ssh2
Sep 30 17:56:04 compute-1 sudo[211268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuokdmodxlqqcayeuweratbmmyibiikn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254962.6966853-753-43247676665405/AnsiballZ_systemd.py'
Sep 30 17:56:04 compute-1 sudo[211268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:04 compute-1 ceph-mon[75484]: pgmap v509: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:56:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:05 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c8004990 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:05 compute-1 python3.9[211270]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:56:05 compute-1 systemd[1]: Reloading.
Sep 30 17:56:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:05 compute-1 sshd-session[210673]: Received disconnect from 175.126.165.170 port 41040:11: Bye Bye [preauth]
Sep 30 17:56:05 compute-1 sshd-session[210673]: Disconnected from authenticating user root 175.126.165.170 port 41040 [preauth]
Sep 30 17:56:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:05 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c8004990 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:05 compute-1 systemd-sysv-generator[211306]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:56:05 compute-1 systemd-rc-local-generator[211302]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:56:05 compute-1 systemd[1]: Starting iscsid container...
Sep 30 17:56:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:05.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:05 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:56:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:05.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89da0d19e73015d5b887bc004b8a7bee13a4af6c8ce8e2368adf2bd352c86e5c/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 17:56:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89da0d19e73015d5b887bc004b8a7bee13a4af6c8ce8e2368adf2bd352c86e5c/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Sep 30 17:56:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89da0d19e73015d5b887bc004b8a7bee13a4af6c8ce8e2368adf2bd352c86e5c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 17:56:05 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff.
Sep 30 17:56:05 compute-1 podman[211311]: 2025-09-30 17:56:05.989462169 +0000 UTC m=+0.153548410 container init 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible)
Sep 30 17:56:06 compute-1 ceph-mon[75484]: pgmap v510: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:56:06 compute-1 iscsid[211328]: + sudo -E kolla_set_configs
Sep 30 17:56:06 compute-1 sudo[211335]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 17:56:06 compute-1 podman[211311]: 2025-09-30 17:56:06.033798479 +0000 UTC m=+0.197884730 container start 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4)
Sep 30 17:56:06 compute-1 podman[211311]: iscsid
Sep 30 17:56:06 compute-1 systemd[1]: Started iscsid container.
Sep 30 17:56:06 compute-1 systemd[1]: Created slice User Slice of UID 0.
Sep 30 17:56:06 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Sep 30 17:56:06 compute-1 sudo[211268]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:06 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Sep 30 17:56:06 compute-1 podman[211337]: 2025-09-30 17:56:06.117534147 +0000 UTC m=+0.072235217 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 17:56:06 compute-1 systemd[1]: Starting User Manager for UID 0...
Sep 30 17:56:06 compute-1 systemd[1]: 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff-360ca160a7859e9d.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 17:56:06 compute-1 systemd[1]: 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff-360ca160a7859e9d.service: Failed with result 'exit-code'.
Sep 30 17:56:06 compute-1 systemd[211359]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Sep 30 17:56:06 compute-1 systemd[211359]: Queued start job for default target Main User Target.
Sep 30 17:56:06 compute-1 systemd[211359]: Created slice User Application Slice.
Sep 30 17:56:06 compute-1 systemd[211359]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Sep 30 17:56:06 compute-1 systemd[211359]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 17:56:06 compute-1 systemd[211359]: Reached target Paths.
Sep 30 17:56:06 compute-1 systemd[211359]: Reached target Timers.
Sep 30 17:56:06 compute-1 systemd[211359]: Starting D-Bus User Message Bus Socket...
Sep 30 17:56:06 compute-1 systemd[211359]: Starting Create User's Volatile Files and Directories...
Sep 30 17:56:06 compute-1 systemd[211359]: Listening on D-Bus User Message Bus Socket.
Sep 30 17:56:06 compute-1 systemd[211359]: Reached target Sockets.
Sep 30 17:56:06 compute-1 systemd[211359]: Finished Create User's Volatile Files and Directories.
Sep 30 17:56:06 compute-1 systemd[211359]: Reached target Basic System.
Sep 30 17:56:06 compute-1 systemd[211359]: Reached target Main User Target.
Sep 30 17:56:06 compute-1 systemd[211359]: Startup finished in 191ms.
Sep 30 17:56:06 compute-1 systemd[1]: Started User Manager for UID 0.
Sep 30 17:56:06 compute-1 systemd[1]: Started Session c3 of User root.
Sep 30 17:56:06 compute-1 sudo[211335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 17:56:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:06 compute-1 iscsid[211328]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 17:56:06 compute-1 iscsid[211328]: INFO:__main__:Validating config file
Sep 30 17:56:06 compute-1 iscsid[211328]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 17:56:06 compute-1 iscsid[211328]: INFO:__main__:Writing out command to execute
Sep 30 17:56:06 compute-1 sudo[211335]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:06 compute-1 iscsid[211328]: ++ cat /run_command
Sep 30 17:56:06 compute-1 systemd[1]: session-c3.scope: Deactivated successfully.
Sep 30 17:56:06 compute-1 iscsid[211328]: + CMD='/usr/sbin/iscsid -f'
Sep 30 17:56:06 compute-1 iscsid[211328]: + ARGS=
Sep 30 17:56:06 compute-1 iscsid[211328]: + sudo kolla_copy_cacerts
Sep 30 17:56:06 compute-1 sudo[211460]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 17:56:06 compute-1 systemd[1]: Started Session c4 of User root.
Sep 30 17:56:06 compute-1 sudo[211460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 17:56:06 compute-1 sudo[211460]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:06 compute-1 iscsid[211328]: + [[ ! -n '' ]]
Sep 30 17:56:06 compute-1 iscsid[211328]: + . kolla_extend_start
Sep 30 17:56:06 compute-1 iscsid[211328]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Sep 30 17:56:06 compute-1 iscsid[211328]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Sep 30 17:56:06 compute-1 iscsid[211328]: Running command: '/usr/sbin/iscsid -f'
Sep 30 17:56:06 compute-1 iscsid[211328]: + umask 0022
Sep 30 17:56:06 compute-1 iscsid[211328]: + exec /usr/sbin/iscsid -f
Sep 30 17:56:06 compute-1 systemd[1]: session-c4.scope: Deactivated successfully.
Sep 30 17:56:06 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Sep 30 17:56:06 compute-1 python3.9[211536]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:56:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:07 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a0002830 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:56:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:07 compute-1 sudo[211686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjlybzpxmrzozqydylqsjpffpygkrxni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254967.0601985-827-183349913595407/AnsiballZ_file.py'
Sep 30 17:56:07 compute-1 sudo[211686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:07 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac004360 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:07 compute-1 python3.9[211688]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:07 compute-1 sudo[211686]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:07.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:07.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:56:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:08 compute-1 ceph-mon[75484]: pgmap v511: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:56:08 compute-1 sudo[211839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csizwnixuigwidqixnpqvqhyrepsviau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254968.1997557-849-8571155319111/AnsiballZ_service_facts.py'
Sep 30 17:56:08 compute-1 sudo[211839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:08 compute-1 python3.9[211841]: ansible-ansible.builtin.service_facts Invoked
Sep 30 17:56:08 compute-1 network[211859]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 17:56:08 compute-1 network[211860]: 'network-scripts' will be removed from distribution in near future.
Sep 30 17:56:08 compute-1 network[211861]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 17:56:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:09 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac004360 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:09 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac004360 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:09.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:09.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:10 compute-1 sshd-session[211867]: Invalid user li from 194.107.115.65 port 52106
Sep 30 17:56:10 compute-1 sshd-session[211867]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:56:10 compute-1 sshd-session[211867]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 17:56:10 compute-1 ceph-mon[75484]: pgmap v512: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:56:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:11 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a00036c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:11 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac004360 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:11.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:11.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:12 compute-1 ceph-mon[75484]: pgmap v513: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:56:12 compute-1 sshd-session[211867]: Failed password for invalid user li from 194.107.115.65 port 52106 ssh2
Sep 30 17:56:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:56:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a4003040 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:13 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c8004990 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:13 compute-1 sshd-session[211867]: Received disconnect from 194.107.115.65 port 52106:11: Bye Bye [preauth]
Sep 30 17:56:13 compute-1 sshd-session[211867]: Disconnected from invalid user li 194.107.115.65 port 52106 [preauth]
Sep 30 17:56:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:13.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:13 compute-1 sudo[211839]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:13.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:14 compute-1 sudo[212142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdkruqfcooeugzogcivdeerysmduwdko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254974.4317935-869-127542731283708/AnsiballZ_file.py'
Sep 30 17:56:14 compute-1 sudo[212142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:14 compute-1 ceph-mon[75484]: pgmap v514: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:56:14 compute-1 python3.9[212144]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 17:56:14 compute-1 sudo[212142]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:15 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a00036c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:15 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57ac004360 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:15 compute-1 sudo[212294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnuvdyufnsllihcvtegifqljregdbdrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254975.1814964-885-96070565269155/AnsiballZ_modprobe.py'
Sep 30 17:56:15 compute-1 sudo[212294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:15.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:15 compute-1 python3.9[212296]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Sep 30 17:56:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:15.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:15 compute-1 sudo[212294]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:16 compute-1 sudo[212453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzaihilqwrnchjoyxsqrfwezeaqwwzmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254976.1838212-901-261873631033629/AnsiballZ_stat.py'
Sep 30 17:56:16 compute-1 sudo[212453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:16 compute-1 systemd[1]: Stopping User Manager for UID 0...
Sep 30 17:56:16 compute-1 systemd[211359]: Activating special unit Exit the Session...
Sep 30 17:56:16 compute-1 systemd[211359]: Stopped target Main User Target.
Sep 30 17:56:16 compute-1 systemd[211359]: Stopped target Basic System.
Sep 30 17:56:16 compute-1 systemd[211359]: Stopped target Paths.
Sep 30 17:56:16 compute-1 systemd[211359]: Stopped target Sockets.
Sep 30 17:56:16 compute-1 systemd[211359]: Stopped target Timers.
Sep 30 17:56:16 compute-1 systemd[211359]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 17:56:16 compute-1 systemd[211359]: Closed D-Bus User Message Bus Socket.
Sep 30 17:56:16 compute-1 systemd[211359]: Stopped Create User's Volatile Files and Directories.
Sep 30 17:56:16 compute-1 systemd[211359]: Removed slice User Application Slice.
Sep 30 17:56:16 compute-1 systemd[211359]: Reached target Shutdown.
Sep 30 17:56:16 compute-1 systemd[211359]: Finished Exit the Session.
Sep 30 17:56:16 compute-1 systemd[211359]: Reached target Exit the Session.
Sep 30 17:56:16 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Sep 30 17:56:16 compute-1 systemd[1]: Stopped User Manager for UID 0.
Sep 30 17:56:16 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Sep 30 17:56:16 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Sep 30 17:56:16 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Sep 30 17:56:16 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Sep 30 17:56:16 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Sep 30 17:56:16 compute-1 python3.9[212455]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:56:16 compute-1 ceph-mon[75484]: pgmap v515: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:56:16 compute-1 sudo[212453]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:17 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a4003040 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:17 compute-1 sudo[212578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zshoeaobanlkxaejihnpacjhiplbamho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254976.1838212-901-261873631033629/AnsiballZ_copy.py'
Sep 30 17:56:17 compute-1 sudo[212578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:17 compute-1 python3.9[212580]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254976.1838212-901-261873631033629/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:17 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57c8004990 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:17 compute-1 sudo[212578]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:17 compute-1 sshd-session[212297]: Invalid user rust from 192.210.160.141 port 58392
Sep 30 17:56:17 compute-1 sshd-session[212297]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:56:17 compute-1 sshd-session[212297]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 17:56:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:17.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:17.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:56:18 compute-1 sudo[212741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzigtazaxcrixdaxozqupykkowxqehak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254977.765485-933-28124591966450/AnsiballZ_lineinfile.py'
Sep 30 17:56:18 compute-1 sudo[212741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:18 compute-1 podman[212705]: 2025-09-30 17:56:18.24815114 +0000 UTC m=+0.152850205 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 17:56:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:18 compute-1 python3.9[212749]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:18 compute-1 sudo[212741]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:18 compute-1 ceph-mon[75484]: pgmap v516: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:56:19 compute-1 sudo[212907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taabnfqldrhgoibnupidtnmplhboavdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254978.6800206-949-64388307850098/AnsiballZ_systemd.py'
Sep 30 17:56:19 compute-1 sudo[212907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:19 compute-1 kernel: ganesha.nfsd[208611]: segfault at 50 ip 00007f587efc132e sp 00007f583effc210 error 4 in libntirpc.so.5.8[7f587efa6000+2c000] likely on CPU 1 (core 0, socket 1)
Sep 30 17:56:19 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 17:56:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[191496]: 30/09/2025 17:56:19 : epoch 68dc1938 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57a00036c0 fd 47 proxy ignored for local
Sep 30 17:56:19 compute-1 systemd[1]: Started Process Core Dump (PID 212910/UID 0).
Sep 30 17:56:19 compute-1 python3.9[212909]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:56:19 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 30 17:56:19 compute-1 systemd[1]: Stopped Load Kernel Modules.
Sep 30 17:56:19 compute-1 systemd[1]: Stopping Load Kernel Modules...
Sep 30 17:56:19 compute-1 systemd[1]: Starting Load Kernel Modules...
Sep 30 17:56:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:19 compute-1 systemd[1]: Finished Load Kernel Modules.
Sep 30 17:56:19 compute-1 sudo[212907]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:19.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:19.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:20 compute-1 sudo[213066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjcejgdidvvvwkkngwrpiduyflymtwbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254979.6509442-965-202036138551692/AnsiballZ_file.py'
Sep 30 17:56:20 compute-1 sudo[213066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:20 compute-1 sshd-session[212297]: Failed password for invalid user rust from 192.210.160.141 port 58392 ssh2
Sep 30 17:56:20 compute-1 python3.9[213068]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:56:20 compute-1 sudo[213066]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:20 compute-1 systemd-coredump[212911]: Process 191523 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 61:
                                                    #0  0x00007f587efc132e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Sep 30 17:56:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175620 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:56:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:20 compute-1 systemd[1]: systemd-coredump@7-212910-0.service: Deactivated successfully.
Sep 30 17:56:20 compute-1 systemd[1]: systemd-coredump@7-212910-0.service: Consumed 1.196s CPU time.
Sep 30 17:56:20 compute-1 podman[213106]: 2025-09-30 17:56:20.587650819 +0000 UTC m=+0.052295448 container died 8f035cb777f1dfa05376f5fbe22850cd2e594eb7df94147695e50142fcfd140e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 17:56:20 compute-1 systemd[1]: var-lib-containers-storage-overlay-75d59e3bcfc7f85ccd733016d271858d475cdb0d7b514acc93a815c10ac35de8-merged.mount: Deactivated successfully.
Sep 30 17:56:20 compute-1 podman[213106]: 2025-09-30 17:56:20.651082006 +0000 UTC m=+0.115726595 container remove 8f035cb777f1dfa05376f5fbe22850cd2e594eb7df94147695e50142fcfd140e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1)
Sep 30 17:56:20 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 17:56:20 compute-1 ceph-mon[75484]: pgmap v517: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:56:20 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 17:56:20 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.819s CPU time.
Sep 30 17:56:20 compute-1 sudo[213268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkxdsqkmkfupuhfrdappfbbafrgpxoad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254980.5500045-983-99241774902162/AnsiballZ_stat.py'
Sep 30 17:56:20 compute-1 sudo[213268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:21 compute-1 python3.9[213270]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:56:21 compute-1 sudo[213268]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:21 compute-1 sshd-session[212297]: Connection closed by invalid user rust 192.210.160.141 port 58392 [preauth]
Sep 30 17:56:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:21 compute-1 sshd-session[213094]: Invalid user ubnt from 194.0.234.19 port 31980
Sep 30 17:56:21 compute-1 sshd-session[213094]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:56:21 compute-1 sshd-session[213094]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.0.234.19
Sep 30 17:56:21 compute-1 sudo[213420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diidprsfgvnkjpxeetdaxksjgtvulshm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254981.3869915-1001-279999265280552/AnsiballZ_stat.py'
Sep 30 17:56:21 compute-1 sudo[213420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:21.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:21 compute-1 python3.9[213422]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:56:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:21.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:21 compute-1 sudo[213420]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:22 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Sep 30 17:56:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:22 compute-1 sudo[213576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yesvukpepwuhuekotgboknjshretguuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254982.2152147-1017-279992709032511/AnsiballZ_stat.py'
Sep 30 17:56:22 compute-1 sudo[213576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:22 compute-1 python3.9[213578]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:56:22 compute-1 sudo[213576]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:22 compute-1 ceph-mon[75484]: pgmap v518: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:56:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:56:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:56:23 compute-1 sudo[213700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziquwvleimdvrmmhmuutzpmaxkmnzkxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254982.2152147-1017-279992709032511/AnsiballZ_copy.py'
Sep 30 17:56:23 compute-1 sudo[213700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:23 compute-1 python3.9[213702]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759254982.2152147-1017-279992709032511/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:23 compute-1 sudo[213700]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:23 compute-1 sshd-session[213451]: Invalid user aman from 103.153.190.105 port 45370
Sep 30 17:56:23 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 17:56:23 compute-1 sshd-session[213451]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:56:23 compute-1 sshd-session[213451]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 17:56:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:23.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:23 compute-1 sshd-session[213094]: Failed password for invalid user ubnt from 194.0.234.19 port 31980 ssh2
Sep 30 17:56:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:23.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:24 compute-1 sudo[213874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmdylczznvtuajuytedgolqwlbxphiuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254983.8249614-1047-83577935101798/AnsiballZ_command.py'
Sep 30 17:56:24 compute-1 sudo[213874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:24 compute-1 sudo[213836]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:56:24 compute-1 sudo[213836]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:56:24 compute-1 sudo[213836]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:24 compute-1 python3.9[213879]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:56:24 compute-1 sudo[213874]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:24 compute-1 ceph-mon[75484]: pgmap v519: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:56:25 compute-1 sudo[214033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijuhlktktravueadvcdpeyuxzsppycmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254984.7398949-1063-21417401112640/AnsiballZ_lineinfile.py'
Sep 30 17:56:25 compute-1 sudo[214033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175625 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:56:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [ALERT] 272/175625 (4) : backend 'backend' has no server available!
Sep 30 17:56:25 compute-1 python3.9[214035]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.335148) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254985335191, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 700, "num_deletes": 255, "total_data_size": 1277641, "memory_usage": 1305936, "flush_reason": "Manual Compaction"}
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Sep 30 17:56:25 compute-1 sudo[214033]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254985343206, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 840926, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18298, "largest_seqno": 18993, "table_properties": {"data_size": 837503, "index_size": 1267, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7343, "raw_average_key_size": 17, "raw_value_size": 830655, "raw_average_value_size": 2021, "num_data_blocks": 57, "num_entries": 411, "num_filter_entries": 411, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759254939, "oldest_key_time": 1759254939, "file_creation_time": 1759254985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 8108 microseconds, and 5104 cpu microseconds.
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.343256) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 840926 bytes OK
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.343278) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.346478) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.346501) EVENT_LOG_v1 {"time_micros": 1759254985346494, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.346522) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 1273842, prev total WAL file size 1273842, number of live WAL files 2.
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.347312) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323532' seq:0, type:0; will stop at (end)
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(821KB)], [33(9887KB)]
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254985347354, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 10965660, "oldest_snapshot_seqno": -1}
Sep 30 17:56:25 compute-1 sshd-session[213451]: Failed password for invalid user aman from 103.153.190.105 port 45370 ssh2
Sep 30 17:56:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:25 compute-1 sshd-session[213094]: Connection closed by invalid user ubnt 194.0.234.19 port 31980 [preauth]
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4500 keys, 10530751 bytes, temperature: kUnknown
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254985430519, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 10530751, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10500293, "index_size": 18124, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11269, "raw_key_size": 115116, "raw_average_key_size": 25, "raw_value_size": 10417954, "raw_average_value_size": 2315, "num_data_blocks": 747, "num_entries": 4500, "num_filter_entries": 4500, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759254985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.431037) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 10530751 bytes
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.432764) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 131.5 rd, 126.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 9.7 +0.0 blob) out(10.0 +0.0 blob), read-write-amplify(25.6) write-amplify(12.5) OK, records in: 5023, records dropped: 523 output_compression: NoCompression
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.432799) EVENT_LOG_v1 {"time_micros": 1759254985432781, "job": 18, "event": "compaction_finished", "compaction_time_micros": 83408, "compaction_time_cpu_micros": 42844, "output_level": 6, "num_output_files": 1, "total_output_size": 10530751, "num_input_records": 5023, "num_output_records": 4500, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254985433453, "job": 18, "event": "table_file_deletion", "file_number": 35}
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759254985436801, "job": 18, "event": "table_file_deletion", "file_number": 33}
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.347232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.437035) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.437045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.437049) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.437057) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:56:25 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:56:25.437060) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:56:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:25.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:56:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:25.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:56:26 compute-1 sshd-session[213451]: Received disconnect from 103.153.190.105 port 45370:11: Bye Bye [preauth]
Sep 30 17:56:26 compute-1 sshd-session[213451]: Disconnected from invalid user aman 103.153.190.105 port 45370 [preauth]
Sep 30 17:56:26 compute-1 sudo[214186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twjamgjyjjavezfeoimtgpzwdcjdexui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254985.5528095-1079-178440049391274/AnsiballZ_replace.py'
Sep 30 17:56:26 compute-1 sudo[214186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:26 compute-1 ceph-mon[75484]: pgmap v520: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:56:26 compute-1 python3.9[214188]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:26 compute-1 sudo[214186]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:26 compute-1 sudo[214339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kioovixwdpteyhsskrdfjipyzyqehssb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254986.5824914-1095-210046836780316/AnsiballZ_replace.py'
Sep 30 17:56:26 compute-1 sudo[214339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:27 compute-1 python3.9[214341]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:27 compute-1 sudo[214339]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:27 compute-1 sudo[214491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnpuuoamhgdlxmewwbexmaazkifjcwfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254987.3952496-1113-272199864284742/AnsiballZ_lineinfile.py'
Sep 30 17:56:27 compute-1 sudo[214491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:27.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:27 compute-1 python3.9[214493]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:27 compute-1 sudo[214491]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:27.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:56:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:28 compute-1 sudo[214644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vienzrtnpwqkaaseuoquhhfobtelgols ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254988.1076684-1113-56623991552939/AnsiballZ_lineinfile.py'
Sep 30 17:56:28 compute-1 sudo[214644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:28 compute-1 ceph-mon[75484]: pgmap v521: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail
Sep 30 17:56:28 compute-1 python3.9[214646]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:28 compute-1 sudo[214644]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:29 compute-1 sudo[214811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vigjesbfsyqwqmnurzuaspvvfdndjaie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254988.9121592-1113-101780171917273/AnsiballZ_lineinfile.py'
Sep 30 17:56:29 compute-1 sudo[214811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:29 compute-1 podman[214771]: 2025-09-30 17:56:29.3015114 +0000 UTC m=+0.090193518 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 17:56:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:29 compute-1 python3.9[214819]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:29 compute-1 sudo[214811]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:29.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:29.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:30 compute-1 sudo[214970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iedvuranpjkguvwfzkrxldbdusbmedlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254989.7008748-1113-147812903263867/AnsiballZ_lineinfile.py'
Sep 30 17:56:30 compute-1 sudo[214970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:30 compute-1 python3.9[214972]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:30 compute-1 sudo[214970]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:30 compute-1 ceph-mon[75484]: pgmap v522: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:56:30 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 8.
Sep 30 17:56:30 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:56:30 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.819s CPU time.
Sep 30 17:56:30 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 17:56:30 compute-1 sudo[215123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikwupjsmilgygefobchpkbtcnzimpyjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254990.534695-1172-82998651239932/AnsiballZ_stat.py'
Sep 30 17:56:30 compute-1 sudo[215123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:31 compute-1 python3.9[215126]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:56:31 compute-1 sudo[215123]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:31 compute-1 podman[215178]: 2025-09-30 17:56:31.302977031 +0000 UTC m=+0.067159988 container create 75b1efab8f0d7c03b4b95c7c54d4f3b6e4f899f15c2cf2a4f305e0e7dd21f9dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 17:56:31 compute-1 podman[215178]: 2025-09-30 17:56:31.267944529 +0000 UTC m=+0.032127556 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 17:56:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0d211df535c77d24b6b9bb14c26b41ba241a3840c5342297aecc0c2460f1702/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 17:56:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0d211df535c77d24b6b9bb14c26b41ba241a3840c5342297aecc0c2460f1702/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:56:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0d211df535c77d24b6b9bb14c26b41ba241a3840c5342297aecc0c2460f1702/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 17:56:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0d211df535c77d24b6b9bb14c26b41ba241a3840c5342297aecc0c2460f1702/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 17:56:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:31 compute-1 podman[215178]: 2025-09-30 17:56:31.416755523 +0000 UTC m=+0.180938490 container init 75b1efab8f0d7c03b4b95c7c54d4f3b6e4f899f15c2cf2a4f305e0e7dd21f9dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid)
Sep 30 17:56:31 compute-1 podman[215178]: 2025-09-30 17:56:31.426785383 +0000 UTC m=+0.190968350 container start 75b1efab8f0d7c03b4b95c7c54d4f3b6e4f899f15c2cf2a4f305e0e7dd21f9dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 17:56:31 compute-1 bash[215178]: 75b1efab8f0d7c03b4b95c7c54d4f3b6e4f899f15c2cf2a4f305e0e7dd21f9dd
Sep 30 17:56:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 17:56:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 17:56:31 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 17:56:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 17:56:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 17:56:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 17:56:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 17:56:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 17:56:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:56:31 compute-1 sudo[215381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvvgatflafdrdjfwhnzlsclcifnhmcgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254991.4283142-1187-115268184272035/AnsiballZ_file.py'
Sep 30 17:56:31 compute-1 sudo[215381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:31.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:31.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:31 compute-1 python3.9[215383]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:32 compute-1 sudo[215381]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:32 compute-1 ceph-mon[75484]: pgmap v523: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:56:32 compute-1 sudo[215535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snznpbiqvuxakonrcnkhbjysdymocaba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254992.3183813-1205-123639477620404/AnsiballZ_file.py'
Sep 30 17:56:32 compute-1 sudo[215535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:32 compute-1 python3.9[215537]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:56:32 compute-1 sudo[215535]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:56:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:33 compute-1 sudo[215687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgfhzmttvzmqucnaskjnmgcbsrhfhype ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254993.210825-1221-197012744298404/AnsiballZ_stat.py'
Sep 30 17:56:33 compute-1 sudo[215687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:33 compute-1 python3.9[215689]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:56:33 compute-1 sudo[215687]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:33.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:33.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:34 compute-1 sudo[215766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igiglonqxvlkjkgvsdoqrlfkxfniuuhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254993.210825-1221-197012744298404/AnsiballZ_file.py'
Sep 30 17:56:34 compute-1 sudo[215766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:34 compute-1 python3.9[215768]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:56:34 compute-1 sudo[215766]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:34 compute-1 ceph-mon[75484]: pgmap v524: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 511 B/s wr, 1 op/s
Sep 30 17:56:34 compute-1 sudo[215919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbsrviuhedacswmxkjltqhatdmcxucje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254994.548048-1221-73433069871768/AnsiballZ_stat.py'
Sep 30 17:56:34 compute-1 sudo[215919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:35 compute-1 python3.9[215921]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:56:35 compute-1 sudo[215919]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:35 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Sep 30 17:56:35 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Sep 30 17:56:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:35 compute-1 sudo[215999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsvyyjgcubbczfenfbxegfdsxndeptyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254994.548048-1221-73433069871768/AnsiballZ_file.py'
Sep 30 17:56:35 compute-1 sudo[215999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:35 compute-1 python3.9[216001]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:56:35 compute-1 sudo[215999]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:35.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:35.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:36 compute-1 sudo[216165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kasvxogosdfnotosrdrbpqnorqlxjynh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254995.9036698-1267-252319769815772/AnsiballZ_file.py'
Sep 30 17:56:36 compute-1 sudo[216165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:36 compute-1 podman[216126]: 2025-09-30 17:56:36.321526957 +0000 UTC m=+0.095240814 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 17:56:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:36 compute-1 python3.9[216174]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:36 compute-1 sudo[216165]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:36 compute-1 ceph-mon[75484]: pgmap v525: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 511 B/s wr, 1 op/s
Sep 30 17:56:37 compute-1 sudo[216326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjbnpllonrdlnhcyiojenirobdehshgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254996.744568-1283-164449616982693/AnsiballZ_stat.py'
Sep 30 17:56:37 compute-1 sudo[216326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:37 compute-1 python3.9[216328]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:56:37 compute-1 sudo[216326]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:37 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Sep 30 17:56:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:37 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Sep 30 17:56:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:37 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:56:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:37 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:56:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:37 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Sep 30 17:56:37 compute-1 sudo[216404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqnrwbppmhrfhuqiothfhhyzeciqjevd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254996.744568-1283-164449616982693/AnsiballZ_file.py'
Sep 30 17:56:37 compute-1 sudo[216404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:56:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:37.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:37.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:38 compute-1 python3.9[216406]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:38 compute-1 sudo[216404]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:56:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:38 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:56:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:38 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:56:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:38 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:56:38 compute-1 sudo[216558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huvsakxnwdxltakgfphsvpzchgpfgjzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254998.2821083-1307-147120869369240/AnsiballZ_stat.py'
Sep 30 17:56:38 compute-1 sudo[216558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:38 compute-1 ceph-mon[75484]: pgmap v526: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 511 B/s wr, 1 op/s
Sep 30 17:56:38 compute-1 python3.9[216560]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:56:38 compute-1 sudo[216558]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:39 compute-1 sudo[216636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avdsnzgydqhfenoaifvujfoodizzyfsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254998.2821083-1307-147120869369240/AnsiballZ_file.py'
Sep 30 17:56:39 compute-1 sudo[216636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:39 compute-1 python3.9[216638]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:39 compute-1 sudo[216636]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:39.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:39.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:40 compute-1 sudo[216792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvyslroaudbkbcxefepeuoyojuqtpkwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759254999.6799433-1331-159766209780872/AnsiballZ_systemd.py'
Sep 30 17:56:40 compute-1 sudo[216792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:40 compute-1 python3.9[216794]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:56:40 compute-1 systemd[1]: Reloading.
Sep 30 17:56:40 compute-1 systemd-rc-local-generator[216821]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:56:40 compute-1 systemd-sysv-generator[216827]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:56:40 compute-1 ceph-mon[75484]: pgmap v527: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:56:40 compute-1 sudo[216792]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:41 compute-1 unix_chkpwd[216860]: password check failed for user (root)
Sep 30 17:56:41 compute-1 sshd-session[216709]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110  user=root
Sep 30 17:56:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:41 compute-1 sudo[216986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yawiicnommrqjtapmqinxesodwirufkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255001.161682-1347-207786567910743/AnsiballZ_stat.py'
Sep 30 17:56:41 compute-1 sudo[216986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:41 compute-1 python3.9[216988]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:56:41 compute-1 sudo[216986]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:41.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:41.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:42 compute-1 unix_chkpwd[217039]: password check failed for user (root)
Sep 30 17:56:42 compute-1 sshd-session[216835]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161  user=root
Sep 30 17:56:42 compute-1 sudo[217066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqmmufdfylttthepjpeorxtzosrwtehp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255001.161682-1347-207786567910743/AnsiballZ_file.py'
Sep 30 17:56:42 compute-1 sudo[217066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:42 compute-1 python3.9[217068]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175642 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:56:42 compute-1 sudo[217066]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:42 compute-1 unix_chkpwd[217082]: password check failed for user (root)
Sep 30 17:56:42 compute-1 sshd-session[216789]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:56:42 compute-1 ceph-mon[75484]: pgmap v528: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 1 op/s
Sep 30 17:56:42 compute-1 sudo[217220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmtcqdwabitbtkkgcyrmdqpalpjvbdio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255002.6409204-1372-205046058286533/AnsiballZ_stat.py'
Sep 30 17:56:42 compute-1 sudo[217220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:56:43 compute-1 python3.9[217222]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:56:43 compute-1 sudo[217220]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:43 compute-1 sshd-session[216709]: Failed password for root from 14.225.167.110 port 59944 ssh2
Sep 30 17:56:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:43 compute-1 sudo[217298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcjhvcuxdkfogfpvhxfhrojnyujpvddd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255002.6409204-1372-205046058286533/AnsiballZ_file.py'
Sep 30 17:56:43 compute-1 sudo[217298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:43 compute-1 python3.9[217300]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:43 compute-1 sudo[217298]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:56:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:43.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:56:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:43.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:44 compute-1 sshd-session[216709]: Received disconnect from 14.225.167.110 port 59944:11: Bye Bye [preauth]
Sep 30 17:56:44 compute-1 sshd-session[216709]: Disconnected from authenticating user root 14.225.167.110 port 59944 [preauth]
Sep 30 17:56:44 compute-1 sudo[217455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfypniqtibpwvaawzmgvvoirdemffojy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255004.026019-1395-67820275324427/AnsiballZ_systemd.py'
Sep 30 17:56:44 compute-1 sudo[217455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:44 compute-1 sudo[217450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:56:44 compute-1 sudo[217450]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:56:44 compute-1 sudo[217450]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:44 compute-1 sshd-session[216835]: Failed password for root from 216.10.242.161 port 56280 ssh2
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000014:nfs.cephfs.0: -2
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 17:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:44 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 17:56:44 compute-1 python3.9[217473]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:56:44 compute-1 systemd[1]: Reloading.
Sep 30 17:56:44 compute-1 ceph-mon[75484]: pgmap v529: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 2.2 KiB/s rd, 1.2 KiB/s wr, 3 op/s
Sep 30 17:56:44 compute-1 systemd-sysv-generator[217524]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:56:44 compute-1 sshd-session[216789]: Failed password for root from 192.210.160.141 port 44492 ssh2
Sep 30 17:56:44 compute-1 systemd-rc-local-generator[217520]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:56:45 compute-1 sshd-session[216835]: Received disconnect from 216.10.242.161 port 56280:11: Bye Bye [preauth]
Sep 30 17:56:45 compute-1 sshd-session[216835]: Disconnected from authenticating user root 216.10.242.161 port 56280 [preauth]
Sep 30 17:56:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:45 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:45 compute-1 systemd[1]: Starting Create netns directory...
Sep 30 17:56:45 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 17:56:45 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 17:56:45 compute-1 systemd[1]: Finished Create netns directory.
Sep 30 17:56:45 compute-1 sudo[217455]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:45 compute-1 sshd-session[216789]: Connection closed by authenticating user root 192.210.160.141 port 44492 [preauth]
Sep 30 17:56:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:45 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:45.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:45 compute-1 sudo[217685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoqzeaskmxyhvsnztdpzaoifdmcpdobp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255005.5961525-1415-229042556277298/AnsiballZ_file.py'
Sep 30 17:56:45 compute-1 sudo[217685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:45.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:46 compute-1 python3.9[217688]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:56:46 compute-1 sudo[217685]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:46 compute-1 sudo[217839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejfjdqukawucyzrswsqflyucqwxzdkiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255006.4791782-1431-273323037970051/AnsiballZ_stat.py'
Sep 30 17:56:46 compute-1 sudo[217839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:46 compute-1 ceph-mon[75484]: pgmap v530: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:56:47 compute-1 python3.9[217841]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:56:47 compute-1 sudo[217839]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175647 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:56:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:47 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:47 compute-1 sudo[217962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwulvhevrfazoyrixxznluxmanmzbewp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255006.4791782-1431-273323037970051/AnsiballZ_copy.py'
Sep 30 17:56:47 compute-1 sudo[217962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:47 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:47 compute-1 python3.9[217964]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759255006.4791782-1431-273323037970051/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:56:47 compute-1 sudo[217962]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:47.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:47.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:48 compute-1 ceph-mon[75484]: pgmap v531: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:56:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:56:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:48 compute-1 podman[218063]: 2025-09-30 17:56:48.62019416 +0000 UTC m=+0.153993745 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 17:56:48 compute-1 sudo[218143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osqgzsunvrjxrmbsupxizvjrijbauikh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255008.2518528-1466-8668949257599/AnsiballZ_file.py'
Sep 30 17:56:48 compute-1 sudo[218143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:48 compute-1 python3.9[218145]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:56:48 compute-1 sudo[218143]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:49 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:49 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:49 compute-1 sudo[218295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lniddprjogmdtewookzkzijeuwkugmyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255009.1817064-1481-196711591619440/AnsiballZ_stat.py'
Sep 30 17:56:49 compute-1 sudo[218295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:49 compute-1 python3.9[218297]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:56:49 compute-1 sudo[218295]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:49.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:49.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:50 compute-1 sudo[218419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxhvznphsktvhehwxeioigouoeilnsyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255009.1817064-1481-196711591619440/AnsiballZ_copy.py'
Sep 30 17:56:50 compute-1 sudo[218419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:50 compute-1 python3.9[218421]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759255009.1817064-1481-196711591619440/.source.json _original_basename=.ezrlr6nl follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:50 compute-1 sudo[218419]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:50 compute-1 ceph-mon[75484]: pgmap v532: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.6 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:56:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:51 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:51 compute-1 sudo[218572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-golrmwjemsbozgxaccmebtgiokudbqaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255010.7815053-1511-134256283556578/AnsiballZ_file.py'
Sep 30 17:56:51 compute-1 sudo[218572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:51 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36000023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:51 compute-1 python3.9[218574]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:56:51 compute-1 sudo[218572]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:51.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:51.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:52 compute-1 sudo[218725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqoqqhhvdvaklubwvpktkrrfadbypkng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255011.824185-1527-164073120658577/AnsiballZ_stat.py'
Sep 30 17:56:52 compute-1 sudo[218725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:52 compute-1 sudo[218725]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:52 compute-1 ceph-mon[75484]: pgmap v533: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1023 B/s rd, 511 B/s wr, 1 op/s
Sep 30 17:56:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:56:52 compute-1 sudo[218849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcljvdmkuyueghhgxggbiljavepngwfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255011.824185-1527-164073120658577/AnsiballZ_copy.py'
Sep 30 17:56:52 compute-1 sudo[218849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:53 compute-1 sudo[218849]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:56:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:53 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:53 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:53 compute-1 sudo[219001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrsznnmjbioqtauwzrfilhawyldydwzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255013.4885523-1561-119356498667182/AnsiballZ_container_config_data.py'
Sep 30 17:56:53 compute-1 sudo[219001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:53.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:53.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:54 compute-1 python3.9[219003]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Sep 30 17:56:54 compute-1 sudo[219001]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:56:54.298 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 17:56:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:56:54.299 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 17:56:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:56:54.299 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 17:56:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:54 compute-1 ceph-mon[75484]: pgmap v534: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1023 B/s rd, 511 B/s wr, 1 op/s
Sep 30 17:56:55 compute-1 sudo[219156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqaxsskoodocqgyondezemmohslzafpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255014.372289-1579-38198702443901/AnsiballZ_container_config_hash.py'
Sep 30 17:56:55 compute-1 sudo[219156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:55 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:55 compute-1 python3.9[219158]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 17:56:55 compute-1 sudo[219156]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:55 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36000023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:55.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:55 compute-1 sudo[219308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvcuwilrnhsvuiiwrpnjvzfnmagnamfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255015.6087363-1597-192525450602543/AnsiballZ_podman_container_info.py'
Sep 30 17:56:55 compute-1 sudo[219308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:56.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:56 compute-1 python3.9[219311]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 17:56:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:56 compute-1 sudo[219308]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:56 compute-1 ceph-mon[75484]: pgmap v535: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:56:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:57 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36040089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:57 compute-1 sshd-session[219364]: Invalid user deb from 107.172.146.104 port 37592
Sep 30 17:56:57 compute-1 sshd-session[219364]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:56:57 compute-1 sshd-session[219364]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 17:56:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:57 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:57 compute-1 sudo[219491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiujssxkgrsnksdsrlghngokcqbeutrz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759255017.4008198-1623-143650413868551/AnsiballZ_edpm_container_manage.py'
Sep 30 17:56:57 compute-1 sudo[219491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:57.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:56:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:56:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:56:58.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:56:58 compute-1 python3[219493]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 17:56:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:56:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:58 compute-1 podman[219507]: 2025-09-30 17:56:58.56404661 +0000 UTC m=+0.429852789 image pull e99d9627280779529e99daa6a112e310843a207a3acc590902c030127020a067 38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Sep 30 17:56:58 compute-1 sshd-session[219364]: Failed password for invalid user deb from 107.172.146.104 port 37592 ssh2
Sep 30 17:56:58 compute-1 ceph-mon[75484]: pgmap v536: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:56:58 compute-1 podman[219564]: 2025-09-30 17:56:58.747721853 +0000 UTC m=+0.071377092 container create 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=multipathd, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 17:56:58 compute-1 podman[219564]: 2025-09-30 17:56:58.71117959 +0000 UTC m=+0.034834879 image pull e99d9627280779529e99daa6a112e310843a207a3acc590902c030127020a067 38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Sep 30 17:56:58 compute-1 python3[219493]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z 38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Sep 30 17:56:58 compute-1 sshd-session[219364]: Received disconnect from 107.172.146.104 port 37592:11: Bye Bye [preauth]
Sep 30 17:56:58 compute-1 sshd-session[219364]: Disconnected from invalid user deb 107.172.146.104 port 37592 [preauth]
Sep 30 17:56:58 compute-1 sudo[219491]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:59 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:56:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:56:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:56:59 compute-1 podman[219726]: 2025-09-30 17:56:59.512364451 +0000 UTC m=+0.091525654 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent)
Sep 30 17:56:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:56:59 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36000023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:56:59 compute-1 sudo[219769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-offtbgyyggmtpvwpjdeulrkrvrygqmsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255019.1670806-1639-213118745607968/AnsiballZ_stat.py'
Sep 30 17:56:59 compute-1 sudo[219769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:56:59 compute-1 python3.9[219774]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:56:59 compute-1 sudo[219769]: pam_unix(sudo:session): session closed for user root
Sep 30 17:56:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:56:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:56:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:56:59.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:00.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:00 compute-1 sudo[219927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgyhldowfrrrfpjgkivtgakfpgrhxibt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255020.040705-1657-166874662145420/AnsiballZ_file.py'
Sep 30 17:57:00 compute-1 sudo[219927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:00 compute-1 python3.9[219929]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:00 compute-1 sudo[219927]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:00 compute-1 ceph-mon[75484]: pgmap v537: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Sep 30 17:57:01 compute-1 sudo[220004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpdopralpwajtufqcfcuruqdfuzrcdwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255020.040705-1657-166874662145420/AnsiballZ_stat.py'
Sep 30 17:57:01 compute-1 sudo[220004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:01 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36040089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:01 compute-1 python3.9[220006]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:57:01 compute-1 sudo[220004]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:01 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:01 compute-1 sudo[220156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieapffvhehfazcwjqbwndglmxtpxabkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255021.329534-1657-276425418467494/AnsiballZ_copy.py'
Sep 30 17:57:01 compute-1 sudo[220156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:57:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:01.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:57:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:02.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:02 compute-1 python3.9[220158]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759255021.329534-1657-276425418467494/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:02 compute-1 sudo[220156]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:02 compute-1 sudo[220233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hitesxiiosguaycqieobwzthokgqwktm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255021.329534-1657-276425418467494/AnsiballZ_systemd.py'
Sep 30 17:57:02 compute-1 sudo[220233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:02 compute-1 python3.9[220235]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 17:57:02 compute-1 systemd[1]: Reloading.
Sep 30 17:57:02 compute-1 ceph-mon[75484]: pgmap v538: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:02 compute-1 systemd-rc-local-generator[220261]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:57:02 compute-1 systemd-sysv-generator[220265]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:57:03 compute-1 sudo[220271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:57:03 compute-1 sudo[220271]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:57:03 compute-1 sudo[220271]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:03 compute-1 sudo[220233]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:03 compute-1 sudo[220296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:57:03 compute-1 sudo[220296]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:57:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:57:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:03 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:03 compute-1 sudo[220406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgkpyvgdmqgeoxhaynfycgnucneojdvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255021.329534-1657-276425418467494/AnsiballZ_systemd.py'
Sep 30 17:57:03 compute-1 sudo[220406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:03 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36000034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:03 compute-1 sudo[220296]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:03 compute-1 python3.9[220408]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:57:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:03.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:04.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:04 compute-1 sudo[220430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:57:04 compute-1 sudo[220430]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:57:04 compute-1 sudo[220430]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:04 compute-1 ceph-mon[75484]: pgmap v539: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:57:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:57:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:57:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:57:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:57:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:57:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:57:04 compute-1 systemd[1]: Reloading.
Sep 30 17:57:04 compute-1 systemd-rc-local-generator[220484]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:57:04 compute-1 systemd-sysv-generator[220487]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:57:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:05 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36040096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:05 compute-1 systemd[1]: Starting multipathd container...
Sep 30 17:57:05 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:57:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77a5052124cbdff4a2caca56ae49ea789d0f478550a8420df12af9b0355cbc3f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 17:57:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77a5052124cbdff4a2caca56ae49ea789d0f478550a8420df12af9b0355cbc3f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 17:57:05 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7.
Sep 30 17:57:05 compute-1 podman[220494]: 2025-09-30 17:57:05.371154958 +0000 UTC m=+0.161749434 container init 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 17:57:05 compute-1 multipathd[220509]: + sudo -E kolla_set_configs
Sep 30 17:57:05 compute-1 podman[220494]: 2025-09-30 17:57:05.403742015 +0000 UTC m=+0.194336471 container start 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 17:57:05 compute-1 podman[220494]: multipathd
Sep 30 17:57:05 compute-1 sudo[220515]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 17:57:05 compute-1 sudo[220515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 17:57:05 compute-1 systemd[1]: Started multipathd container.
Sep 30 17:57:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:05 compute-1 sudo[220406]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:05 compute-1 multipathd[220509]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 17:57:05 compute-1 multipathd[220509]: INFO:__main__:Validating config file
Sep 30 17:57:05 compute-1 multipathd[220509]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 17:57:05 compute-1 multipathd[220509]: INFO:__main__:Writing out command to execute
Sep 30 17:57:05 compute-1 sudo[220515]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:05 compute-1 multipathd[220509]: ++ cat /run_command
Sep 30 17:57:05 compute-1 multipathd[220509]: + CMD='/usr/sbin/multipathd -d'
Sep 30 17:57:05 compute-1 multipathd[220509]: + ARGS=
Sep 30 17:57:05 compute-1 multipathd[220509]: + sudo kolla_copy_cacerts
Sep 30 17:57:05 compute-1 sudo[220546]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 17:57:05 compute-1 sudo[220546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 17:57:05 compute-1 sudo[220546]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:05 compute-1 multipathd[220509]: + [[ ! -n '' ]]
Sep 30 17:57:05 compute-1 multipathd[220509]: + . kolla_extend_start
Sep 30 17:57:05 compute-1 multipathd[220509]: Running command: '/usr/sbin/multipathd -d'
Sep 30 17:57:05 compute-1 multipathd[220509]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Sep 30 17:57:05 compute-1 multipathd[220509]: + umask 0022
Sep 30 17:57:05 compute-1 multipathd[220509]: + exec /usr/sbin/multipathd -d
Sep 30 17:57:05 compute-1 podman[220516]: 2025-09-30 17:57:05.512475751 +0000 UTC m=+0.097321360 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 17:57:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:05 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:05 compute-1 systemd[1]: 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7-357ceb59e03fd932.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 17:57:05 compute-1 systemd[1]: 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7-357ceb59e03fd932.service: Failed with result 'exit-code'.
Sep 30 17:57:05 compute-1 multipathd[220509]: 12867.164445 | multipathd v0.9.9: start up
Sep 30 17:57:05 compute-1 multipathd[220509]: 12867.171656 | reconfigure: setting up paths and maps
Sep 30 17:57:05 compute-1 multipathd[220509]: 12867.173663 | _check_bindings_file: failed to read header from /etc/multipath/bindings
Sep 30 17:57:05 compute-1 multipathd[220509]: 12867.175169 | updated bindings file /etc/multipath/bindings
Sep 30 17:57:05 compute-1 unix_chkpwd[220619]: password check failed for user (root)
Sep 30 17:57:05 compute-1 sshd-session[220427]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:57:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:05.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:06.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:06 compute-1 python3.9[220702]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:57:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:06 compute-1 podman[220752]: 2025-09-30 17:57:06.550901846 +0000 UTC m=+0.084906016 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930)
Sep 30 17:57:06 compute-1 sudo[220876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukyceqhptjpjvhkefykxmerxtsbywpfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255026.4338877-1729-249654294551311/AnsiballZ_command.py'
Sep 30 17:57:06 compute-1 sudo[220876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:06 compute-1 ceph-mon[75484]: pgmap v540: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:07 compute-1 python3.9[220878]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:57:07 compute-1 sshd-session[220654]: Invalid user itt from 175.126.165.170 port 52804
Sep 30 17:57:07 compute-1 sshd-session[220654]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:57:07 compute-1 sshd-session[220654]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 17:57:07 compute-1 sudo[220876]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:07 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:07 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36000034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:07 compute-1 sudo[221040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psfqmvsweslciivcbgeprjcsmsrqsbwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255027.3214128-1745-199342168605269/AnsiballZ_systemd.py'
Sep 30 17:57:07 compute-1 sudo[221040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:57:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:07.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:07 compute-1 sshd-session[220427]: Failed password for root from 192.210.160.141 port 58804 ssh2
Sep 30 17:57:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:08.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:08 compute-1 python3.9[221042]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:57:08 compute-1 systemd[1]: Stopping multipathd container...
Sep 30 17:57:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:57:08 compute-1 sshd-session[220654]: Failed password for invalid user itt from 175.126.165.170 port 52804 ssh2
Sep 30 17:57:08 compute-1 multipathd[220509]: 12869.858767 | multipathd: shut down
Sep 30 17:57:08 compute-1 systemd[1]: libpod-84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7.scope: Deactivated successfully.
Sep 30 17:57:08 compute-1 podman[221047]: 2025-09-30 17:57:08.271515079 +0000 UTC m=+0.117349279 container died 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4)
Sep 30 17:57:08 compute-1 systemd[1]: 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7-357ceb59e03fd932.timer: Deactivated successfully.
Sep 30 17:57:08 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7.
Sep 30 17:57:08 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7-userdata-shm.mount: Deactivated successfully.
Sep 30 17:57:08 compute-1 systemd[1]: var-lib-containers-storage-overlay-77a5052124cbdff4a2caca56ae49ea789d0f478550a8420df12af9b0355cbc3f-merged.mount: Deactivated successfully.
Sep 30 17:57:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:08 compute-1 podman[221047]: 2025-09-30 17:57:08.426818169 +0000 UTC m=+0.272652349 container cleanup 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 17:57:08 compute-1 podman[221047]: multipathd
Sep 30 17:57:08 compute-1 podman[221076]: multipathd
Sep 30 17:57:08 compute-1 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Sep 30 17:57:08 compute-1 systemd[1]: Stopped multipathd container.
Sep 30 17:57:08 compute-1 systemd[1]: Starting multipathd container...
Sep 30 17:57:08 compute-1 sshd-session[220427]: Connection closed by authenticating user root 192.210.160.141 port 58804 [preauth]
Sep 30 17:57:08 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:57:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77a5052124cbdff4a2caca56ae49ea789d0f478550a8420df12af9b0355cbc3f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 17:57:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77a5052124cbdff4a2caca56ae49ea789d0f478550a8420df12af9b0355cbc3f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 17:57:08 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7.
Sep 30 17:57:08 compute-1 podman[221089]: 2025-09-30 17:57:08.735979919 +0000 UTC m=+0.176191193 container init 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 17:57:08 compute-1 multipathd[221106]: + sudo -E kolla_set_configs
Sep 30 17:57:08 compute-1 podman[221089]: 2025-09-30 17:57:08.773887499 +0000 UTC m=+0.214098753 container start 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Sep 30 17:57:08 compute-1 sudo[221112]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 17:57:08 compute-1 podman[221089]: multipathd
Sep 30 17:57:08 compute-1 sudo[221112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 17:57:08 compute-1 systemd[1]: Started multipathd container.
Sep 30 17:57:08 compute-1 sudo[221040]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:08 compute-1 multipathd[221106]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 17:57:08 compute-1 multipathd[221106]: INFO:__main__:Validating config file
Sep 30 17:57:08 compute-1 multipathd[221106]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 17:57:08 compute-1 multipathd[221106]: INFO:__main__:Writing out command to execute
Sep 30 17:57:08 compute-1 sudo[221112]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:08 compute-1 multipathd[221106]: ++ cat /run_command
Sep 30 17:57:08 compute-1 multipathd[221106]: + CMD='/usr/sbin/multipathd -d'
Sep 30 17:57:08 compute-1 multipathd[221106]: + ARGS=
Sep 30 17:57:08 compute-1 multipathd[221106]: + sudo kolla_copy_cacerts
Sep 30 17:57:08 compute-1 sudo[221138]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 17:57:08 compute-1 podman[221113]: 2025-09-30 17:57:08.869594255 +0000 UTC m=+0.078750701 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 17:57:08 compute-1 sudo[221138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 17:57:08 compute-1 ceph-mon[75484]: pgmap v541: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:08 compute-1 sudo[221138]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:08 compute-1 multipathd[221106]: + [[ ! -n '' ]]
Sep 30 17:57:08 compute-1 multipathd[221106]: + . kolla_extend_start
Sep 30 17:57:08 compute-1 multipathd[221106]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Sep 30 17:57:08 compute-1 multipathd[221106]: Running command: '/usr/sbin/multipathd -d'
Sep 30 17:57:08 compute-1 multipathd[221106]: + umask 0022
Sep 30 17:57:08 compute-1 multipathd[221106]: + exec /usr/sbin/multipathd -d
Sep 30 17:57:08 compute-1 systemd[1]: 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7-4f9a6b7d4d3fc75a.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 17:57:08 compute-1 systemd[1]: 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7-4f9a6b7d4d3fc75a.service: Failed with result 'exit-code'.
Sep 30 17:57:08 compute-1 multipathd[221106]: 12870.533874 | multipathd v0.9.9: start up
Sep 30 17:57:08 compute-1 multipathd[221106]: 12870.543759 | reconfigure: setting up paths and maps
Sep 30 17:57:09 compute-1 sshd-session[220654]: Received disconnect from 175.126.165.170 port 52804:11: Bye Bye [preauth]
Sep 30 17:57:09 compute-1 sshd-session[220654]: Disconnected from invalid user itt 175.126.165.170 port 52804 [preauth]
Sep 30 17:57:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:09 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36040096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:09 compute-1 sudo[221224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:57:09 compute-1 sudo[221224]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:57:09 compute-1 sudo[221224]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:09 compute-1 sudo[221323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbepjqtdrtywojirpfyuivlippfoaedo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255029.0198631-1761-144504277196461/AnsiballZ_file.py'
Sep 30 17:57:09 compute-1 sudo[221323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:09 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:09 compute-1 python3.9[221325]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:09 compute-1 sudo[221323]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:09 compute-1 unix_chkpwd[221350]: password check failed for user (root)
Sep 30 17:57:09 compute-1 sshd-session[221300]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167  user=root
Sep 30 17:57:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:57:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:09.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:57:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:57:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:10.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:57:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:57:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:57:10 compute-1 ceph-mon[75484]: pgmap v542: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 340 B/s rd, 0 op/s
Sep 30 17:57:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:10 compute-1 sudo[221477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igzrssdsrdsnpkyagqztobxdctkqgzql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255030.0985208-1785-216622521787057/AnsiballZ_file.py'
Sep 30 17:57:10 compute-1 sudo[221477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:10 compute-1 python3.9[221479]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 17:57:10 compute-1 sudo[221477]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:11 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:11 compute-1 sudo[221630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzqaxnaraiorhijskvdarkgxuusnebff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255030.9055161-1801-101506497950739/AnsiballZ_modprobe.py'
Sep 30 17:57:11 compute-1 sudo[221630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:11 compute-1 sshd-session[221300]: Failed password for root from 167.172.43.167 port 47294 ssh2
Sep 30 17:57:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:11 compute-1 python3.9[221632]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Sep 30 17:57:11 compute-1 kernel: Key type psk registered
Sep 30 17:57:11 compute-1 sudo[221630]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:11 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36000034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:11.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:12.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:12 compute-1 sshd-session[220140]: error: kex_exchange_identification: read: Connection timed out
Sep 30 17:57:12 compute-1 sshd-session[220140]: banner exchange: Connection from 14.103.129.43 port 60962: Connection timed out
Sep 30 17:57:12 compute-1 sudo[221792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjluixgowsjhqktlmpnxnvnxynytcwvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255031.7436554-1817-60058478969802/AnsiballZ_stat.py'
Sep 30 17:57:12 compute-1 sudo[221792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:12 compute-1 python3.9[221794]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:57:12 compute-1 sudo[221792]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:12 compute-1 sshd-session[221300]: Received disconnect from 167.172.43.167 port 47294:11: Bye Bye [preauth]
Sep 30 17:57:12 compute-1 sshd-session[221300]: Disconnected from authenticating user root 167.172.43.167 port 47294 [preauth]
Sep 30 17:57:12 compute-1 ceph-mon[75484]: pgmap v543: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:12 compute-1 sudo[221918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwwoshtzqzkpcfiwasaprtccfyvjplqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255031.7436554-1817-60058478969802/AnsiballZ_copy.py'
Sep 30 17:57:12 compute-1 sudo[221918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:13 compute-1 python3.9[221920]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759255031.7436554-1817-60058478969802/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:13 compute-1 sudo[221918]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:57:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:13 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36040096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:13 compute-1 sshd-session[221795]: Invalid user minecraft from 84.51.43.58 port 56902
Sep 30 17:57:13 compute-1 sshd-session[221795]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:57:13 compute-1 sshd-session[221795]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 17:57:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:13 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:13 compute-1 sudo[222070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edcotuljzacbtuxosilbokxdzwtewhcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255033.305147-1849-171308422766180/AnsiballZ_lineinfile.py'
Sep 30 17:57:13 compute-1 sudo[222070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:13 compute-1 python3.9[222072]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:13 compute-1 sudo[222070]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:13.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:14.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:14 compute-1 sudo[222223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scvhmlsclzhpfawaexxtwlbjznstedeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255034.09375-1865-57015770387105/AnsiballZ_systemd.py'
Sep 30 17:57:14 compute-1 sudo[222223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:14 compute-1 ceph-mon[75484]: pgmap v544: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:14 compute-1 python3.9[222225]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:57:14 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 30 17:57:14 compute-1 systemd[1]: Stopped Load Kernel Modules.
Sep 30 17:57:14 compute-1 systemd[1]: Stopping Load Kernel Modules...
Sep 30 17:57:14 compute-1 systemd[1]: Starting Load Kernel Modules...
Sep 30 17:57:14 compute-1 systemd[1]: Finished Load Kernel Modules.
Sep 30 17:57:14 compute-1 sudo[222223]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:15 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:15 compute-1 sshd-session[221795]: Failed password for invalid user minecraft from 84.51.43.58 port 56902 ssh2
Sep 30 17:57:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:15 compute-1 sudo[222380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeiahkulxrkahsyittvqurpymcfiqszr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255035.1333337-1881-22526537417587/AnsiballZ_setup.py'
Sep 30 17:57:15 compute-1 sudo[222380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:15 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36000034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:15 compute-1 sshd-session[221795]: Received disconnect from 84.51.43.58 port 56902:11: Bye Bye [preauth]
Sep 30 17:57:15 compute-1 sshd-session[221795]: Disconnected from invalid user minecraft 84.51.43.58 port 56902 [preauth]
Sep 30 17:57:15 compute-1 python3.9[222382]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 17:57:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:15.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:16.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:16 compute-1 sudo[222380]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:16 compute-1 sudo[222465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmuqdeoymivyuvqfhpfjkskaxxazxxwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255035.1333337-1881-22526537417587/AnsiballZ_dnf.py'
Sep 30 17:57:16 compute-1 sudo[222465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:16 compute-1 ceph-mon[75484]: pgmap v545: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:16 compute-1 python3.9[222468]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 17:57:16 compute-1 sshd-session[220798]: error: kex_exchange_identification: read: Connection timed out
Sep 30 17:57:16 compute-1 sshd-session[220798]: banner exchange: Connection from 101.126.25.120 port 60342: Connection timed out
Sep 30 17:57:16 compute-1 sshd[170789]: drop connection #0 from [110.42.70.108]:48502 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 17:57:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:17 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:17 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:17.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:18.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:57:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:18 compute-1 ceph-mon[75484]: pgmap v546: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:19 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:19 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:19 compute-1 podman[222475]: 2025-09-30 17:57:19.582122562 +0000 UTC m=+0.117859003 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible)
Sep 30 17:57:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:19.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:20.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:20 compute-1 ceph-mon[75484]: pgmap v547: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:57:20 compute-1 unix_chkpwd[222505]: password check failed for user (root)
Sep 30 17:57:20 compute-1 sshd-session[222501]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65  user=root
Sep 30 17:57:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:21 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36000034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:21 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:57:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:21.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:57:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:57:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:22.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:57:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:22 compute-1 sshd-session[222501]: Failed password for root from 194.107.115.65 port 20080 ssh2
Sep 30 17:57:22 compute-1 ceph-mon[75484]: pgmap v548: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:57:22 compute-1 unix_chkpwd[222513]: password check failed for user (openvswitch)
Sep 30 17:57:22 compute-1 sshd-session[222511]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.71.248.239  user=openvswitch
Sep 30 17:57:23 compute-1 systemd[1]: Reloading.
Sep 30 17:57:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:57:23 compute-1 systemd-rc-local-generator[222542]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:57:23 compute-1 systemd-sysv-generator[222547]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:57:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:23 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:23 compute-1 systemd[1]: Reloading.
Sep 30 17:57:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:23 compute-1 systemd-rc-local-generator[222575]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:57:23 compute-1 systemd-sysv-generator[222581]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:57:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:23 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:23 compute-1 sshd-session[222501]: Received disconnect from 194.107.115.65 port 20080:11: Bye Bye [preauth]
Sep 30 17:57:23 compute-1 sshd-session[222501]: Disconnected from authenticating user root 194.107.115.65 port 20080 [preauth]
Sep 30 17:57:23 compute-1 systemd-logind[789]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 30 17:57:23 compute-1 systemd-logind[789]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Sep 30 17:57:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:57:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:23.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:57:23 compute-1 lvm[222628]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Sep 30 17:57:23 compute-1 lvm[222628]: VG ceph_vg0 finished
Sep 30 17:57:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:24.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:24 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 17:57:24 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 17:57:24 compute-1 systemd[1]: Reloading.
Sep 30 17:57:24 compute-1 systemd-rc-local-generator[222677]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:57:24 compute-1 systemd-sysv-generator[222682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:57:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:24 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 17:57:24 compute-1 sudo[222858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:57:24 compute-1 sudo[222858]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:57:24 compute-1 sudo[222858]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:24 compute-1 ceph-mon[75484]: pgmap v549: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:25 compute-1 sudo[222465]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:25 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36000034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:25 compute-1 sshd-session[222511]: Failed password for openvswitch from 167.71.248.239 port 47912 ssh2
Sep 30 17:57:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:25 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:25.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:26 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 17:57:26 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 17:57:26 compute-1 systemd[1]: man-db-cache-update.service: Consumed 2.189s CPU time.
Sep 30 17:57:26 compute-1 systemd[1]: run-r3823b89bc14c4c6bbd0a5070cd78dd3c.service: Deactivated successfully.
Sep 30 17:57:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:26.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:26 compute-1 sudo[223991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyxzmqfgahenmyledpopnjftapkiuhxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255046.0343723-1905-258232801458015/AnsiballZ_file.py'
Sep 30 17:57:26 compute-1 sudo[223991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:26 compute-1 python3.9[223993]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:26 compute-1 sudo[223991]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:26 compute-1 ceph-mon[75484]: pgmap v550: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:27 compute-1 sshd-session[222511]: Connection closed by authenticating user openvswitch 167.71.248.239 port 47912 [preauth]
Sep 30 17:57:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:27 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:27 compute-1 python3.9[224146]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:57:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:27 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:27.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:27 compute-1 sshd-session[223995]: Invalid user elasticsearch from 192.210.160.141 port 59952
Sep 30 17:57:27 compute-1 sshd-session[223995]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:57:27 compute-1 sshd-session[223995]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 17:57:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:28.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:57:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:28 compute-1 sudo[224301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avltbneuqrdkcfhktadfyqdvshfzjjea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255048.082976-1940-233548398052930/AnsiballZ_file.py'
Sep 30 17:57:28 compute-1 sudo[224301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:28 compute-1 python3.9[224303]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:28 compute-1 sudo[224301]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:28 compute-1 ceph-mon[75484]: pgmap v551: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:29 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36000034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:29 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:29 compute-1 sudo[224464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aloghljgdkvowctdfbezwmixqbfgvnte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255049.20064-1962-277460800955287/AnsiballZ_systemd_service.py'
Sep 30 17:57:29 compute-1 sudo[224464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:29 compute-1 podman[224428]: 2025-09-30 17:57:29.947150127 +0000 UTC m=+0.091554375 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 17:57:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 17:57:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:29.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 17:57:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:30.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:30 compute-1 python3.9[224473]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 17:57:30 compute-1 sshd-session[223995]: Failed password for invalid user elasticsearch from 192.210.160.141 port 59952 ssh2
Sep 30 17:57:30 compute-1 systemd[1]: Reloading.
Sep 30 17:57:30 compute-1 systemd-rc-local-generator[224504]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:57:30 compute-1 systemd-sysv-generator[224507]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:57:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:30 compute-1 sudo[224464]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:30 compute-1 ceph-mon[75484]: pgmap v552: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:57:31 compute-1 sshd-session[223995]: Connection closed by invalid user elasticsearch 192.210.160.141 port 59952 [preauth]
Sep 30 17:57:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:31 compute-1 python3.9[224662]: ansible-ansible.builtin.service_facts Invoked
Sep 30 17:57:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:31 compute-1 network[224679]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 17:57:31 compute-1 network[224680]: 'network-scripts' will be removed from distribution in near future.
Sep 30 17:57:31 compute-1 network[224681]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 17:57:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:31.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:57:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:32.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:57:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:32 compute-1 ceph-mon[75484]: pgmap v553: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:57:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:33 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36000034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:33 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:33.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:57:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:34.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:57:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:34 compute-1 ceph-mon[75484]: pgmap v554: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:35 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:35 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:57:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:35.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:57:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:36.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:36 compute-1 podman[224796]: 2025-09-30 17:57:36.717199626 +0000 UTC m=+0.098435070 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, config_id=iscsid, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 17:57:36 compute-1 ceph-mon[75484]: pgmap v555: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:37 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:37 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:37.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:57:38 compute-1 sudo[224984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcpstjwfuxpnexefjnulyewqzedmvuzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255057.6163836-2000-261454744395662/AnsiballZ_systemd_service.py'
Sep 30 17:57:38 compute-1 sudo[224984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:38.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:57:38 compute-1 python3.9[224986]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:57:38 compute-1 sudo[224984]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:38 compute-1 sudo[225138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqyjmpgpeppecoslzhuebpjtqrapnudk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255058.5132997-2000-173780834260813/AnsiballZ_systemd_service.py'
Sep 30 17:57:38 compute-1 sudo[225138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:39 compute-1 ceph-mon[75484]: pgmap v556: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:39 compute-1 python3.9[225140]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:57:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:39 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:39 compute-1 sudo[225138]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:39 compute-1 podman[225142]: 2025-09-30 17:57:39.296782866 +0000 UTC m=+0.083912039 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 17:57:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:39 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:39 compute-1 sudo[225309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abwzmjlcblukjctorkfoeqjgpxwbvqpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255059.361266-2000-259058264480576/AnsiballZ_systemd_service.py'
Sep 30 17:57:39 compute-1 sudo[225309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:39 compute-1 python3.9[225311]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:57:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:57:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:39.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:57:40 compute-1 sudo[225309]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:40 compute-1 ceph-mon[75484]: pgmap v557: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:57:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:40.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:40 compute-1 sudo[225465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgvbdylmturvfdsfwwqffluawhvbgmos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255060.1767912-2000-186342794606225/AnsiballZ_systemd_service.py'
Sep 30 17:57:40 compute-1 sudo[225465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:40 compute-1 python3.9[225467]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:57:40 compute-1 sudo[225465]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:41 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:41 compute-1 sudo[225619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hacpjhgjawdigiuthukxmjorthsxfynp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255061.069117-2000-200434537618803/AnsiballZ_systemd_service.py'
Sep 30 17:57:41 compute-1 sudo[225619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:41 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:41 compute-1 sshd-session[225444]: Invalid user reelforge from 14.103.129.43 port 41122
Sep 30 17:57:41 compute-1 sshd-session[225444]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:57:41 compute-1 sshd-session[225444]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.103.129.43
Sep 30 17:57:41 compute-1 python3.9[225621]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:57:41 compute-1 sudo[225619]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:57:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:41.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:57:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:42.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:42 compute-1 sudo[225773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flllwudqunmwhkexphusccrtzankdteq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255061.9271657-2000-129282302194796/AnsiballZ_systemd_service.py'
Sep 30 17:57:42 compute-1 sudo[225773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:42 compute-1 python3.9[225775]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:57:42 compute-1 sudo[225773]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:42 compute-1 ceph-mon[75484]: pgmap v558: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:57:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:43 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:43 compute-1 sudo[225927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqccpnguxawgwvciikjtxqilucmxcaxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255062.79235-2000-193267949365432/AnsiballZ_systemd_service.py'
Sep 30 17:57:43 compute-1 sudo[225927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:43 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:43 compute-1 python3.9[225929]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:57:43 compute-1 sudo[225927]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:43 compute-1 sshd-session[225444]: Failed password for invalid user reelforge from 14.103.129.43 port 41122 ssh2
Sep 30 17:57:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:43.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:44.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:44 compute-1 sudo[226083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilxemujayiapeaklvqrejtzekqvivypb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255063.7866578-2000-271868471058685/AnsiballZ_systemd_service.py'
Sep 30 17:57:44 compute-1 sudo[226083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:44 compute-1 python3.9[226085]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:57:44 compute-1 sudo[226083]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:44 compute-1 sshd-session[225930]: Invalid user github from 216.10.242.161 port 53188
Sep 30 17:57:44 compute-1 sshd-session[225930]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:57:44 compute-1 sshd-session[225930]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 17:57:44 compute-1 ceph-mon[75484]: pgmap v559: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:44 compute-1 sudo[226112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:57:44 compute-1 sudo[226112]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:57:44 compute-1 sudo[226112]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:45 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:45 compute-1 sudo[226262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eajugrhwpenwkrlmjvvjvjlynspotyol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255064.93727-2118-45729203378853/AnsiballZ_file.py'
Sep 30 17:57:45 compute-1 sudo[226262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:45 compute-1 python3.9[226264]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:45 compute-1 sudo[226262]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:45 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:45 compute-1 sshd-session[225930]: Failed password for invalid user github from 216.10.242.161 port 53188 ssh2
Sep 30 17:57:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:45.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:46.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:46 compute-1 sudo[226415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apzzxntspuxglhrxjvcymyjtfteuuagg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255065.7254887-2118-271379942985730/AnsiballZ_file.py'
Sep 30 17:57:46 compute-1 sudo[226415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:46 compute-1 python3.9[226417]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:46 compute-1 sudo[226415]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:46 compute-1 ceph-mon[75484]: pgmap v560: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:46 compute-1 sshd-session[225930]: Received disconnect from 216.10.242.161 port 53188:11: Bye Bye [preauth]
Sep 30 17:57:46 compute-1 sshd-session[225930]: Disconnected from invalid user github 216.10.242.161 port 53188 [preauth]
Sep 30 17:57:46 compute-1 sudo[226568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwplnlfhkpmbxkkrnfontzjtdtlwcidl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255066.4830391-2118-77316118682548/AnsiballZ_file.py'
Sep 30 17:57:46 compute-1 sudo[226568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:47 compute-1 python3.9[226570]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:47 compute-1 sudo[226568]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:47 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:47 compute-1 sudo[226722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvqtdireghfixepygxocrwrdgszehhml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255067.1516457-2118-5770678690296/AnsiballZ_file.py'
Sep 30 17:57:47 compute-1 sudo[226722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:47 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:47 compute-1 python3.9[226724]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:47 compute-1 sudo[226722]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:47.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:48.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:57:48 compute-1 sudo[226875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unrtmsejycjamavbziyvlypeximobgzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255067.7705781-2118-214891539117211/AnsiballZ_file.py'
Sep 30 17:57:48 compute-1 sudo[226875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:48 compute-1 sshd-session[226571]: Invalid user k8s from 14.225.167.110 port 60454
Sep 30 17:57:48 compute-1 sshd-session[226571]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:57:48 compute-1 sshd-session[226571]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 17:57:48 compute-1 python3.9[226877]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:48 compute-1 sudo[226875]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:48 compute-1 ceph-mon[75484]: pgmap v561: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:48 compute-1 sudo[227028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvbfbdfwjhoqyakmxokpdndoosuqruim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255068.5484116-2118-105140675023321/AnsiballZ_file.py'
Sep 30 17:57:48 compute-1 sudo[227028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:49 compute-1 python3.9[227030]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:49 compute-1 sudo[227028]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:49 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:49 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:49 compute-1 sudo[227183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unxkjbwewpxccovzcqdiynmurvwmuynf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255069.2666492-2118-44843206023540/AnsiballZ_file.py'
Sep 30 17:57:49 compute-1 sudo[227183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:49 compute-1 podman[227185]: 2025-09-30 17:57:49.805180281 +0000 UTC m=+0.136909906 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 17:57:49 compute-1 python3.9[227186]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:49 compute-1 sudo[227183]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:49.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:50.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:50 compute-1 sudo[227364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eszvnmxktomargbikjckxhffgapwnfoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255070.0797813-2118-261389169462820/AnsiballZ_file.py'
Sep 30 17:57:50 compute-1 sudo[227364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:50 compute-1 python3.9[227366]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:50 compute-1 sudo[227364]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:50 compute-1 ceph-mon[75484]: pgmap v562: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:57:50 compute-1 unix_chkpwd[227415]: password check failed for user (root)
Sep 30 17:57:50 compute-1 sshd-session[227157]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:57:51 compute-1 sshd-session[226571]: Failed password for invalid user k8s from 14.225.167.110 port 60454 ssh2
Sep 30 17:57:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:51 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:51 compute-1 sudo[227518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxwcyvonxoqrvpxfsjmfyaztyahxdrgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255070.8753147-2232-137968847331678/AnsiballZ_file.py'
Sep 30 17:57:51 compute-1 sudo[227518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:51 compute-1 python3.9[227520]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:51 compute-1 sudo[227518]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:51 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:51 compute-1 sudo[227670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxdedzxqhjlvpqtjzrtifshgowxwegny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255071.6087716-2232-31425998825919/AnsiballZ_file.py'
Sep 30 17:57:51 compute-1 sudo[227670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:52.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:52.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:52 compute-1 python3.9[227672]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:52 compute-1 sudo[227670]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:52 compute-1 sudo[227823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqbngbykwqfyvhbfyeqdykwfjiatpuve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255072.2718563-2232-219513317020747/AnsiballZ_file.py'
Sep 30 17:57:52 compute-1 sudo[227823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:52 compute-1 sshd-session[227157]: Failed password for root from 192.210.160.141 port 54268 ssh2
Sep 30 17:57:52 compute-1 ceph-mon[75484]: pgmap v563: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:57:52 compute-1 python3.9[227825]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:52 compute-1 sudo[227823]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:57:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:53 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:53 compute-1 sudo[227976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euaspqudaqlzitonbbodismvafheojxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255072.9638793-2232-253767817809081/AnsiballZ_file.py'
Sep 30 17:57:53 compute-1 sudo[227976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:53 compute-1 python3.9[227978]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:53 compute-1 sudo[227976]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:53 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:53 compute-1 sshd-session[226571]: Received disconnect from 14.225.167.110 port 60454:11: Bye Bye [preauth]
Sep 30 17:57:53 compute-1 sshd-session[226571]: Disconnected from invalid user k8s 14.225.167.110 port 60454 [preauth]
Sep 30 17:57:53 compute-1 sudo[228128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbwuvadwtkowyovvlgjxcblgpekslpov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255073.6328158-2232-151720778170440/AnsiballZ_file.py'
Sep 30 17:57:53 compute-1 sudo[228128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:54.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:57:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:54.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:57:54 compute-1 python3.9[228130]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:54 compute-1 sshd-session[227157]: Connection closed by authenticating user root 192.210.160.141 port 54268 [preauth]
Sep 30 17:57:54 compute-1 sudo[228128]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:57:54.300 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 17:57:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:57:54.301 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 17:57:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:57:54.301 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 17:57:54 compute-1 unix_chkpwd[228195]: password check failed for user (root)
Sep 30 17:57:54 compute-1 sshd-session[228132]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104  user=root
Sep 30 17:57:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:54 compute-1 sudo[228286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twjztdfocsaxnvilsehlduuwddxiawve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255074.3338833-2232-274986529428052/AnsiballZ_file.py'
Sep 30 17:57:54 compute-1 sudo[228286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:54 compute-1 ceph-mon[75484]: pgmap v564: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:54 compute-1 python3.9[228288]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:54 compute-1 sudo[228286]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:55 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:55 compute-1 sudo[228438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aquqjfkihhqukxzypvnegzwzelhyzctt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255075.022648-2232-87573992173289/AnsiballZ_file.py'
Sep 30 17:57:55 compute-1 sudo[228438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:55 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:55 compute-1 python3.9[228440]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:55 compute-1 sudo[228438]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:57:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:56.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:57:56 compute-1 sudo[228591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbmjaujhkbsqqiipnpnxqppgxxqlseeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255075.7394655-2232-202315399633587/AnsiballZ_file.py'
Sep 30 17:57:56 compute-1 sudo[228591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:56.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:56 compute-1 python3.9[228593]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:57:56 compute-1 sudo[228591]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:56 compute-1 sshd-session[228132]: Failed password for root from 107.172.146.104 port 54726 ssh2
Sep 30 17:57:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:56 compute-1 ceph-mon[75484]: pgmap v565: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:56 compute-1 sudo[228744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzbimgovusphfquvhujwnymslrwjlark ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255076.5682428-2348-37010551859217/AnsiballZ_command.py'
Sep 30 17:57:56 compute-1 sudo[228744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:57 compute-1 sshd-session[228132]: Received disconnect from 107.172.146.104 port 54726:11: Bye Bye [preauth]
Sep 30 17:57:57 compute-1 sshd-session[228132]: Disconnected from authenticating user root 107.172.146.104 port 54726 [preauth]
Sep 30 17:57:57 compute-1 python3.9[228746]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:57:57 compute-1 sudo[228744]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:57 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:57 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:57:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:57:58.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:57:58 compute-1 python3.9[228898]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 17:57:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:57:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:57:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:57:58.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:57:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:57:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:58 compute-1 sudo[229050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixyflphqeofpeczdjsvwwghjgpseswjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255078.3385541-2384-240634103446904/AnsiballZ_systemd_service.py'
Sep 30 17:57:58 compute-1 sudo[229050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:57:58 compute-1 ceph-mon[75484]: pgmap v566: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:57:59 compute-1 python3.9[229052]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 17:57:59 compute-1 systemd[1]: Reloading.
Sep 30 17:57:59 compute-1 systemd-sysv-generator[229082]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:57:59 compute-1 systemd-rc-local-generator[229078]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:57:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:59 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e00019e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:59 compute-1 sudo[229050]: pam_unix(sudo:session): session closed for user root
Sep 30 17:57:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:57:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:57:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:57:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:57:59 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e00019e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:57:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 17:57:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 3744 writes, 19K keys, 3744 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.04 MB/s
                                           Cumulative WAL: 3744 writes, 3744 syncs, 1.00 writes per sync, written: 0.05 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1397 writes, 6789 keys, 1397 commit groups, 1.0 writes per commit group, ingest: 15.49 MB, 0.03 MB/s
                                           Interval WAL: 1397 writes, 1397 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    146.0      0.17              0.09         9    0.019       0      0       0.0       0.0
                                             L6      1/0   10.04 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    162.4    139.0      0.61              0.28         8    0.076     36K   4137       0.0       0.0
                                            Sum      1/0   10.04 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5    127.6    140.5      0.78              0.36        17    0.046     36K   4137       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   6.5    137.3    135.5      0.35              0.19         8    0.044     20K   2381       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    162.4    139.0      0.61              0.28         8    0.076     36K   4137       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    148.0      0.16              0.09         8    0.021       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.024, interval 0.007
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.11 GB write, 0.09 MB/s write, 0.10 GB read, 0.08 MB/s read, 0.8 seconds
                                           Interval compaction: 0.05 GB write, 0.08 MB/s write, 0.05 GB read, 0.08 MB/s read, 0.4 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55f2aa20b350#2 capacity: 304.00 MB usage: 5.58 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000105 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(325,5.27 MB,1.73389%) FilterBlock(17,105.98 KB,0.0340462%) IndexBlock(17,206.86 KB,0.066451%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Sep 30 17:57:59 compute-1 sudo[229238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okfbyihaicqjhjuyrbqgvnklirkbwigi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255079.6155744-2400-49410545571207/AnsiballZ_command.py'
Sep 30 17:57:59 compute-1 sudo[229238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:00.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:00.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:00 compute-1 podman[229241]: 2025-09-30 17:58:00.128016801 +0000 UTC m=+0.116430285 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Sep 30 17:58:00 compute-1 python3.9[229242]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:58:00 compute-1 sudo[229238]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:00 compute-1 sudo[229415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpgahvbtkiytljgrqulvkptrolycrznb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255080.3767498-2400-126024323566255/AnsiballZ_command.py'
Sep 30 17:58:00 compute-1 sudo[229415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:00 compute-1 ceph-mon[75484]: pgmap v567: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:58:00 compute-1 python3.9[229417]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:58:00 compute-1 sudo[229415]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:01 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:01 compute-1 sudo[229568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulfskcknbzxlqrclckxmpaivbgxyoabg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255081.1375537-2400-256089062857277/AnsiballZ_command.py'
Sep 30 17:58:01 compute-1 sudo[229568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:01 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:01 compute-1 python3.9[229570]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:58:01 compute-1 sudo[229568]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:02.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:02.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:02 compute-1 sudo[229722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuvfuwmrrurrtxdmaauoigqeappejbdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255081.880706-2400-34303317514248/AnsiballZ_command.py'
Sep 30 17:58:02 compute-1 sudo[229722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:02 compute-1 python3.9[229724]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:58:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:02 compute-1 ceph-mon[75484]: pgmap v568: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:58:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:58:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:03 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e00019e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:03 compute-1 sudo[229722]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:03 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:03 compute-1 sudo[229877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glokqgazxfnpntbgdoqizrfedxqrhgam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255083.6408787-2400-251326029569015/AnsiballZ_command.py'
Sep 30 17:58:03 compute-1 sudo[229877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:04.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:04.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:04 compute-1 python3.9[229879]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:58:04 compute-1 sudo[229877]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:04 compute-1 sudo[230031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btzieoutcqzeziofymijasyiqvfygpxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255084.4185643-2400-114285406888187/AnsiballZ_command.py'
Sep 30 17:58:04 compute-1 sudo[230031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:04 compute-1 sudo[230032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:58:04 compute-1 sudo[230032]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:58:04 compute-1 sudo[230032]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:04 compute-1 python3.9[230039]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:58:05 compute-1 sudo[230031]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:05 compute-1 ceph-mon[75484]: pgmap v569: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:58:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:05 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:05 compute-1 sudo[230209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isiprckhumjjjnizwyctfezmlvsxdgjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255085.178885-2400-233766641113743/AnsiballZ_command.py'
Sep 30 17:58:05 compute-1 sudo[230209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:05 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:05 compute-1 python3.9[230211]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:58:05 compute-1 sudo[230209]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:58:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:06.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:58:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:06.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:06 compute-1 sudo[230363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvcuykxtlibumrqslgudybkkbpputgfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255085.9682255-2400-39029018158057/AnsiballZ_command.py'
Sep 30 17:58:06 compute-1 sudo[230363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:06 compute-1 python3.9[230365]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:58:06 compute-1 sudo[230363]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:07 compute-1 ceph-mon[75484]: pgmap v570: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:58:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Sep 30 17:58:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:07 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e00019e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:07 compute-1 podman[230392]: 2025-09-30 17:58:07.556338995 +0000 UTC m=+0.095297095 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 17:58:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:07 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:07 compute-1 sudo[230537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viehacdzpkxjzscnvinbkbjxnthmrode ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255087.5821848-2543-79983220914282/AnsiballZ_file.py'
Sep 30 17:58:07 compute-1 sudo[230537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:58:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:08.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:08.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:58:08 compute-1 python3.9[230540]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:08 compute-1 sudo[230537]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:08 compute-1 sudo[230691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbhkolzwspdjclkcsirutdyewewtztmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255088.4122624-2543-74553148009234/AnsiballZ_file.py'
Sep 30 17:58:08 compute-1 sudo[230691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:08 compute-1 python3.9[230693]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:09 compute-1 sudo[230691]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:09 compute-1 ceph-mon[75484]: pgmap v571: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:58:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:09 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:09 compute-1 sudo[230770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:58:09 compute-1 sudo[230770]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:58:09 compute-1 sudo[230770]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:09 compute-1 sudo[230838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Sep 30 17:58:09 compute-1 sudo[230838]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:58:09 compute-1 podman[230820]: 2025-09-30 17:58:09.517591015 +0000 UTC m=+0.103759704 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 17:58:09 compute-1 sudo[230915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgnjjitclmwdlpfjzffvkengfsizrhvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255089.1883764-2543-65263024684647/AnsiballZ_file.py'
Sep 30 17:58:09 compute-1 sudo[230915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:09 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:09 compute-1 python3.9[230917]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:09 compute-1 sudo[230915]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:09 compute-1 sudo[230838]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:10 compute-1 sudo[230962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:58:10 compute-1 sudo[230962]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:58:10 compute-1 sudo[230962]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:10.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:10.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:10 compute-1 sudo[231019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:58:10 compute-1 sudo[231019]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:58:10 compute-1 sudo[231145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehkkttxhnvbbbezlskugxkyybkjunohe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255090.0016613-2587-212830999941837/AnsiballZ_file.py'
Sep 30 17:58:10 compute-1 sudo[231145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:10 compute-1 sshd-session[230779]: Invalid user elk from 175.126.165.170 port 38824
Sep 30 17:58:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:10 compute-1 sshd-session[230779]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:58:10 compute-1 sshd-session[230779]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 17:58:10 compute-1 python3.9[231154]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:10 compute-1 sudo[231145]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:10 compute-1 ceph-mon[75484]: pgmap v572: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:58:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:58:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:58:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:58:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:58:10 compute-1 sudo[231019]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:11 compute-1 sudo[231324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzvhfqgvhnvnxptaxgvkfysngtuopopv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255090.8215442-2587-117514093226520/AnsiballZ_file.py'
Sep 30 17:58:11 compute-1 sudo[231324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:11 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e00019e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:11 compute-1 python3.9[231326]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:11 compute-1 sudo[231324]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:11 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:58:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:58:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:58:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:58:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:58:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:58:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:58:11 compute-1 sudo[231476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piepfjlilhywczntteydwekrmolgfrde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255091.5762239-2587-126288361890534/AnsiballZ_file.py'
Sep 30 17:58:11 compute-1 sudo[231476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:12.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:12.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:12 compute-1 python3.9[231478]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:12 compute-1 sudo[231476]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:12 compute-1 sshd-session[230779]: Failed password for invalid user elk from 175.126.165.170 port 38824 ssh2
Sep 30 17:58:12 compute-1 sudo[231630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyczdtmiyfonckdcshlrcsavkbfahqni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255092.3107681-2587-264674755996869/AnsiballZ_file.py'
Sep 30 17:58:12 compute-1 sudo[231630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:12 compute-1 ceph-mon[75484]: pgmap v573: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:58:12 compute-1 python3.9[231632]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:12 compute-1 sudo[231630]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:58:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:13 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:13 compute-1 sudo[231784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqgyqaykmythyrcjaeqspnvbyxbyjclk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255093.1702325-2587-256357164118213/AnsiballZ_file.py'
Sep 30 17:58:13 compute-1 sudo[231784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:13 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:13 compute-1 python3.9[231786]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:13 compute-1 sudo[231784]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:14.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:14.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:14 compute-1 unix_chkpwd[231927]: password check failed for user (root)
Sep 30 17:58:14 compute-1 sshd-session[231633]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:58:14 compute-1 sudo[231938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciaanhbnhyijcafxbzsuwqenbamnacqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255093.967266-2587-252110763245443/AnsiballZ_file.py'
Sep 30 17:58:14 compute-1 sudo[231938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:14 compute-1 python3.9[231940]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:14 compute-1 sudo[231938]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:14 compute-1 sshd-session[230779]: Received disconnect from 175.126.165.170 port 38824:11: Bye Bye [preauth]
Sep 30 17:58:14 compute-1 sshd-session[230779]: Disconnected from invalid user elk 175.126.165.170 port 38824 [preauth]
Sep 30 17:58:15 compute-1 sudo[232091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnoedppxlamkfoinqsrynjdyuwduzubh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255094.7143877-2587-99229702521174/AnsiballZ_file.py'
Sep 30 17:58:15 compute-1 sudo[232091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:15 compute-1 ceph-mon[75484]: pgmap v574: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:58:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:15 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:15 compute-1 python3.9[232093]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:15 compute-1 sudo[232091]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:15 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:15 compute-1 sudo[232243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldrmdqxdjllzoghmrfikkyhcnenzllcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255095.4598181-2587-255936558669028/AnsiballZ_file.py'
Sep 30 17:58:15 compute-1 sudo[232243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:15 compute-1 sshd-session[231633]: Failed password for root from 192.210.160.141 port 47880 ssh2
Sep 30 17:58:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:16.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:16 compute-1 python3.9[232245]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:16.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:16 compute-1 sudo[232243]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:16 compute-1 ceph-mon[75484]: pgmap v575: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:58:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:16 compute-1 sudo[232396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agkygajpyunmuuehnkrgaepbtrmpguyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255096.2803686-2587-74037822734988/AnsiballZ_file.py'
Sep 30 17:58:16 compute-1 sudo[232396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:16 compute-1 python3.9[232399]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:16 compute-1 sudo[232396]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:17 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:17 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:17 compute-1 sshd-session[231633]: Connection closed by authenticating user root 192.210.160.141 port 47880 [preauth]
Sep 30 17:58:17 compute-1 sudo[232424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:58:17 compute-1 sudo[232424]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:58:17 compute-1 sudo[232424]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:58:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:18.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:58:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:18.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:58:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:18 compute-1 ceph-mon[75484]: pgmap v576: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:58:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:58:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:58:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:19 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:19 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:20.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:58:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:20.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:58:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175820 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 17:58:20 compute-1 podman[232452]: 2025-09-30 17:58:20.620924239 +0000 UTC m=+0.148697322 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 17:58:20 compute-1 ceph-mon[75484]: pgmap v577: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 17:58:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:21 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:21 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:22.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:58:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:22.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:58:22 compute-1 sudo[232608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhpcpbyvvcxpfqubudzoinjrwdaunvri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255101.8688445-2872-46810722004925/AnsiballZ_getent.py'
Sep 30 17:58:22 compute-1 sudo[232608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:22 compute-1 python3.9[232610]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Sep 30 17:58:22 compute-1 sudo[232608]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:22 compute-1 ceph-mon[75484]: pgmap v578: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:58:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:58:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 17:58:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 8144 writes, 32K keys, 8144 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 8144 writes, 1707 syncs, 4.77 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 699 writes, 1176 keys, 699 commit groups, 1.0 writes per commit group, ingest: 0.45 MB, 0.00 MB/s
                                           Interval WAL: 699 writes, 343 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f315989b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f315989b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f315989b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x556f31599350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Sep 30 17:58:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:58:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:23 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:23 compute-1 sudo[232762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hojhlmykafafoetwufzxpltpucwvmysz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255102.78823-2888-63513085297366/AnsiballZ_group.py'
Sep 30 17:58:23 compute-1 sudo[232762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:23 compute-1 python3.9[232764]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 17:58:23 compute-1 groupadd[232765]: group added to /etc/group: name=nova, GID=42436
Sep 30 17:58:23 compute-1 groupadd[232765]: group added to /etc/gshadow: name=nova
Sep 30 17:58:23 compute-1 groupadd[232765]: new group: name=nova, GID=42436
Sep 30 17:58:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:23 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:23 compute-1 sudo[232762]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:58:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:24.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:58:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:24.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:24 compute-1 sudo[232921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzuswbdrzxpbfzdbpwitcvxkmtfkijet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255103.83393-2904-269976549357302/AnsiballZ_user.py'
Sep 30 17:58:24 compute-1 sudo[232921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:24 compute-1 python3.9[232923]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 17:58:24 compute-1 useradd[232926]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Sep 30 17:58:24 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 17:58:24 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 17:58:24 compute-1 useradd[232926]: add 'nova' to group 'libvirt'
Sep 30 17:58:24 compute-1 useradd[232926]: add 'nova' to shadow group 'libvirt'
Sep 30 17:58:24 compute-1 ceph-mon[75484]: pgmap v579: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:58:24 compute-1 sudo[232921]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:24 compute-1 sudo[232934]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:58:24 compute-1 sudo[232934]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:58:24 compute-1 sudo[232934]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:25 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:25 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:25 compute-1 sshd-session[232983]: Accepted publickey for zuul from 192.168.122.30 port 57442 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:58:25 compute-1 systemd-logind[789]: New session 56 of user zuul.
Sep 30 17:58:25 compute-1 systemd[1]: Started Session 56 of User zuul.
Sep 30 17:58:25 compute-1 sshd-session[232983]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:58:25 compute-1 sshd-session[232986]: Received disconnect from 192.168.122.30 port 57442:11: disconnected by user
Sep 30 17:58:25 compute-1 sshd-session[232986]: Disconnected from user zuul 192.168.122.30 port 57442
Sep 30 17:58:25 compute-1 sshd-session[232983]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:58:25 compute-1 systemd[1]: session-56.scope: Deactivated successfully.
Sep 30 17:58:25 compute-1 systemd-logind[789]: Session 56 logged out. Waiting for processes to exit.
Sep 30 17:58:25 compute-1 systemd-logind[789]: Removed session 56.
Sep 30 17:58:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:58:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:26.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:58:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:26.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:26 compute-1 python3.9[233137]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:58:26 compute-1 ceph-mon[75484]: pgmap v580: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:58:27 compute-1 python3.9[233259]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759255106.1084342-2954-79245966067066/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:27 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:27 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:27 compute-1 python3.9[233411]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:58:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:28.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:28.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:28 compute-1 sshd-session[233277]: Invalid user sp from 84.51.43.58 port 61262
Sep 30 17:58:28 compute-1 sshd-session[233277]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:58:28 compute-1 sshd-session[233277]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 17:58:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:58:28 compute-1 python3.9[233488]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:28 compute-1 ceph-mon[75484]: pgmap v581: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 17:58:29 compute-1 python3.9[233641]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:58:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:29 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 17:58:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:29 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:29 compute-1 sshd-session[233489]: Invalid user titu from 194.107.115.65 port 44554
Sep 30 17:58:29 compute-1 sshd-session[233489]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:58:29 compute-1 sshd-session[233489]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 17:58:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:29 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:29 compute-1 python3.9[233762]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759255108.5706408-2954-186770673371831/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:30.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:30 compute-1 sshd-session[233277]: Failed password for invalid user sp from 84.51.43.58 port 61262 ssh2
Sep 30 17:58:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:30.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:30 compute-1 podman[233887]: 2025-09-30 17:58:30.385606413 +0000 UTC m=+0.089475695 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 17:58:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:30 compute-1 python3.9[233925]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:58:30 compute-1 ceph-mon[75484]: pgmap v582: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:58:30 compute-1 sshd-session[233277]: Received disconnect from 84.51.43.58 port 61262:11: Bye Bye [preauth]
Sep 30 17:58:30 compute-1 sshd-session[233277]: Disconnected from invalid user sp 84.51.43.58 port 61262 [preauth]
Sep 30 17:58:31 compute-1 python3.9[234054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759255109.9703228-2954-31256800289668/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:31 compute-1 sshd-session[233489]: Failed password for invalid user titu from 194.107.115.65 port 44554 ssh2
Sep 30 17:58:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:31 compute-1 python3.9[234206]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:58:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:32.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:58:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:32.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:58:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:32 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 17:58:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:32 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 17:58:32 compute-1 sshd-session[233489]: Received disconnect from 194.107.115.65 port 44554:11: Bye Bye [preauth]
Sep 30 17:58:32 compute-1 sshd-session[233489]: Disconnected from invalid user titu 194.107.115.65 port 44554 [preauth]
Sep 30 17:58:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:32 compute-1 python3.9[234328]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759255111.3189526-2954-163015368862735/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:32 compute-1 ceph-mon[75484]: pgmap v583: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:58:33 compute-1 sudo[234479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceqmhmhnydjacgppnjsrbwownehzrtub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255112.7248383-3092-43639646109853/AnsiballZ_file.py'
Sep 30 17:58:33 compute-1 sudo[234479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:58:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:33 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:33 compute-1 python3.9[234481]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:58:33 compute-1 sudo[234479]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:33 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:33 compute-1 sudo[234631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqosjzyomrsipzfimtuimywbqudsqzbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255113.5359368-3108-148766254324789/AnsiballZ_copy.py'
Sep 30 17:58:33 compute-1 sudo[234631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:34.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:34.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:34 compute-1 python3.9[234633]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:58:34 compute-1 sudo[234631]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:34 compute-1 sudo[234785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owwedfhynemevokwacscxnfvetywrxbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255114.3955193-3125-273928140462397/AnsiballZ_stat.py'
Sep 30 17:58:34 compute-1 sudo[234785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:34 compute-1 ceph-mon[75484]: pgmap v584: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:58:34 compute-1 python3.9[234787]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:58:34 compute-1 sudo[234785]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:35 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 17:58:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:35 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:35 compute-1 sudo[234938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rutkivvfrtcuesktbttymagewfwzqixl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255115.1362767-3140-108386146693678/AnsiballZ_stat.py'
Sep 30 17:58:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:35 compute-1 sudo[234938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:35 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:35 compute-1 python3.9[234940]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:58:35 compute-1 sudo[234938]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:36.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:36.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:36 compute-1 sudo[235063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zysthxelzdiykkxrjcwsurrvgkbtxndx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255115.1362767-3140-108386146693678/AnsiballZ_copy.py'
Sep 30 17:58:36 compute-1 sudo[235063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:36 compute-1 python3.9[235065]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759255115.1362767-3140-108386146693678/.source _original_basename=.lgjgg1s9 follow=False checksum=fe08feae0672a40acefd238f345d9f9c5d39f11c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Sep 30 17:58:36 compute-1 sudo[235063]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:36 compute-1 unix_chkpwd[235092]: password check failed for user (root)
Sep 30 17:58:36 compute-1 sshd-session[234811]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:58:36 compute-1 ceph-mon[75484]: pgmap v585: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:58:37 compute-1 python3.9[235219]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:58:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:37 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:37 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:37 compute-1 podman[235345]: 2025-09-30 17:58:37.736672068 +0000 UTC m=+0.083935037 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 17:58:37 compute-1 python3.9[235382]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:58:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:58:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:38.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:38.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:58:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:38 compute-1 python3.9[235514]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759255117.3645463-3192-84512801511053/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=b27833676660fa98c54003ae3ee408ee2eef3f6a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:38 compute-1 sshd-session[234811]: Failed password for root from 192.210.160.141 port 46680 ssh2
Sep 30 17:58:38 compute-1 ceph-mon[75484]: pgmap v586: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:58:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:39 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:39 compute-1 python3.9[235665]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:58:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:39 compute-1 sshd-session[234811]: Connection closed by authenticating user root 192.210.160.141 port 46680 [preauth]
Sep 30 17:58:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:39 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:39 compute-1 podman[235762]: 2025-09-30 17:58:39.789126117 +0000 UTC m=+0.094597894 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 17:58:39 compute-1 python3.9[235801]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759255118.7223258-3223-190810027581709/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=0d7080b27a2b16032bc39b6298de5bdc4fff259e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:58:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:40.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:40.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/175840 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 17:58:40 compute-1 sudo[235961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weulyughkfaaossrgdmzhksovcgmspwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255120.385046-3256-168294320941257/AnsiballZ_container_config_data.py'
Sep 30 17:58:40 compute-1 sudo[235961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:40 compute-1 python3.9[235963]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Sep 30 17:58:41 compute-1 sudo[235961]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:41 compute-1 ceph-mon[75484]: pgmap v587: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.7 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 17:58:41 compute-1 unix_chkpwd[235976]: password check failed for user (root)
Sep 30 17:58:41 compute-1 sshd-session[235748]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105  user=root
Sep 30 17:58:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:41 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:41 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:41 compute-1 sudo[236114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqjybuhtxlneqiqxwayipmohywfvfdxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255121.3471467-3274-221793349314322/AnsiballZ_container_config_hash.py'
Sep 30 17:58:41 compute-1 sudo[236114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:41 compute-1 python3.9[236116]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 17:58:41 compute-1 sudo[236114]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:42.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:58:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:42.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:58:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:42 compute-1 sudo[236268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgrxxvnfykchvjepiqptolcsptbklbbv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759255122.323727-3294-259551444978746/AnsiballZ_edpm_container_manage.py'
Sep 30 17:58:42 compute-1 sudo[236268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:42 compute-1 sshd-session[235748]: Failed password for root from 103.153.190.105 port 49991 ssh2
Sep 30 17:58:42 compute-1 python3[236270]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 17:58:43 compute-1 ceph-mon[75484]: pgmap v588: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:58:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:58:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:43 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:43 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:44 compute-1 sshd-session[235748]: Received disconnect from 103.153.190.105 port 49991:11: Bye Bye [preauth]
Sep 30 17:58:44 compute-1 sshd-session[235748]: Disconnected from authenticating user root 103.153.190.105 port 49991 [preauth]
Sep 30 17:58:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:58:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:44.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:58:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:44.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:45 compute-1 sudo[236314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:58:45 compute-1 ceph-mon[75484]: pgmap v589: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 17:58:45 compute-1 sudo[236314]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:58:45 compute-1 sudo[236314]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:45 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:45 compute-1 sshd-session[236309]: Invalid user vastbase from 216.10.242.161 port 54084
Sep 30 17:58:45 compute-1 sshd-session[236309]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:58:45 compute-1 sshd-session[236309]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 17:58:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:45 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:58:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:46.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:58:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:58:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:46.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:58:46 compute-1 ceph-mon[75484]: pgmap v590: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:58:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:47 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:47 compute-1 sshd-session[236309]: Failed password for invalid user vastbase from 216.10.242.161 port 54084 ssh2
Sep 30 17:58:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:47 compute-1 sshd-session[236309]: Received disconnect from 216.10.242.161 port 54084:11: Bye Bye [preauth]
Sep 30 17:58:47 compute-1 sshd-session[236309]: Disconnected from invalid user vastbase 216.10.242.161 port 54084 [preauth]
Sep 30 17:58:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:47 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:48.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:58:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:48.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:58:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:58:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:48 compute-1 ceph-mon[75484]: pgmap v591: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:58:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:49 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:49 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:50 compute-1 sshd-session[236378]: Invalid user iptv from 107.172.146.104 port 43292
Sep 30 17:58:50 compute-1 sshd-session[236378]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:58:50 compute-1 sshd-session[236378]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 17:58:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:58:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:50.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:58:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:50.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:51 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:51 compute-1 sshd-session[236378]: Failed password for invalid user iptv from 107.172.146.104 port 43292 ssh2
Sep 30 17:58:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:51 compute-1 ceph-mon[75484]: pgmap v592: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 17:58:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:51 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f8001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:52.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:52.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:52 compute-1 sshd-session[236378]: Received disconnect from 107.172.146.104 port 43292:11: Bye Bye [preauth]
Sep 30 17:58:52 compute-1 sshd-session[236378]: Disconnected from invalid user iptv 107.172.146.104 port 43292 [preauth]
Sep 30 17:58:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:53 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:53 compute-1 podman[236382]: 2025-09-30 17:58:53.269389129 +0000 UTC m=+1.819201651 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 17:58:53 compute-1 podman[236285]: 2025-09-30 17:58:53.293781434 +0000 UTC m=+10.219965044 image pull d136a586f9f7c346565dba6e8dc081bc2663ef9baa7df2145dd739dc20978132 38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Sep 30 17:58:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:58:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:53 compute-1 podman[236434]: 2025-09-30 17:58:53.534709779 +0000 UTC m=+0.078818429 container create e2a17203f91b0f1dea59f831d3d5248c781ab80afa0cc63f54a8cd4f2a22d172 (image=38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, config_id=edpm, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init)
Sep 30 17:58:53 compute-1 podman[236434]: 2025-09-30 17:58:53.494345714 +0000 UTC m=+0.038454414 image pull d136a586f9f7c346565dba6e8dc081bc2663ef9baa7df2145dd739dc20978132 38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Sep 30 17:58:53 compute-1 python3[236270]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Sep 30 17:58:53 compute-1 ceph-mon[75484]: pgmap v593: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:58:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:58:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:53 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:53 compute-1 sudo[236268]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:54.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:54 compute-1 sshd-session[236396]: Invalid user github from 14.225.167.110 port 57608
Sep 30 17:58:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:54.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:54 compute-1 sshd-session[236396]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:58:54 compute-1 sshd-session[236396]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 17:58:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:58:54.302 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 17:58:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:58:54.303 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 17:58:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:58:54.303 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 17:58:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:54 compute-1 sudo[236625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnzzawtuwfpipegpqrgyjwpjbtzjtyot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255134.4214907-3310-244447618721036/AnsiballZ_stat.py'
Sep 30 17:58:54 compute-1 sudo[236625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:54 compute-1 ceph-mon[75484]: pgmap v594: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:58:54 compute-1 python3.9[236627]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:58:55 compute-1 sudo[236625]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:55 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:55 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f8001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:55 compute-1 sudo[236779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbrmuwykvvmlbxzxqsesjurpfjxjfypn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255135.490812-3334-154104272973090/AnsiballZ_container_config_data.py'
Sep 30 17:58:55 compute-1 sudo[236779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:56 compute-1 python3.9[236781]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Sep 30 17:58:56 compute-1 sudo[236779]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:56.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:56.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:56 compute-1 sshd-session[236396]: Failed password for invalid user github from 14.225.167.110 port 57608 ssh2
Sep 30 17:58:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:56 compute-1 sudo[236933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqftickakerdmpsffmgtqanrskqcbbeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255136.3513134-3352-211506775436658/AnsiballZ_container_config_hash.py'
Sep 30 17:58:56 compute-1 sudo[236933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:56 compute-1 ceph-mon[75484]: pgmap v595: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:58:56 compute-1 python3.9[236935]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 17:58:56 compute-1 sudo[236933]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:57 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:57 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:57 compute-1 sudo[237085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfkuaqxkwjbjghaxswbggdirmnwcupmr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759255137.3178685-3372-200991896658307/AnsiballZ_edpm_container_manage.py'
Sep 30 17:58:57 compute-1 sudo[237085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:57 compute-1 python3[237087]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 17:58:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:58:58.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:58:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:58:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:58:58.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:58:58 compute-1 podman[237128]: 2025-09-30 17:58:58.245947969 +0000 UTC m=+0.088006016 container create 060e21a34883e64f02111b530dae012732c6d642ed59ab56e4411a9c26437ee3 (image=38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 17:58:58 compute-1 podman[237128]: 2025-09-30 17:58:58.202694886 +0000 UTC m=+0.044752993 image pull d136a586f9f7c346565dba6e8dc081bc2663ef9baa7df2145dd739dc20978132 38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Sep 30 17:58:58 compute-1 sshd-session[236396]: Received disconnect from 14.225.167.110 port 57608:11: Bye Bye [preauth]
Sep 30 17:58:58 compute-1 sshd-session[236396]: Disconnected from invalid user github 14.225.167.110 port 57608 [preauth]
Sep 30 17:58:58 compute-1 python3[237087]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest kolla_start
Sep 30 17:58:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:58:58 compute-1 sudo[237085]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:58 compute-1 ceph-mon[75484]: pgmap v596: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:58:59 compute-1 sudo[237317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sspflcytnmbthbxllibhbnbyjmumfwqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255138.8496358-3388-73438180719622/AnsiballZ_stat.py'
Sep 30 17:58:59 compute-1 sudo[237317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:58:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:59 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:59 compute-1 python3.9[237319]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:58:59 compute-1 sudo[237317]: pam_unix(sudo:session): session closed for user root
Sep 30 17:58:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:58:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:58:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:58:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:58:59 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80025f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:58:59 compute-1 sshd-session[237088]: Invalid user student from 192.210.160.141 port 40952
Sep 30 17:58:59 compute-1 sshd-session[237088]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:58:59 compute-1 sshd-session[237088]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 17:59:00 compute-1 sudo[237472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pueltgyhiofqiqhnenoewfiafttrhith ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255139.7239923-3406-40601336876355/AnsiballZ_file.py'
Sep 30 17:59:00 compute-1 sudo[237472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:00.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:00.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:00 compute-1 python3.9[237474]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:00 compute-1 sudo[237472]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:00 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:00 compute-1 podman[237527]: 2025-09-30 17:59:00.576130232 +0000 UTC m=+0.113797490 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 17:59:00 compute-1 sudo[237644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htiwpshnhscvlxywevuqnkhejjqbcigz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255140.3426757-3406-129112065211805/AnsiballZ_copy.py'
Sep 30 17:59:00 compute-1 sudo[237644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:00 compute-1 ceph-mon[75484]: pgmap v597: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:59:01 compute-1 python3.9[237646]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759255140.3426757-3406-129112065211805/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:01 compute-1 sudo[237644]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:01 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:01 compute-1 sudo[237720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiqvrvjhzzrulglxnqubjdysbfeuvxxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255140.3426757-3406-129112065211805/AnsiballZ_systemd.py'
Sep 30 17:59:01 compute-1 sudo[237720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:01 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:01 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:01 compute-1 python3.9[237722]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 17:59:01 compute-1 systemd[1]: Reloading.
Sep 30 17:59:01 compute-1 systemd-rc-local-generator[237748]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:59:01 compute-1 systemd-sysv-generator[237753]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:59:02 compute-1 sshd-session[237088]: Failed password for invalid user student from 192.210.160.141 port 40952 ssh2
Sep 30 17:59:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:02.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:02.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:02 compute-1 sudo[237720]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:02 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:02 compute-1 sudo[237832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urnlwhbjpstttrlcelmlubalwvgzuzty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255140.3426757-3406-129112065211805/AnsiballZ_systemd.py'
Sep 30 17:59:02 compute-1 sudo[237832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:02 compute-1 python3.9[237834]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:59:03 compute-1 ceph-mon[75484]: pgmap v598: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:03 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:59:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:03 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:03 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80025f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:03 compute-1 systemd[1]: Reloading.
Sep 30 17:59:04 compute-1 systemd-sysv-generator[237869]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:59:04 compute-1 systemd-rc-local-generator[237865]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:59:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:04.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:04 compute-1 ceph-mon[75484]: pgmap v599: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:04.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:04 compute-1 systemd[1]: Starting nova_compute container...
Sep 30 17:59:04 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:59:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623ab6aebda747cb89473f84c2c951867130649d95df8457277af5e8d2faab73/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 17:59:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623ab6aebda747cb89473f84c2c951867130649d95df8457277af5e8d2faab73/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Sep 30 17:59:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623ab6aebda747cb89473f84c2c951867130649d95df8457277af5e8d2faab73/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Sep 30 17:59:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623ab6aebda747cb89473f84c2c951867130649d95df8457277af5e8d2faab73/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 17:59:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623ab6aebda747cb89473f84c2c951867130649d95df8457277af5e8d2faab73/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 17:59:04 compute-1 podman[237876]: 2025-09-30 17:59:04.419283843 +0000 UTC m=+0.121968258 container init 060e21a34883e64f02111b530dae012732c6d642ed59ab56e4411a9c26437ee3 (image=38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=edpm, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Sep 30 17:59:04 compute-1 podman[237876]: 2025-09-30 17:59:04.429920449 +0000 UTC m=+0.132604824 container start 060e21a34883e64f02111b530dae012732c6d642ed59ab56e4411a9c26437ee3 (image=38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 17:59:04 compute-1 nova_compute[237891]: + sudo -E kolla_set_configs
Sep 30 17:59:04 compute-1 podman[237876]: nova_compute
Sep 30 17:59:04 compute-1 systemd[1]: Started nova_compute container.
Sep 30 17:59:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:04 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:04 compute-1 sudo[237832]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Validating config file
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Copying service configuration files
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Deleting /etc/nova/nova.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Deleting /etc/ceph
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Creating directory /etc/ceph
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /etc/ceph
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Writing out command to execute
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 17:59:04 compute-1 nova_compute[237891]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 17:59:04 compute-1 nova_compute[237891]: ++ cat /run_command
Sep 30 17:59:04 compute-1 nova_compute[237891]: + CMD=nova-compute
Sep 30 17:59:04 compute-1 nova_compute[237891]: + ARGS=
Sep 30 17:59:04 compute-1 nova_compute[237891]: + sudo kolla_copy_cacerts
Sep 30 17:59:04 compute-1 nova_compute[237891]: + [[ ! -n '' ]]
Sep 30 17:59:04 compute-1 nova_compute[237891]: + . kolla_extend_start
Sep 30 17:59:04 compute-1 nova_compute[237891]: + echo 'Running command: '\''nova-compute'\'''
Sep 30 17:59:04 compute-1 nova_compute[237891]: Running command: 'nova-compute'
Sep 30 17:59:04 compute-1 nova_compute[237891]: + umask 0022
Sep 30 17:59:04 compute-1 nova_compute[237891]: + exec nova-compute
Sep 30 17:59:04 compute-1 sshd-session[237088]: Connection closed by invalid user student 192.210.160.141 port 40952 [preauth]
Sep 30 17:59:05 compute-1 sudo[237965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:59:05 compute-1 sudo[237965]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:59:05 compute-1 sudo[237965]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:05 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:05 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:05 compute-1 python3.9[238078]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:59:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:05 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:06.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:06.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:06 compute-1 python3.9[238229]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:59:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:06 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:06 compute-1 ceph-mon[75484]: pgmap v600: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:06 compute-1 nova_compute[237891]: 2025-09-30 17:59:06.879 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 17:59:06 compute-1 nova_compute[237891]: 2025-09-30 17:59:06.879 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 17:59:06 compute-1 nova_compute[237891]: 2025-09-30 17:59:06.880 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 17:59:06 compute-1 nova_compute[237891]: 2025-09-30 17:59:06.880 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Sep 30 17:59:06 compute-1 nova_compute[237891]: 2025-09-30 17:59:06.990 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 17:59:07 compute-1 nova_compute[237891]: 2025-09-30 17:59:07.019 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 17:59:07 compute-1 nova_compute[237891]: 2025-09-30 17:59:07.064 2 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Sep 30 17:59:07 compute-1 nova_compute[237891]: 2025-09-30 17:59:07.066 2 WARNING oslo_config.cfg [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Sep 30 17:59:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:07 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:07 compute-1 python3.9[238383]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:59:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:07 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:07 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80025f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:59:08 compute-1 sudo[238547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deldkmjfteuxgjvnijwqwadfewhyzgss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255147.673718-3526-141907259784235/AnsiballZ_podman_container.py'
Sep 30 17:59:08 compute-1 sudo[238547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:08 compute-1 podman[238507]: 2025-09-30 17:59:08.040041569 +0000 UTC m=+0.065146832 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.075 2 INFO nova.virt.driver [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Sep 30 17:59:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:08.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:08.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.177 2 INFO nova.compute.provider_config [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Sep 30 17:59:08 compute-1 python3.9[238556]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 17:59:08 compute-1 sudo[238547]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:08 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 17:59:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:59:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:08 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.684 2 DEBUG oslo_concurrency.lockutils [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.685 2 DEBUG oslo_concurrency.lockutils [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.685 2 DEBUG oslo_concurrency.lockutils [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.686 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.686 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.687 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.687 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.688 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.688 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.688 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.688 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.689 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.689 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.689 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.690 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.690 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.690 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.691 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.691 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.691 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.692 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.692 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.692 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.693 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.693 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.693 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.693 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.694 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.694 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.694 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.695 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.695 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.695 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.696 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.696 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.696 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.697 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.697 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.697 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.698 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.698 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.698 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.698 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.699 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.699 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.699 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.700 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.700 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.700 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.701 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.701 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.701 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.701 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.702 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.702 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.702 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.703 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.703 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.703 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.704 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.704 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.704 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.705 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.705 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.705 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.705 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.706 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.706 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.706 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.707 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.707 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.707 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.707 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.708 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.708 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.708 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.709 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.709 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.709 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.710 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.710 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.711 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.711 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.711 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.712 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.712 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.712 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.712 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.713 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.713 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.713 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.714 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.714 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.715 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.715 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.715 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.716 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.716 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.716 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.717 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.717 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.717 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.718 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.718 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.718 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.719 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.719 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.719 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.720 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.720 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.720 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.720 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.721 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.721 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.721 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.721 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.722 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.722 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.722 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.723 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.723 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.724 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.724 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.724 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.725 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.725 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.725 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.725 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.726 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.726 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.726 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.726 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.727 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.727 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.727 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.727 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.727 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.727 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.728 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.728 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.728 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.728 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.728 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.728 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.729 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.729 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.729 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.729 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.729 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.730 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.730 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.730 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.730 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.731 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.731 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.731 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.731 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.732 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.732 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.732 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.732 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.733 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.733 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.733 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.733 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.733 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.733 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.734 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.734 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.734 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.734 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.734 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.734 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.735 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.735 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.735 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.735 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.735 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.736 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.736 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.736 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.736 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.736 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.736 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.737 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.737 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.737 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.737 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.737 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.738 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.738 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.738 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.738 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.738 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.738 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.739 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.739 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.739 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.739 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.740 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.740 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.740 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.740 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.740 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.740 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.741 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.741 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.741 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.741 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.741 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.742 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.742 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.742 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.742 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.742 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.743 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.743 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.743 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.743 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.743 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.743 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.744 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.744 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.744 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.744 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.744 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.744 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.745 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.745 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.745 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.745 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.745 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.746 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.746 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.746 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.746 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.746 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.746 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.747 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.747 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.747 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.747 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.747 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.747 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.748 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.748 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.748 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.748 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.748 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.749 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.749 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.749 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.749 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.751 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.751 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.751 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.752 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.752 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.752 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.752 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.752 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.753 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.753 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.753 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.753 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.753 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.753 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.754 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.754 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.754 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.754 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.754 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.754 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.755 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.755 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.755 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.755 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.755 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.755 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.756 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.756 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.756 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.756 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.756 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.756 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.757 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.757 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.757 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.757 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.757 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.757 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.758 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.758 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.758 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.758 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.758 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.758 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.759 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.759 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.759 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.759 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.759 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.760 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.760 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.760 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.760 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.760 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.760 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.760 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.761 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.761 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.761 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.761 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.761 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.761 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.762 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.762 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.762 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.762 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.762 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.762 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.763 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.763 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.763 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.763 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.763 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.763 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.763 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.763 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.764 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.764 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.764 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.764 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.764 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.764 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.764 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.764 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.764 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.765 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.765 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.765 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.765 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.765 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.765 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.765 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.765 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.766 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.766 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.766 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.766 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.766 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.766 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.766 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.766 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.767 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.767 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.767 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.767 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.767 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.767 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.767 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.768 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.768 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.768 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.768 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.768 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.768 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.768 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.768 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.768 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.769 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.769 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.769 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.769 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.769 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.769 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.769 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.769 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.770 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.770 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.770 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.770 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.770 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.770 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.771 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.771 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.771 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.771 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.771 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.771 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.771 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.772 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.772 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.772 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.772 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.772 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.772 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.772 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.772 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.772 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.773 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.773 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.773 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.773 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.773 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.773 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.773 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.773 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.774 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.774 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.774 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.774 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.774 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.774 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.774 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.774 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.775 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.775 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.775 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.775 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.775 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.775 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.775 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.775 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.775 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.776 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.776 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.776 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.776 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.776 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.776 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.776 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.776 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.776 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.777 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.777 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.777 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.777 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.777 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.777 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.777 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.777 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.778 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.778 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.778 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.778 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.778 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.778 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.778 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.779 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.779 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.779 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.779 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.779 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.779 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.780 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.780 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.780 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.780 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.780 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.780 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.780 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.780 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.780 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.781 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.781 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.781 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.781 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.781 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.781 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.781 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.782 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.782 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.782 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.782 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.782 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.782 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.783 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.783 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.783 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.783 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.783 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.783 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.783 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.783 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.784 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.784 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.784 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.784 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.784 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.784 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.784 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.784 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.785 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.785 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.785 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.785 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.785 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.785 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.785 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.785 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.786 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.786 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.786 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.786 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.786 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.786 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.786 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.786 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.786 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.787 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.787 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.787 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.787 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.787 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.787 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.787 2 WARNING oslo_config.cfg [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Sep 30 17:59:08 compute-1 nova_compute[237891]: live_migration_uri is deprecated for removal in favor of two other options that
Sep 30 17:59:08 compute-1 nova_compute[237891]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Sep 30 17:59:08 compute-1 nova_compute[237891]: and ``live_migration_inbound_addr`` respectively.
Sep 30 17:59:08 compute-1 nova_compute[237891]: ).  Its value may be silently ignored in the future.
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.788 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.788 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.788 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.788 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.788 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.788 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.788 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.789 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.789 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.789 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.789 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.789 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.789 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.789 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.789 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.789 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.790 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.790 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.790 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.790 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.rbd_secret_uuid        = 63d32c6a-fa18-54ed-8711-9a3915cc367b log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.790 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.790 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.790 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.791 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.791 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.791 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.791 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.791 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.791 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.791 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.791 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.792 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.792 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.792 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.792 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.792 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.792 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.792 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.792 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.793 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.793 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.793 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.793 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.793 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.793 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.793 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.793 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.793 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.794 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.794 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.794 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.794 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.794 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.794 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.794 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.794 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.795 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.795 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.795 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.795 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.795 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.795 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.795 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.795 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.796 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.796 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.796 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.796 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.796 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.796 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.796 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.797 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.797 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.797 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.797 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.797 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.797 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.797 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.797 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.798 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.798 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.798 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.798 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.798 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.798 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.798 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.798 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.798 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.799 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.799 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.799 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.799 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.799 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.799 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.799 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.799 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.800 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.800 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.800 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.800 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.800 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.800 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.800 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.800 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.800 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.801 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.801 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.801 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.801 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.801 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.801 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.801 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.801 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.801 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.802 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.802 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.802 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.802 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.802 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.802 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.802 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.802 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.802 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.803 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.803 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.803 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.803 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.803 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.803 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.803 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.803 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.804 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.804 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.804 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.804 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.804 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.804 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.804 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.804 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.805 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.805 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.805 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.805 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.805 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.805 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.805 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.805 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.806 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.806 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.806 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.806 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.806 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.806 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.807 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.807 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.807 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.807 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.807 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.807 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.808 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.808 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.808 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.808 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.808 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.808 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.808 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.809 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.809 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.809 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.809 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.809 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.809 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.810 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.810 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.810 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.810 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.810 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.810 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.811 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.811 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.811 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.811 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.811 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.811 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.811 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.812 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.812 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.812 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.812 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.812 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.813 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.813 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.813 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.813 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.813 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.813 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.813 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.814 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.814 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.814 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.814 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.814 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.814 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.815 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.815 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.815 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.815 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.815 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.816 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.816 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.816 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.816 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.816 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.816 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.816 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.817 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.817 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.817 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.817 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.817 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.817 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.818 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.818 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.818 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.818 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.818 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.818 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.819 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.819 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.819 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.819 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.819 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.819 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.819 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.820 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.820 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.820 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.820 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.820 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.820 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.821 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.821 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.821 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.821 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.821 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.821 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.821 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.822 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.822 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.822 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.822 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.822 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.822 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.823 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.823 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.823 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.823 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.823 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.823 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.824 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.824 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.824 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.824 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.824 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.825 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.825 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.825 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.825 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.825 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.825 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.825 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.826 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.826 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.826 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.826 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.826 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.826 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.827 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.827 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.827 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.827 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.827 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.827 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.827 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.828 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.828 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.828 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.828 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.828 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.828 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.829 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.829 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.829 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.829 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.829 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.829 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.830 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.830 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.830 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.830 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.830 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.830 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.831 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.831 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.831 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.hostname = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.831 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.831 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.831 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.831 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.832 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.832 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.832 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.832 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.832 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.832 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.833 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.833 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.833 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.833 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.833 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.833 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.834 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.834 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.834 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.834 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.834 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.834 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.834 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.835 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.835 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.835 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.835 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.835 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.835 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.836 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.836 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.836 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.836 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.836 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.836 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.837 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.837 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.837 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.837 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.837 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.837 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.838 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.838 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.838 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.838 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.838 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.838 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.838 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.839 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.839 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.839 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.839 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.839 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.839 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.840 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.840 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.840 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.840 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.840 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.840 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.840 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.841 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.841 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.841 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.841 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.841 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.841 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.842 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.842 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.842 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.842 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.842 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.842 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.843 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.843 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.843 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.843 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.843 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.843 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.843 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.844 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.844 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.844 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.844 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.844 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.844 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.845 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.845 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.845 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.845 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.845 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.845 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.846 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.846 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.846 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.846 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.846 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.846 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.847 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.847 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.847 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.847 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.847 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.847 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.847 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.848 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.848 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.848 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.848 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.848 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.848 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.849 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.849 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.849 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.849 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.849 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.849 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.850 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.850 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.850 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.850 2 DEBUG oslo_service.backend._eventlet.service [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Sep 30 17:59:08 compute-1 nova_compute[237891]: 2025-09-30 17:59:08.851 2 INFO nova.service [-] Starting compute node (version 32.1.0-0.20250919142712.b99a882.el10)
Sep 30 17:59:08 compute-1 ceph-mon[75484]: pgmap v601: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:08 compute-1 sudo[238732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aixjpipdhgmgrbyoejztjkkeodfzpvom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255148.6022513-3542-182206839608171/AnsiballZ_systemd.py'
Sep 30 17:59:08 compute-1 sudo[238732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:09 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:09 compute-1 python3.9[238734]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 17:59:09 compute-1 systemd[1]: Stopping nova_compute container...
Sep 30 17:59:09 compute-1 nova_compute[237891]: 2025-09-30 17:59:09.383 2 DEBUG oslo_concurrency.lockutils [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 17:59:09 compute-1 nova_compute[237891]: 2025-09-30 17:59:09.383 2 DEBUG oslo_concurrency.lockutils [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 17:59:09 compute-1 nova_compute[237891]: 2025-09-30 17:59:09.383 2 DEBUG oslo_concurrency.lockutils [None req-ebbe6546-8d1f-43f8-8893-e059f2e341a6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 17:59:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:09 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:09 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:09 compute-1 systemd[1]: libpod-060e21a34883e64f02111b530dae012732c6d642ed59ab56e4411a9c26437ee3.scope: Deactivated successfully.
Sep 30 17:59:09 compute-1 systemd[1]: libpod-060e21a34883e64f02111b530dae012732c6d642ed59ab56e4411a9c26437ee3.scope: Consumed 3.130s CPU time.
Sep 30 17:59:09 compute-1 podman[238738]: 2025-09-30 17:59:09.855853117 +0000 UTC m=+0.529346287 container died 060e21a34883e64f02111b530dae012732c6d642ed59ab56e4411a9c26437ee3 (image=38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=edpm, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 17:59:09 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-060e21a34883e64f02111b530dae012732c6d642ed59ab56e4411a9c26437ee3-userdata-shm.mount: Deactivated successfully.
Sep 30 17:59:09 compute-1 systemd[1]: var-lib-containers-storage-overlay-623ab6aebda747cb89473f84c2c951867130649d95df8457277af5e8d2faab73-merged.mount: Deactivated successfully.
Sep 30 17:59:09 compute-1 podman[238755]: 2025-09-30 17:59:09.996372733 +0000 UTC m=+0.079933109 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Sep 30 17:59:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:10.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:10.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:10 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:10 compute-1 podman[238738]: 2025-09-30 17:59:10.691470074 +0000 UTC m=+1.364963234 container cleanup 060e21a34883e64f02111b530dae012732c6d642ed59ab56e4411a9c26437ee3 (image=38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 17:59:10 compute-1 podman[238738]: nova_compute
Sep 30 17:59:10 compute-1 podman[238793]: nova_compute
Sep 30 17:59:10 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Sep 30 17:59:10 compute-1 systemd[1]: Stopped nova_compute container.
Sep 30 17:59:10 compute-1 systemd[1]: Starting nova_compute container...
Sep 30 17:59:10 compute-1 ceph-mon[75484]: pgmap v602: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:59:10 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:59:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623ab6aebda747cb89473f84c2c951867130649d95df8457277af5e8d2faab73/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 17:59:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623ab6aebda747cb89473f84c2c951867130649d95df8457277af5e8d2faab73/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Sep 30 17:59:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623ab6aebda747cb89473f84c2c951867130649d95df8457277af5e8d2faab73/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Sep 30 17:59:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623ab6aebda747cb89473f84c2c951867130649d95df8457277af5e8d2faab73/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 17:59:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623ab6aebda747cb89473f84c2c951867130649d95df8457277af5e8d2faab73/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 17:59:10 compute-1 podman[238806]: 2025-09-30 17:59:10.927443146 +0000 UTC m=+0.118276760 container init 060e21a34883e64f02111b530dae012732c6d642ed59ab56e4411a9c26437ee3 (image=38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 17:59:10 compute-1 podman[238806]: 2025-09-30 17:59:10.932564363 +0000 UTC m=+0.123397967 container start 060e21a34883e64f02111b530dae012732c6d642ed59ab56e4411a9c26437ee3 (image=38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm)
Sep 30 17:59:10 compute-1 nova_compute[238822]: + sudo -E kolla_set_configs
Sep 30 17:59:10 compute-1 podman[238806]: nova_compute
Sep 30 17:59:10 compute-1 systemd[1]: Started nova_compute container.
Sep 30 17:59:10 compute-1 sudo[238732]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Validating config file
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Copying service configuration files
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Deleting /etc/nova/nova.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Deleting /etc/ceph
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Creating directory /etc/ceph
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /etc/ceph
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Writing out command to execute
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 17:59:11 compute-1 nova_compute[238822]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 17:59:11 compute-1 nova_compute[238822]: ++ cat /run_command
Sep 30 17:59:11 compute-1 nova_compute[238822]: + CMD=nova-compute
Sep 30 17:59:11 compute-1 nova_compute[238822]: + ARGS=
Sep 30 17:59:11 compute-1 nova_compute[238822]: + sudo kolla_copy_cacerts
Sep 30 17:59:11 compute-1 nova_compute[238822]: + [[ ! -n '' ]]
Sep 30 17:59:11 compute-1 nova_compute[238822]: + . kolla_extend_start
Sep 30 17:59:11 compute-1 nova_compute[238822]: Running command: 'nova-compute'
Sep 30 17:59:11 compute-1 nova_compute[238822]: + echo 'Running command: '\''nova-compute'\'''
Sep 30 17:59:11 compute-1 nova_compute[238822]: + umask 0022
Sep 30 17:59:11 compute-1 nova_compute[238822]: + exec nova-compute
Sep 30 17:59:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:11 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:11 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:11 compute-1 sshd-session[238788]: Invalid user geoserver from 175.126.165.170 port 41714
Sep 30 17:59:11 compute-1 sshd-session[238788]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:59:11 compute-1 sshd-session[238788]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 17:59:11 compute-1 sudo[238983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilmkwoeuivfulfvjxuojmxrpdscmzipn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255151.2258883-3560-188550312395298/AnsiballZ_podman_container.py'
Sep 30 17:59:11 compute-1 sudo[238983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:11 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f8003a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:11 compute-1 python3.9[238985]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 17:59:11 compute-1 systemd[1]: Started libpod-conmon-e2a17203f91b0f1dea59f831d3d5248c781ab80afa0cc63f54a8cd4f2a22d172.scope.
Sep 30 17:59:12 compute-1 systemd[1]: Started libcrun container.
Sep 30 17:59:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af9a0b4563a57671cb5837f09188f305f5dee6c27d795d9f84054dcd34aaa65b/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Sep 30 17:59:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af9a0b4563a57671cb5837f09188f305f5dee6c27d795d9f84054dcd34aaa65b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 17:59:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af9a0b4563a57671cb5837f09188f305f5dee6c27d795d9f84054dcd34aaa65b/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Sep 30 17:59:12 compute-1 podman[239010]: 2025-09-30 17:59:12.067934226 +0000 UTC m=+0.161041689 container init e2a17203f91b0f1dea59f831d3d5248c781ab80afa0cc63f54a8cd4f2a22d172 (image=38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 17:59:12 compute-1 podman[239010]: 2025-09-30 17:59:12.080690389 +0000 UTC m=+0.173797812 container start e2a17203f91b0f1dea59f831d3d5248c781ab80afa0cc63f54a8cd4f2a22d172 (image=38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 17:59:12 compute-1 python3.9[238985]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Sep 30 17:59:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:12.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:12.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Applying nova statedir ownership
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Sep 30 17:59:12 compute-1 nova_compute_init[239032]: INFO:nova_statedir:Nova statedir ownership complete
Sep 30 17:59:12 compute-1 systemd[1]: libpod-e2a17203f91b0f1dea59f831d3d5248c781ab80afa0cc63f54a8cd4f2a22d172.scope: Deactivated successfully.
Sep 30 17:59:12 compute-1 podman[239046]: 2025-09-30 17:59:12.225311465 +0000 UTC m=+0.033894762 container died e2a17203f91b0f1dea59f831d3d5248c781ab80afa0cc63f54a8cd4f2a22d172 (image=38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 17:59:12 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e2a17203f91b0f1dea59f831d3d5248c781ab80afa0cc63f54a8cd4f2a22d172-userdata-shm.mount: Deactivated successfully.
Sep 30 17:59:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-af9a0b4563a57671cb5837f09188f305f5dee6c27d795d9f84054dcd34aaa65b-merged.mount: Deactivated successfully.
Sep 30 17:59:12 compute-1 podman[239046]: 2025-09-30 17:59:12.280256432 +0000 UTC m=+0.088839639 container cleanup e2a17203f91b0f1dea59f831d3d5248c781ab80afa0cc63f54a8cd4f2a22d172 (image=38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=nova_compute_init, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'image': '38.129.56.221:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 17:59:12 compute-1 systemd[1]: libpod-conmon-e2a17203f91b0f1dea59f831d3d5248c781ab80afa0cc63f54a8cd4f2a22d172.scope: Deactivated successfully.
Sep 30 17:59:12 compute-1 sudo[238983]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:12 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:12 compute-1 sshd-session[203050]: Connection closed by 192.168.122.30 port 44898
Sep 30 17:59:12 compute-1 sshd-session[203047]: pam_unix(sshd:session): session closed for user zuul
Sep 30 17:59:12 compute-1 systemd[1]: session-54.scope: Deactivated successfully.
Sep 30 17:59:12 compute-1 systemd-logind[789]: Session 54 logged out. Waiting for processes to exit.
Sep 30 17:59:12 compute-1 systemd[1]: session-54.scope: Consumed 3min 18.058s CPU time.
Sep 30 17:59:12 compute-1 systemd-logind[789]: Removed session 54.
Sep 30 17:59:12 compute-1 nova_compute[238822]: 2025-09-30 17:59:12.895 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 17:59:12 compute-1 nova_compute[238822]: 2025-09-30 17:59:12.895 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 17:59:12 compute-1 nova_compute[238822]: 2025-09-30 17:59:12.895 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 17:59:12 compute-1 nova_compute[238822]: 2025-09-30 17:59:12.895 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Sep 30 17:59:12 compute-1 ceph-mon[75484]: pgmap v603: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:13 compute-1 nova_compute[238822]: 2025-09-30 17:59:13.014 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 17:59:13 compute-1 nova_compute[238822]: 2025-09-30 17:59:13.026 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 17:59:13 compute-1 nova_compute[238822]: 2025-09-30 17:59:13.055 2 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Sep 30 17:59:13 compute-1 nova_compute[238822]: 2025-09-30 17:59:13.057 2 WARNING oslo_config.cfg [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Sep 30 17:59:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:13 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:59:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:13 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:13 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:13 compute-1 sshd-session[238788]: Failed password for invalid user geoserver from 175.126.165.170 port 41714 ssh2
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.043 2 INFO nova.virt.driver [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Sep 30 17:59:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:14.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.133 2 INFO nova.compute.provider_config [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Sep 30 17:59:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:14.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:14 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.638 2 DEBUG oslo_concurrency.lockutils [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.638 2 DEBUG oslo_concurrency.lockutils [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.639 2 DEBUG oslo_concurrency.lockutils [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.639 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.639 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.639 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.639 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.640 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.640 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.640 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.640 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.640 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.640 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.641 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.641 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.641 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.641 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.641 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.641 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.641 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.641 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.642 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.642 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.642 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.642 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.642 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.642 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.643 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.643 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.643 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.643 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.643 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.643 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.644 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.644 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.644 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.644 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.644 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.644 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.645 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.645 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.645 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.645 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.645 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.645 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.646 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.646 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.646 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.646 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.646 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.646 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.647 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.647 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.647 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.647 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.647 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.647 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.648 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.648 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.648 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.648 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.648 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.648 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.648 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.648 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.649 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.649 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.649 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.649 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.649 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.649 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.649 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.649 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.650 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.650 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.650 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.650 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.650 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.650 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.650 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.650 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.651 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.651 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.651 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.651 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.651 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.651 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.651 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.651 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.652 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.652 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.652 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.652 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.652 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.652 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.652 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.652 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.653 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.653 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.653 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.653 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.653 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.653 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.653 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.653 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.653 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.654 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.654 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.654 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.654 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.654 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.654 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.654 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.654 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.655 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.655 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.655 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.655 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.655 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.655 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.655 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.655 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.655 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.656 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.656 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.656 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.656 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.656 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.656 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.656 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.657 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.657 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.657 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.657 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.657 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.657 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.658 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.658 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.658 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.658 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.658 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.658 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.658 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.658 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.659 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.659 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.659 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.659 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.659 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.659 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.659 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.659 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.660 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.660 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.660 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.660 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.660 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.660 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.660 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.660 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.661 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.661 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.661 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.661 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.661 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.661 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.662 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.662 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.662 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.662 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.662 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.662 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.662 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.663 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.663 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.663 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.663 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.663 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.663 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.663 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.663 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.664 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.664 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.664 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.664 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.664 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.664 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.664 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.665 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.665 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.665 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.665 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.665 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.665 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.666 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.666 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.666 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.666 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.666 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.666 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.667 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.667 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.667 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.667 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.667 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.668 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.668 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.668 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.668 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.668 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.668 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.669 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.669 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.669 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.669 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.669 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.669 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.670 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.670 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.670 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.670 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.670 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.670 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.671 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.671 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.671 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.671 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.671 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.671 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.672 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.672 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.672 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.672 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.672 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.672 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.673 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.673 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.673 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.673 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.673 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.673 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.674 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.674 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.674 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.674 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.674 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.674 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.674 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.674 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.675 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.675 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.675 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.675 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.675 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.675 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.675 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.675 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.676 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.676 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.676 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.676 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.676 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.676 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.676 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.676 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.677 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.677 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.677 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.677 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.677 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.677 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.677 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.678 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.678 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.678 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.678 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.678 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.678 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.679 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.679 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.679 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.679 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.679 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.679 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.680 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.680 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.680 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.680 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.680 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.680 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.680 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.681 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.681 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.681 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.681 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.681 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.681 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.682 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.682 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.682 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.682 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.682 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.682 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.683 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.683 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.683 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.683 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.683 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.683 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.684 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.684 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.684 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.684 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.684 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.684 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.685 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.685 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.685 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.685 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.685 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.685 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.686 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.686 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.686 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.686 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.686 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.686 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.687 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.687 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.687 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.687 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.687 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.687 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.687 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.688 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.689 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.689 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.689 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.690 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.690 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.690 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.690 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.690 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.690 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.690 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.691 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.691 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.691 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.691 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.691 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.691 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.692 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.692 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.692 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.692 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.692 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.692 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.693 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.693 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.693 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.693 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.693 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.693 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.694 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.694 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.694 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.694 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.694 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.694 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.694 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.695 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.695 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.695 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.695 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.695 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.695 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.695 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.696 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.696 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.696 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.696 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.696 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.696 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.697 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.697 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.697 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.697 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.697 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.697 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.697 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.697 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.697 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.698 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.698 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.698 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.698 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.698 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.698 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.698 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.698 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.698 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.699 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.699 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.699 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.699 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.699 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.699 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.699 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.699 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.700 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.700 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.700 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.700 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.700 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.700 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.700 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.700 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.701 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.701 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.701 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.701 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.701 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.701 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.701 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.701 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.701 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.702 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.702 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.702 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.702 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.702 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.702 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.702 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.702 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.702 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.703 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.703 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.703 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.703 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.703 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.703 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.703 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.703 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.703 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.704 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.704 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.704 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.704 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.704 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.704 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.704 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.704 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.705 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.705 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.705 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.705 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.705 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.705 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.705 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.705 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.705 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.706 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.706 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.706 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.706 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.706 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.706 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.706 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.706 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.706 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.707 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.707 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.707 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.707 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.707 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.707 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.707 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.707 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.708 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.708 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.708 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.708 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.708 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.708 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.708 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.708 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.708 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.709 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.709 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.709 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.709 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.709 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.709 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.709 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.709 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.710 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.710 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.710 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.710 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.710 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.710 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.712 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.712 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.713 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.713 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.713 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.714 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.714 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.714 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.715 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.715 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.716 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.716 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.716 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.717 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.717 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.717 2 WARNING oslo_config.cfg [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Sep 30 17:59:14 compute-1 nova_compute[238822]: live_migration_uri is deprecated for removal in favor of two other options that
Sep 30 17:59:14 compute-1 nova_compute[238822]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Sep 30 17:59:14 compute-1 nova_compute[238822]: and ``live_migration_inbound_addr`` respectively.
Sep 30 17:59:14 compute-1 nova_compute[238822]: ).  Its value may be silently ignored in the future.
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.718 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.718 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.719 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.719 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.719 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.720 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.720 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.721 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.721 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.722 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.722 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.722 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.723 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.723 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.723 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.724 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.724 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.724 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.724 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.725 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.rbd_secret_uuid        = 63d32c6a-fa18-54ed-8711-9a3915cc367b log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.725 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.725 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.726 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.726 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.726 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.727 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.727 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.728 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.728 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.728 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.729 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.729 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.730 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.730 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.730 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.731 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.731 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.731 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.731 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.732 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.732 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.732 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.733 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.733 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.733 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.734 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.734 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.734 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.735 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.735 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.735 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.736 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.736 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.736 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.737 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.737 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.737 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.738 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.738 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.738 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.739 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.739 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.739 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.739 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.740 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.740 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.740 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.741 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.741 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.741 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.741 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.742 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.742 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.742 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.742 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.743 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.743 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.743 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.744 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.744 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.744 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.745 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.745 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.745 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.745 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.746 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.746 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.746 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.747 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.747 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.747 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.748 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.748 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.748 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.749 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.749 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.749 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.749 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.750 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.750 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.750 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.751 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.751 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.751 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.751 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.752 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.752 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.752 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.753 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 sshd-session[238788]: Received disconnect from 175.126.165.170 port 41714:11: Bye Bye [preauth]
Sep 30 17:59:14 compute-1 sshd-session[238788]: Disconnected from invalid user geoserver 175.126.165.170 port 41714 [preauth]
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.753 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.753 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.753 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.753 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.753 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.754 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.754 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.754 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.754 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.754 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.754 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.755 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.755 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.755 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.755 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.756 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.756 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.756 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.756 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.756 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.757 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.757 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.757 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.757 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.757 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.758 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.758 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.758 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.758 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.758 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.758 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.759 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.759 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.759 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.759 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.759 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.759 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.760 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.760 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.760 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.760 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.760 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.761 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.761 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.761 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.761 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.761 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.762 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.762 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.762 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.762 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.762 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.763 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.763 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.763 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.763 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.763 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.764 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.764 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.764 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.764 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.764 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.765 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.765 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.765 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.765 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.766 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.766 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.766 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.766 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.767 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.767 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.767 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.767 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.767 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.768 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.768 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.768 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.768 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.769 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.769 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.769 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.769 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.769 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.769 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.770 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.770 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.770 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.770 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.771 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.771 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.771 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.771 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.771 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.772 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.772 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.772 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.772 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.772 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.773 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.773 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.773 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.773 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.773 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.774 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.774 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.774 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.774 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.774 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.774 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.775 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.775 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.775 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.775 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.775 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.776 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.776 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.776 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.776 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.776 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.776 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.777 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.777 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.777 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.777 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.777 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.778 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.778 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.778 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.778 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.778 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.778 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.779 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.779 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.779 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.779 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.779 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.779 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.780 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.780 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.780 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.780 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.780 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.781 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.781 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.781 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.781 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.781 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.781 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.782 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.782 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.782 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.782 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.782 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.783 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.783 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.783 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.783 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.783 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.784 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.784 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.784 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.784 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.784 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.785 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.785 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.785 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.785 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.785 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.785 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.786 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.786 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.786 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.786 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.786 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.786 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.787 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.787 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.787 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.787 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.787 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.787 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.788 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.788 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.788 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.788 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.788 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.789 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.789 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.789 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.789 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.789 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.789 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.790 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.790 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.790 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.hostname = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.790 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.790 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.790 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.791 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.791 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.791 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.791 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.791 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.791 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.792 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.792 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.793 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.793 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.793 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.793 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.793 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.794 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.794 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.794 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.794 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.794 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.794 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.795 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.795 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.795 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.795 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.795 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.795 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.796 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.796 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.796 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.796 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.796 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.797 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.797 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.797 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.797 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.797 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.797 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.798 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.798 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.798 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.798 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.798 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.798 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.799 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.799 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.799 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.799 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.799 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.799 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.799 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.800 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.800 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.800 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.800 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.800 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.800 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.801 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.801 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.801 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.801 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.801 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.801 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.802 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.802 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.802 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.802 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.802 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.802 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.803 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.803 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.803 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.803 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.803 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.803 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.804 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.804 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.804 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.804 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.804 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.804 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.805 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.805 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.805 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.805 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.805 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.805 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.806 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.806 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.806 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.806 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.806 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.806 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.807 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.807 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.807 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.807 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.807 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.807 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.808 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.808 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.808 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.808 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.808 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.808 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.809 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.809 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.809 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.809 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.809 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.809 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.809 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.810 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.810 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.810 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.810 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.810 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.810 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.811 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.811 2 DEBUG oslo_service.backend._eventlet.service [None req-9df6f568-698c-48ce-968e-de9fa5908fa5 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Sep 30 17:59:14 compute-1 nova_compute[238822]: 2025-09-30 17:59:14.812 2 INFO nova.service [-] Starting compute node (version 32.1.0-0.20250919142712.b99a882.el10)
Sep 30 17:59:14 compute-1 ceph-mon[75484]: pgmap v604: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:15 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:15 compute-1 nova_compute[238822]: 2025-09-30 17:59:15.325 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Sep 30 17:59:15 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Sep 30 17:59:15 compute-1 systemd[1]: Started libvirt QEMU daemon.
Sep 30 17:59:15 compute-1 nova_compute[238822]: 2025-09-30 17:59:15.414 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f81fe0e81d0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Sep 30 17:59:15 compute-1 nova_compute[238822]: 2025-09-30 17:59:15.415 2 WARNING nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Sep 30 17:59:15 compute-1 nova_compute[238822]: 2025-09-30 17:59:15.415 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f81fe0e81d0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Sep 30 17:59:15 compute-1 nova_compute[238822]: libvirt:  error : internal error: could not initialize domain event timer
Sep 30 17:59:15 compute-1 nova_compute[238822]: 2025-09-30 17:59:15.417 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Sep 30 17:59:15 compute-1 nova_compute[238822]: 2025-09-30 17:59:15.417 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Sep 30 17:59:15 compute-1 nova_compute[238822]: 2025-09-30 17:59:15.418 2 INFO nova.utils [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] The default thread pool MainProcess.default is initialized
Sep 30 17:59:15 compute-1 nova_compute[238822]: 2025-09-30 17:59:15.418 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Sep 30 17:59:15 compute-1 nova_compute[238822]: 2025-09-30 17:59:15.418 2 INFO nova.virt.libvirt.driver [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Connection event '1' reason 'None'
Sep 30 17:59:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:15 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:15 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:15 compute-1 nova_compute[238822]: 2025-09-30 17:59:15.925 2 WARNING nova.virt.libvirt.driver [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Sep 30 17:59:15 compute-1 nova_compute[238822]: 2025-09-30 17:59:15.926 2 DEBUG nova.virt.libvirt.volume.mount [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Sep 30 17:59:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:16.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:16.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:16 compute-1 nova_compute[238822]: 2025-09-30 17:59:16.386 2 INFO nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Libvirt host capabilities <capabilities>
Sep 30 17:59:16 compute-1 nova_compute[238822]: 
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <host>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <uuid>12ce99da-db91-4763-aecd-1e4b4dea5907</uuid>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <cpu>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <arch>x86_64</arch>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model>EPYC-Rome-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <vendor>AMD</vendor>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <microcode version='16777317'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <signature family='23' model='49' stepping='0'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <maxphysaddr mode='emulate' bits='40'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='x2apic'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='tsc-deadline'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='osxsave'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='hypervisor'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='tsc_adjust'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='spec-ctrl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='stibp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='arch-capabilities'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='cmp_legacy'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='topoext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='virt-ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='lbrv'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='tsc-scale'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='vmcb-clean'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='pause-filter'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='pfthreshold'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='svme-addr-chk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='rdctl-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='skip-l1dfl-vmentry'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='mds-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature name='pschange-mc-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <pages unit='KiB' size='4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <pages unit='KiB' size='2048'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <pages unit='KiB' size='1048576'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </cpu>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <power_management>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <suspend_mem/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </power_management>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <iommu support='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <migration_features>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <live/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <uri_transports>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <uri_transport>tcp</uri_transport>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <uri_transport>rdma</uri_transport>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </uri_transports>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </migration_features>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <topology>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <cells num='1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <cell id='0'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:           <memory unit='KiB'>7864116</memory>
Sep 30 17:59:16 compute-1 nova_compute[238822]:           <pages unit='KiB' size='4'>1966029</pages>
Sep 30 17:59:16 compute-1 nova_compute[238822]:           <pages unit='KiB' size='2048'>0</pages>
Sep 30 17:59:16 compute-1 nova_compute[238822]:           <pages unit='KiB' size='1048576'>0</pages>
Sep 30 17:59:16 compute-1 nova_compute[238822]:           <distances>
Sep 30 17:59:16 compute-1 nova_compute[238822]:             <sibling id='0' value='10'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:           </distances>
Sep 30 17:59:16 compute-1 nova_compute[238822]:           <cpus num='8'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:           </cpus>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         </cell>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </cells>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </topology>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <cache>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </cache>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <secmodel>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model>selinux</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <doi>0</doi>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </secmodel>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <secmodel>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model>dac</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <doi>0</doi>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <baselabel type='kvm'>+107:+107</baselabel>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <baselabel type='qemu'>+107:+107</baselabel>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </secmodel>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </host>
Sep 30 17:59:16 compute-1 nova_compute[238822]: 
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <guest>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <os_type>hvm</os_type>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <arch name='i686'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <wordsize>32</wordsize>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <domain type='qemu'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <domain type='kvm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </arch>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <features>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <pae/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <nonpae/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <acpi default='on' toggle='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <apic default='on' toggle='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <cpuselection/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <deviceboot/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <disksnapshot default='on' toggle='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <externalSnapshot/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </features>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </guest>
Sep 30 17:59:16 compute-1 nova_compute[238822]: 
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <guest>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <os_type>hvm</os_type>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <arch name='x86_64'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <wordsize>64</wordsize>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <domain type='qemu'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <domain type='kvm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </arch>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <features>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <acpi default='on' toggle='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <apic default='on' toggle='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <cpuselection/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <deviceboot/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <disksnapshot default='on' toggle='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <externalSnapshot/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </features>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </guest>
Sep 30 17:59:16 compute-1 nova_compute[238822]: 
Sep 30 17:59:16 compute-1 nova_compute[238822]: </capabilities>
Sep 30 17:59:16 compute-1 nova_compute[238822]: 
Sep 30 17:59:16 compute-1 nova_compute[238822]: 2025-09-30 17:59:16.397 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Sep 30 17:59:16 compute-1 nova_compute[238822]: 2025-09-30 17:59:16.438 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Sep 30 17:59:16 compute-1 nova_compute[238822]: <domainCapabilities>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <domain>kvm</domain>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <arch>i686</arch>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <vcpu max='4096'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <iothreads supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <os supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <enum name='firmware'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <loader supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>rom</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>pflash</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='readonly'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>yes</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>no</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='secure'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>no</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </loader>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </os>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <cpu>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='host-passthrough' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='hostPassthroughMigratable'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>on</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>off</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='maximum' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='maximumMigratable'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>on</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>off</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='host-model' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <vendor>AMD</vendor>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='x2apic'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='hypervisor'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='stibp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='overflow-recov'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='succor'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='ibrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='lbrv'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='tsc-scale'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='flushbyasid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='pause-filter'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='pfthreshold'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='rdctl-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='mds-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='gds-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='rfds-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='disable' name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='custom' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cooperlake'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cooperlake-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cooperlake-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Dhyana-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Genoa'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amd-psfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='auto-ibrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='no-nested-data-bp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='null-sel-clr-base'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='stibp-always-on'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amd-psfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='auto-ibrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='no-nested-data-bp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='null-sel-clr-base'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='stibp-always-on'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Milan'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Milan-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Milan-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amd-psfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='no-nested-data-bp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='null-sel-clr-base'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='stibp-always-on'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='GraniteRapids'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='prefetchiti'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='GraniteRapids-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='prefetchiti'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='GraniteRapids-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10-128'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10-256'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10-512'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='prefetchiti'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 17:59:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:16 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v6'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v7'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='KnightsMill'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4fmaps'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4vnniw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512er'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512pf'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='KnightsMill-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4fmaps'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4vnniw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512er'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512pf'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G4-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tbm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G5-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tbm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SierraForest'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ne-convert'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cmpccxadd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SierraForest-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ne-convert'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cmpccxadd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='athlon'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='athlon-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='core2duo'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='core2duo-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='coreduo'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='coreduo-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='n270'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='n270-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='phenom'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='phenom-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </cpu>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <memoryBacking supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <enum name='sourceType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>file</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>anonymous</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>memfd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </memoryBacking>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <devices>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <disk supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='diskDevice'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>disk</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>cdrom</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>floppy</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>lun</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='bus'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>fdc</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>scsi</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>usb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>sata</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-non-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </disk>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <graphics supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vnc</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>egl-headless</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>dbus</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </graphics>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <video supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='modelType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vga</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>cirrus</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>none</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>bochs</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>ramfb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </video>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <hostdev supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='mode'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>subsystem</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='startupPolicy'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>default</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>mandatory</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>requisite</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>optional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='subsysType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>usb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>pci</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>scsi</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='capsType'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='pciBackend'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </hostdev>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <rng supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-non-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendModel'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>random</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>egd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>builtin</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </rng>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <filesystem supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='driverType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>path</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>handle</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtiofs</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </filesystem>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <tpm supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>tpm-tis</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>tpm-crb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendModel'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>emulator</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>external</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendVersion'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>2.0</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </tpm>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <redirdev supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='bus'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>usb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </redirdev>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <channel supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>pty</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>unix</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </channel>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <crypto supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>qemu</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendModel'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>builtin</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </crypto>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <interface supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>default</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>passt</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </interface>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <panic supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>isa</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>hyperv</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </panic>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </devices>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <features>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <gic supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <vmcoreinfo supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <genid supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <backingStoreInput supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <backup supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <async-teardown supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <ps2 supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <sev supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <sgx supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <hyperv supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='features'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>relaxed</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vapic</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>spinlocks</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vpindex</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>runtime</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>synic</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>stimer</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>reset</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vendor_id</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>frequencies</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>reenlightenment</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>tlbflush</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>ipi</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>avic</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>emsr_bitmap</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>xmm_input</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </hyperv>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <launchSecurity supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </features>
Sep 30 17:59:16 compute-1 nova_compute[238822]: </domainCapabilities>
Sep 30 17:59:16 compute-1 nova_compute[238822]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Sep 30 17:59:16 compute-1 nova_compute[238822]: 2025-09-30 17:59:16.454 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Sep 30 17:59:16 compute-1 nova_compute[238822]: <domainCapabilities>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <domain>kvm</domain>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <arch>i686</arch>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <vcpu max='240'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <iothreads supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <os supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <enum name='firmware'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <loader supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>rom</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>pflash</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='readonly'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>yes</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>no</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='secure'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>no</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </loader>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </os>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <cpu>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='host-passthrough' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='hostPassthroughMigratable'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>on</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>off</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='maximum' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='maximumMigratable'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>on</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>off</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='host-model' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <vendor>AMD</vendor>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='x2apic'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='hypervisor'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='stibp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='overflow-recov'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='succor'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='ibrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='lbrv'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='tsc-scale'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='flushbyasid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='pause-filter'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='pfthreshold'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='rdctl-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='mds-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='gds-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='rfds-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='disable' name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='custom' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cooperlake'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cooperlake-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cooperlake-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Dhyana-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Genoa'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amd-psfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='auto-ibrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='no-nested-data-bp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='null-sel-clr-base'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='stibp-always-on'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amd-psfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='auto-ibrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='no-nested-data-bp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='null-sel-clr-base'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='stibp-always-on'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Milan'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Milan-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Milan-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amd-psfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='no-nested-data-bp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='null-sel-clr-base'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='stibp-always-on'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='GraniteRapids'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='prefetchiti'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='GraniteRapids-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='prefetchiti'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='GraniteRapids-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10-128'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10-256'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10-512'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='prefetchiti'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v6'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v7'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='KnightsMill'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4fmaps'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4vnniw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512er'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512pf'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='KnightsMill-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4fmaps'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4vnniw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512er'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512pf'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G4-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tbm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G5-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tbm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SierraForest'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ne-convert'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cmpccxadd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SierraForest-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ne-convert'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cmpccxadd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='athlon'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='athlon-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='core2duo'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='core2duo-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='coreduo'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='coreduo-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='n270'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='n270-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='phenom'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='phenom-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </cpu>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <memoryBacking supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <enum name='sourceType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>file</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>anonymous</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>memfd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </memoryBacking>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <devices>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <disk supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='diskDevice'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>disk</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>cdrom</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>floppy</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>lun</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='bus'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>ide</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>fdc</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>scsi</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>usb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>sata</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-non-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </disk>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <graphics supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vnc</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>egl-headless</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>dbus</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </graphics>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <video supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='modelType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vga</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>cirrus</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>none</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>bochs</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>ramfb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </video>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <hostdev supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='mode'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>subsystem</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='startupPolicy'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>default</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>mandatory</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>requisite</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>optional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='subsysType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>usb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>pci</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>scsi</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='capsType'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='pciBackend'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </hostdev>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <rng supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-non-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendModel'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>random</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>egd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>builtin</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </rng>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <filesystem supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='driverType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>path</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>handle</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtiofs</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </filesystem>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <tpm supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>tpm-tis</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>tpm-crb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendModel'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>emulator</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>external</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendVersion'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>2.0</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </tpm>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <redirdev supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='bus'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>usb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </redirdev>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <channel supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>pty</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>unix</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </channel>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <crypto supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>qemu</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendModel'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>builtin</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </crypto>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <interface supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>default</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>passt</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </interface>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <panic supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>isa</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>hyperv</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </panic>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </devices>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <features>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <gic supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <vmcoreinfo supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <genid supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <backingStoreInput supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <backup supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <async-teardown supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <ps2 supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <sev supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <sgx supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <hyperv supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='features'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>relaxed</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vapic</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>spinlocks</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vpindex</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>runtime</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>synic</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>stimer</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>reset</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vendor_id</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>frequencies</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>reenlightenment</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>tlbflush</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>ipi</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>avic</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>emsr_bitmap</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>xmm_input</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </hyperv>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <launchSecurity supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </features>
Sep 30 17:59:16 compute-1 nova_compute[238822]: </domainCapabilities>
Sep 30 17:59:16 compute-1 nova_compute[238822]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Sep 30 17:59:16 compute-1 nova_compute[238822]: 2025-09-30 17:59:16.506 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Sep 30 17:59:16 compute-1 nova_compute[238822]: 2025-09-30 17:59:16.512 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Sep 30 17:59:16 compute-1 nova_compute[238822]: <domainCapabilities>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <domain>kvm</domain>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <arch>x86_64</arch>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <vcpu max='4096'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <iothreads supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <os supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <enum name='firmware'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>efi</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <loader supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>rom</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>pflash</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='readonly'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>yes</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>no</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='secure'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>yes</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>no</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </loader>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </os>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <cpu>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='host-passthrough' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='hostPassthroughMigratable'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>on</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>off</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='maximum' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='maximumMigratable'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>on</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>off</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='host-model' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <vendor>AMD</vendor>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='x2apic'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='hypervisor'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='stibp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='overflow-recov'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='succor'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='ibrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='lbrv'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='tsc-scale'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='flushbyasid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='pause-filter'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='pfthreshold'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='rdctl-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='mds-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='gds-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='rfds-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='disable' name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='custom' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cooperlake'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cooperlake-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cooperlake-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Dhyana-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Genoa'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amd-psfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='auto-ibrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='no-nested-data-bp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='null-sel-clr-base'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='stibp-always-on'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amd-psfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='auto-ibrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='no-nested-data-bp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='null-sel-clr-base'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='stibp-always-on'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Milan'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Milan-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Milan-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amd-psfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='no-nested-data-bp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='null-sel-clr-base'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='stibp-always-on'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='GraniteRapids'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='prefetchiti'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='GraniteRapids-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='prefetchiti'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='GraniteRapids-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10-128'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10-256'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10-512'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='prefetchiti'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v6'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v7'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='KnightsMill'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4fmaps'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4vnniw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512er'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512pf'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='KnightsMill-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4fmaps'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4vnniw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512er'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512pf'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G4-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tbm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G5-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tbm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SierraForest'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ne-convert'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cmpccxadd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SierraForest-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ne-convert'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cmpccxadd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='athlon'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='athlon-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='core2duo'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='core2duo-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='coreduo'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='coreduo-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='n270'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='n270-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='phenom'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='phenom-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </cpu>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <memoryBacking supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <enum name='sourceType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>file</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>anonymous</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>memfd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </memoryBacking>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <devices>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <disk supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='diskDevice'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>disk</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>cdrom</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>floppy</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>lun</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='bus'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>fdc</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>scsi</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>usb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>sata</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-non-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </disk>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <graphics supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vnc</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>egl-headless</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>dbus</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </graphics>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <video supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='modelType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vga</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>cirrus</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>none</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>bochs</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>ramfb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </video>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <hostdev supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='mode'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>subsystem</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='startupPolicy'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>default</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>mandatory</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>requisite</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>optional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='subsysType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>usb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>pci</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>scsi</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='capsType'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='pciBackend'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </hostdev>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <rng supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-non-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendModel'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>random</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>egd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>builtin</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </rng>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <filesystem supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='driverType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>path</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>handle</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtiofs</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </filesystem>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <tpm supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>tpm-tis</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>tpm-crb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendModel'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>emulator</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>external</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendVersion'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>2.0</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </tpm>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <redirdev supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='bus'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>usb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </redirdev>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <channel supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>pty</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>unix</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </channel>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <crypto supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>qemu</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendModel'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>builtin</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </crypto>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <interface supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>default</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>passt</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </interface>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <panic supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>isa</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>hyperv</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </panic>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </devices>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <features>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <gic supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <vmcoreinfo supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <genid supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <backingStoreInput supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <backup supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <async-teardown supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <ps2 supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <sev supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <sgx supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <hyperv supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='features'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>relaxed</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vapic</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>spinlocks</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vpindex</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>runtime</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>synic</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>stimer</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>reset</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vendor_id</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>frequencies</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>reenlightenment</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>tlbflush</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>ipi</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>avic</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>emsr_bitmap</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>xmm_input</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </hyperv>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <launchSecurity supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </features>
Sep 30 17:59:16 compute-1 nova_compute[238822]: </domainCapabilities>
Sep 30 17:59:16 compute-1 nova_compute[238822]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Sep 30 17:59:16 compute-1 nova_compute[238822]: 2025-09-30 17:59:16.568 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Sep 30 17:59:16 compute-1 nova_compute[238822]: <domainCapabilities>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <domain>kvm</domain>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <arch>x86_64</arch>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <vcpu max='240'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <iothreads supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <os supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <enum name='firmware'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <loader supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>rom</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>pflash</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='readonly'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>yes</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>no</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='secure'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>no</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </loader>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </os>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <cpu>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='host-passthrough' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='hostPassthroughMigratable'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>on</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>off</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='maximum' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='maximumMigratable'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>on</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>off</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='host-model' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <vendor>AMD</vendor>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='x2apic'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='hypervisor'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='stibp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='overflow-recov'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='succor'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='ibrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='lbrv'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='tsc-scale'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='flushbyasid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='pause-filter'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='pfthreshold'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='rdctl-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='mds-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='gds-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='require' name='rfds-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <feature policy='disable' name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <mode name='custom' supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Broadwell-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cooperlake'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cooperlake-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Cooperlake-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Denverton-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Dhyana-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Genoa'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amd-psfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='auto-ibrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='no-nested-data-bp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='null-sel-clr-base'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='stibp-always-on'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amd-psfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='auto-ibrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='no-nested-data-bp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='null-sel-clr-base'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='stibp-always-on'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Milan'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Milan-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Milan-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amd-psfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='no-nested-data-bp'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='null-sel-clr-base'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='stibp-always-on'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-Rome-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='EPYC-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='GraniteRapids'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='prefetchiti'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='GraniteRapids-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='prefetchiti'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='GraniteRapids-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10-128'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10-256'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx10-512'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='prefetchiti'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Haswell-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v6'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Icelake-Server-v7'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='IvyBridge-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='KnightsMill'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4fmaps'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4vnniw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512er'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512pf'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='KnightsMill-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4fmaps'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-4vnniw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512er'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512pf'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G4-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tbm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Opteron_G5-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fma4'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tbm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xop'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SapphireRapids-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='amx-tile'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-bf16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-fp16'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512-vpopcntdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bitalg'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vbmi2'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrc'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fzrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='la57'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='taa-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='tsx-ldtrk'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xfd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SierraForest'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ne-convert'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cmpccxadd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='SierraForest-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ifma'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-ne-convert'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx-vnni-int8'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='bus-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cmpccxadd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fbsdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='fsrs'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ibrs-all'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mcdt-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pbrsb-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='psdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='sbdr-ssdp-no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='serialize'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vaes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='vpclmulqdq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 systemd[1]: Started libvirt nodedev daemon.
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Client-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='hle'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='rtm'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Skylake-Server-v5'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512bw'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512cd'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512dq'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512f'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='avx512vl'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='invpcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pcid'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='pku'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='mpx'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v2'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v3'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='core-capability'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='split-lock-detect'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='Snowridge-v4'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='cldemote'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='erms'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='gfni'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdir64b'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='movdiri'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='xsaves'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='athlon'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='athlon-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='core2duo'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='core2duo-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='coreduo'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='coreduo-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='n270'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='n270-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='ss'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='phenom'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <blockers model='phenom-v1'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnow'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <feature name='3dnowext'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </blockers>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </mode>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </cpu>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <memoryBacking supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <enum name='sourceType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>file</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>anonymous</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <value>memfd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </memoryBacking>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <devices>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <disk supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='diskDevice'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>disk</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>cdrom</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>floppy</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>lun</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='bus'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>ide</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>fdc</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>scsi</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>usb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>sata</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-non-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </disk>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <graphics supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vnc</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>egl-headless</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>dbus</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </graphics>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <video supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='modelType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vga</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>cirrus</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>none</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>bochs</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>ramfb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </video>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <hostdev supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='mode'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>subsystem</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='startupPolicy'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>default</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>mandatory</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>requisite</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>optional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='subsysType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>usb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>pci</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>scsi</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='capsType'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='pciBackend'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </hostdev>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <rng supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtio-non-transitional</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendModel'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>random</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>egd</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>builtin</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </rng>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <filesystem supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='driverType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>path</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>handle</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>virtiofs</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </filesystem>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <tpm supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>tpm-tis</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>tpm-crb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendModel'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>emulator</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>external</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendVersion'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>2.0</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </tpm>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <redirdev supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='bus'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>usb</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </redirdev>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <channel supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>pty</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>unix</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </channel>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <crypto supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='type'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>qemu</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendModel'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>builtin</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </crypto>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <interface supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='backendType'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>default</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>passt</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </interface>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <panic supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='model'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>isa</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>hyperv</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </panic>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </devices>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   <features>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <gic supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <vmcoreinfo supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <genid supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <backingStoreInput supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <backup supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <async-teardown supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <ps2 supported='yes'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <sev supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <sgx supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <hyperv supported='yes'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       <enum name='features'>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>relaxed</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vapic</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>spinlocks</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vpindex</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>runtime</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>synic</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>stimer</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>reset</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>vendor_id</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>frequencies</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>reenlightenment</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>tlbflush</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>ipi</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>avic</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>emsr_bitmap</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:         <value>xmm_input</value>
Sep 30 17:59:16 compute-1 nova_compute[238822]:       </enum>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     </hyperv>
Sep 30 17:59:16 compute-1 nova_compute[238822]:     <launchSecurity supported='no'/>
Sep 30 17:59:16 compute-1 nova_compute[238822]:   </features>
Sep 30 17:59:16 compute-1 nova_compute[238822]: </domainCapabilities>
Sep 30 17:59:16 compute-1 nova_compute[238822]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Sep 30 17:59:16 compute-1 nova_compute[238822]: 2025-09-30 17:59:16.617 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Sep 30 17:59:16 compute-1 nova_compute[238822]: 2025-09-30 17:59:16.618 2 INFO nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Secure Boot support detected
Sep 30 17:59:16 compute-1 nova_compute[238822]: 2025-09-30 17:59:16.625 2 INFO nova.virt.libvirt.driver [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Sep 30 17:59:16 compute-1 nova_compute[238822]: 2025-09-30 17:59:16.625 2 INFO nova.virt.libvirt.driver [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Sep 30 17:59:16 compute-1 nova_compute[238822]: 2025-09-30 17:59:16.775 2 DEBUG nova.virt.libvirt.driver [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Sep 30 17:59:16 compute-1 ceph-mon[75484]: pgmap v605: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:17 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:17 compute-1 nova_compute[238822]: 2025-09-30 17:59:17.287 2 INFO nova.virt.node [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Determined node identity 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a from /var/lib/nova/compute_id
Sep 30 17:59:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:17 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:17 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:17 compute-1 nova_compute[238822]: 2025-09-30 17:59:17.797 2 WARNING nova.compute.manager [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Compute nodes ['3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Sep 30 17:59:17 compute-1 sshd-session[239192]: Accepted publickey for zuul from 192.168.122.30 port 46296 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 17:59:17 compute-1 systemd-logind[789]: New session 57 of user zuul.
Sep 30 17:59:18 compute-1 systemd[1]: Started Session 57 of User zuul.
Sep 30 17:59:18 compute-1 sshd-session[239192]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 17:59:18 compute-1 sudo[239196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 17:59:18 compute-1 sudo[239196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:59:18 compute-1 sudo[239196]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:18.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:18.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:18 compute-1 sudo[239245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 17:59:18 compute-1 sudo[239245]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:59:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:59:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:18 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:18 compute-1 nova_compute[238822]: 2025-09-30 17:59:18.811 2 INFO nova.compute.manager [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Sep 30 17:59:18 compute-1 sudo[239245]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:18 compute-1 ceph-mon[75484]: pgmap v606: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:59:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 17:59:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:59:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:59:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 17:59:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 17:59:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 17:59:19 compute-1 python3.9[239428]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 17:59:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:19 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35dc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:19 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:19 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:19 compute-1 nova_compute[238822]: 2025-09-30 17:59:19.831 2 WARNING nova.compute.manager [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Sep 30 17:59:19 compute-1 nova_compute[238822]: 2025-09-30 17:59:19.832 2 DEBUG oslo_concurrency.lockutils [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 17:59:19 compute-1 nova_compute[238822]: 2025-09-30 17:59:19.832 2 DEBUG oslo_concurrency.lockutils [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 17:59:19 compute-1 nova_compute[238822]: 2025-09-30 17:59:19.832 2 DEBUG oslo_concurrency.lockutils [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 17:59:19 compute-1 nova_compute[238822]: 2025-09-30 17:59:19.833 2 DEBUG nova.compute.resource_tracker [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 17:59:19 compute-1 nova_compute[238822]: 2025-09-30 17:59:19.833 2 DEBUG oslo_concurrency.processutils [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 17:59:19 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2052937154' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 17:59:19 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Sep 30 17:59:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:19.981155) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 17:59:19 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Sep 30 17:59:19 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255159981236, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1882, "num_deletes": 251, "total_data_size": 4818675, "memory_usage": 4874944, "flush_reason": "Manual Compaction"}
Sep 30 17:59:19 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255160002895, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 3132539, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18998, "largest_seqno": 20875, "table_properties": {"data_size": 3124865, "index_size": 4617, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15708, "raw_average_key_size": 19, "raw_value_size": 3109490, "raw_average_value_size": 3931, "num_data_blocks": 207, "num_entries": 791, "num_filter_entries": 791, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759254986, "oldest_key_time": 1759254986, "file_creation_time": 1759255159, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 21806 microseconds, and 11271 cpu microseconds.
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.002960) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 3132539 bytes OK
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.002987) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.005289) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.005311) EVENT_LOG_v1 {"time_micros": 1759255160005304, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.005340) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 4810182, prev total WAL file size 4810182, number of live WAL files 2.
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.007365) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(3059KB)], [36(10MB)]
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255160007430, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 13663290, "oldest_snapshot_seqno": -1}
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4773 keys, 11580810 bytes, temperature: kUnknown
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255160091223, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 11580810, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11547705, "index_size": 20067, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11973, "raw_key_size": 121204, "raw_average_key_size": 25, "raw_value_size": 11459817, "raw_average_value_size": 2400, "num_data_blocks": 831, "num_entries": 4773, "num_filter_entries": 4773, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759255160, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.091567) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 11580810 bytes
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.093707) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.8 rd, 138.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.0, 10.0 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(8.1) write-amplify(3.7) OK, records in: 5291, records dropped: 518 output_compression: NoCompression
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.093738) EVENT_LOG_v1 {"time_micros": 1759255160093724, "job": 20, "event": "compaction_finished", "compaction_time_micros": 83909, "compaction_time_cpu_micros": 38120, "output_level": 6, "num_output_files": 1, "total_output_size": 11580810, "num_input_records": 5291, "num_output_records": 4773, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255160094882, "job": 20, "event": "table_file_deletion", "file_number": 38}
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255160098910, "job": 20, "event": "table_file_deletion", "file_number": 36}
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.007267) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.099008) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.099016) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.099019) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.099022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:59:20 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-17:59:20.099025) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 17:59:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:20.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:20.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:20 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 17:59:20 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3542600850' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 17:59:20 compute-1 nova_compute[238822]: 2025-09-30 17:59:20.292 2 DEBUG oslo_concurrency.processutils [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 17:59:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:20 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:20 compute-1 nova_compute[238822]: 2025-09-30 17:59:20.516 2 WARNING nova.virt.libvirt.driver [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 17:59:20 compute-1 nova_compute[238822]: 2025-09-30 17:59:20.519 2 DEBUG oslo_concurrency.processutils [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 17:59:20 compute-1 nova_compute[238822]: 2025-09-30 17:59:20.546 2 DEBUG oslo_concurrency.processutils [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 17:59:20 compute-1 nova_compute[238822]: 2025-09-30 17:59:20.547 2 DEBUG nova.compute.resource_tracker [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5176MB free_disk=39.9921875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 17:59:20 compute-1 nova_compute[238822]: 2025-09-30 17:59:20.548 2 DEBUG oslo_concurrency.lockutils [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 17:59:20 compute-1 nova_compute[238822]: 2025-09-30 17:59:20.548 2 DEBUG oslo_concurrency.lockutils [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 17:59:20 compute-1 sudo[239606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsowcxlzrxnhauvknbrfzhphyzlvuakl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255160.0016124-53-264868171746238/AnsiballZ_systemd_service.py'
Sep 30 17:59:20 compute-1 sudo[239606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:20 compute-1 python3.9[239608]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 17:59:20 compute-1 systemd[1]: Reloading.
Sep 30 17:59:21 compute-1 ceph-mon[75484]: pgmap v607: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:59:21 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3542600850' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 17:59:21 compute-1 nova_compute[238822]: 2025-09-30 17:59:21.056 2 WARNING nova.compute.resource_tracker [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] No compute node record for compute-1.ctlplane.example.com:3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a could not be found.
Sep 30 17:59:21 compute-1 systemd-rc-local-generator[239635]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:59:21 compute-1 systemd-sysv-generator[239639]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:59:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:21 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:21 compute-1 sudo[239606]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:21 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:21 compute-1 nova_compute[238822]: 2025-09-30 17:59:21.565 2 INFO nova.compute.resource_tracker [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a
Sep 30 17:59:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:21 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:22 compute-1 ceph-mon[75484]: pgmap v608: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:22.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:22.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:22 compute-1 python3.9[239797]: ansible-ansible.builtin.service_facts Invoked
Sep 30 17:59:22 compute-1 network[239814]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 17:59:22 compute-1 network[239815]: 'network-scripts' will be removed from distribution in near future.
Sep 30 17:59:22 compute-1 network[239816]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 17:59:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:22 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:23 compute-1 nova_compute[238822]: 2025-09-30 17:59:23.095 2 DEBUG nova.compute.resource_tracker [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 17:59:23 compute-1 nova_compute[238822]: 2025-09-30 17:59:23.095 2 DEBUG nova.compute.resource_tracker [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 17:59:20 up  3:36,  0 user,  load average: 1.14, 1.47, 1.40\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 17:59:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:59:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:23 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:59:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:23 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:23 compute-1 podman[239826]: 2025-09-30 17:59:23.511014071 +0000 UTC m=+0.174230653 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 17:59:23 compute-1 nova_compute[238822]: 2025-09-30 17:59:23.543 2 INFO nova.scheduler.client.report [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] [req-84267a83-b1e6-4e21-919c-facc1578a1e2] Created resource provider record via placement API for resource provider with UUID 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a and name compute-1.ctlplane.example.com.
Sep 30 17:59:23 compute-1 nova_compute[238822]: 2025-09-30 17:59:23.572 2 DEBUG oslo_concurrency.processutils [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 17:59:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:23 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 17:59:23 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3694196857' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 17:59:24 compute-1 nova_compute[238822]: 2025-09-30 17:59:24.018 2 DEBUG oslo_concurrency.processutils [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 17:59:24 compute-1 nova_compute[238822]: 2025-09-30 17:59:24.027 2 DEBUG nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Sep 30 17:59:24 compute-1 nova_compute[238822]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Sep 30 17:59:24 compute-1 nova_compute[238822]: 2025-09-30 17:59:24.027 2 INFO nova.virt.libvirt.host [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] kernel doesn't support AMD SEV
Sep 30 17:59:24 compute-1 nova_compute[238822]: 2025-09-30 17:59:24.028 2 DEBUG nova.compute.provider_tree [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 39, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 17:59:24 compute-1 nova_compute[238822]: 2025-09-30 17:59:24.029 2 DEBUG nova.virt.libvirt.driver [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 17:59:24 compute-1 ceph-mon[75484]: pgmap v609: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:24 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3052332124' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 17:59:24 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3694196857' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 17:59:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:24.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:24.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:24 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:24 compute-1 nova_compute[238822]: 2025-09-30 17:59:24.576 2 DEBUG nova.scheduler.client.report [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Updated inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 39, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Sep 30 17:59:24 compute-1 nova_compute[238822]: 2025-09-30 17:59:24.576 2 DEBUG nova.compute.provider_tree [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Updating resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 17:59:24 compute-1 nova_compute[238822]: 2025-09-30 17:59:24.577 2 DEBUG nova.compute.provider_tree [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 17:59:24 compute-1 nova_compute[238822]: 2025-09-30 17:59:24.708 2 DEBUG nova.compute.provider_tree [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Updating resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 17:59:24 compute-1 unix_chkpwd[239928]: password check failed for user (root)
Sep 30 17:59:24 compute-1 sshd-session[239823]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:59:25 compute-1 sudo[239940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 17:59:25 compute-1 sudo[239940]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:59:25 compute-1 sudo[239940]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:25 compute-1 nova_compute[238822]: 2025-09-30 17:59:25.223 2 DEBUG nova.compute.resource_tracker [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 17:59:25 compute-1 nova_compute[238822]: 2025-09-30 17:59:25.223 2 DEBUG oslo_concurrency.lockutils [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.675s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 17:59:25 compute-1 nova_compute[238822]: 2025-09-30 17:59:25.223 2 DEBUG nova.service [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Sep 30 17:59:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:25 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:25 compute-1 sudo[239967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:59:25 compute-1 sudo[239967]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:59:25 compute-1 sudo[239967]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:25 compute-1 nova_compute[238822]: 2025-09-30 17:59:25.336 2 DEBUG nova.service [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Sep 30 17:59:25 compute-1 nova_compute[238822]: 2025-09-30 17:59:25.337 2 DEBUG nova.servicegroup.drivers.db [None req-ae2092ba-45a1-4d19-8d1e-996f41dcc5e1 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Sep 30 17:59:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:25 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:25 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:59:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 17:59:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:26.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:26.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:26 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:26 compute-1 sshd-session[239823]: Failed password for root from 192.210.160.141 port 60080 ssh2
Sep 30 17:59:27 compute-1 ceph-mon[75484]: pgmap v610: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:27 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:27 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:27 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:27 compute-1 sshd-session[239823]: Connection closed by authenticating user root 192.210.160.141 port 60080 [preauth]
Sep 30 17:59:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:28.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:28.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:28 compute-1 sudo[240196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mknelhvsdjyqkjvdugadtkitbgyyqozx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255167.7930236-91-199283256148986/AnsiballZ_systemd_service.py'
Sep 30 17:59:28 compute-1 sudo[240196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:59:28 compute-1 python3.9[240198]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 17:59:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:28 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:28 compute-1 sudo[240196]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:29 compute-1 ceph-mon[75484]: pgmap v611: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:29 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:29 compute-1 sudo[240350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hazgmbimndrxpkuxbknncuxizqhileyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255168.8287487-111-250058456019654/AnsiballZ_file.py'
Sep 30 17:59:29 compute-1 sudo[240350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:29 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:29 compute-1 python3.9[240352]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:29 compute-1 sudo[240350]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:29 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 17:59:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:29 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:30 compute-1 ceph-mon[75484]: pgmap v612: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 340 B/s rd, 0 op/s
Sep 30 17:59:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:30.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:59:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:30.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:59:30 compute-1 sudo[240504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifgqzwcluzkpekhakfaaqrzpuogbjmvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255169.7515254-127-68762303532604/AnsiballZ_file.py'
Sep 30 17:59:30 compute-1 sudo[240504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:30 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:30 compute-1 python3.9[240506]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:30 compute-1 sudo[240504]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:31 compute-1 sudo[240668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goctudfairvzttmktmnlejqxhfwvgats ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255170.793404-145-188622221540496/AnsiballZ_command.py'
Sep 30 17:59:31 compute-1 sudo[240668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:31 compute-1 podman[240631]: 2025-09-30 17:59:31.348373404 +0000 UTC m=+0.083276639 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 17:59:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:31 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:31 compute-1 python3.9[240674]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:59:31 compute-1 sudo[240668]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4001f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:32.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:32.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:32 compute-1 nova_compute[238822]: 2025-09-30 17:59:32.341 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 17:59:32 compute-1 python3.9[240829]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 17:59:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:32 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:32 compute-1 ceph-mon[75484]: pgmap v613: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:32 compute-1 nova_compute[238822]: 2025-09-30 17:59:32.856 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 17:59:33 compute-1 sudo[240980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixuepghepokgidtuxafgfpaifnewtnne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255172.74464-181-29413060833822/AnsiballZ_systemd_service.py'
Sep 30 17:59:33 compute-1 sudo[240980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:33 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:59:33 compute-1 python3.9[240982]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 17:59:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:33 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:33 compute-1 systemd[1]: Reloading.
Sep 30 17:59:33 compute-1 systemd-sysv-generator[241015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 17:59:33 compute-1 systemd-rc-local-generator[241011]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 17:59:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:33 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:33 compute-1 sudo[240980]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:34.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:34.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:34 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:34 compute-1 sudo[241169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbrjtxhwqvxdbfqnnutyiloidmmefqxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255174.1274374-197-3141436905255/AnsiballZ_command.py'
Sep 30 17:59:34 compute-1 sudo[241169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:34 compute-1 python3.9[241171]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 17:59:34 compute-1 sudo[241169]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:34 compute-1 ceph-mon[75484]: pgmap v614: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:35 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:35 compute-1 sudo[241323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfvsdgpwxrfeynyeckpvrwwkjngrdkxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255174.9885259-215-262650666181807/AnsiballZ_file.py'
Sep 30 17:59:35 compute-1 sudo[241323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:35 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:35 compute-1 python3.9[241325]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:59:35 compute-1 sudo[241323]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:35 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4001f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:36.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:36.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:36 compute-1 python3.9[241478]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:59:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:36 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:36 compute-1 sshd-session[241350]: Invalid user cristian from 194.107.115.65 port 12528
Sep 30 17:59:36 compute-1 sshd-session[241350]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:59:36 compute-1 sshd-session[241350]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 17:59:36 compute-1 ceph-mon[75484]: pgmap v615: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:37 compute-1 python3.9[241631]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:59:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:37 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:37 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:37 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:37 compute-1 python3.9[241752]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759255176.6552346-247-2070489500326/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 17:59:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:59:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:38.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:38.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:59:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:38 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:38 compute-1 podman[241853]: 2025-09-30 17:59:38.527789775 +0000 UTC m=+0.070838835 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 17:59:38 compute-1 sshd-session[241350]: Failed password for invalid user cristian from 194.107.115.65 port 12528 ssh2
Sep 30 17:59:38 compute-1 sudo[241924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcptrqivomdtolgnreedtpegjtxgosdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255178.1574254-277-196262095273919/AnsiballZ_group.py'
Sep 30 17:59:38 compute-1 sudo[241924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:38 compute-1 python3.9[241927]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Sep 30 17:59:38 compute-1 sudo[241924]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:39 compute-1 ceph-mon[75484]: pgmap v616: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:39 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:39 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:39 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4001f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:39 compute-1 sudo[242077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdkahyboighuneywbtmptrukmehksifl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255179.2370439-299-39377273051572/AnsiballZ_getent.py'
Sep 30 17:59:39 compute-1 sudo[242077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:39 compute-1 python3.9[242079]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Sep 30 17:59:39 compute-1 sudo[242077]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:40.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5ef905d0 =====
Sep 30 17:59:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5ef905d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:40 compute-1 radosgw[84864]: beast: 0x7fda5ef905d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:40.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:40 compute-1 sshd-session[241350]: Received disconnect from 194.107.115.65 port 12528:11: Bye Bye [preauth]
Sep 30 17:59:40 compute-1 sshd-session[241350]: Disconnected from invalid user cristian 194.107.115.65 port 12528 [preauth]
Sep 30 17:59:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:40 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:40 compute-1 sudo[242231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usjwqvyilccoucrkrdivyhotbzztcpkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255180.1848276-315-101606637816425/AnsiballZ_group.py'
Sep 30 17:59:40 compute-1 sudo[242231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:40 compute-1 python3.9[242233]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 17:59:40 compute-1 sshd[170789]: Timeout before authentication for connection from 14.103.129.43 to 38.102.83.102, pid = 225444
Sep 30 17:59:40 compute-1 groupadd[242236]: group added to /etc/group: name=ceilometer, GID=42405
Sep 30 17:59:40 compute-1 groupadd[242236]: group added to /etc/gshadow: name=ceilometer
Sep 30 17:59:40 compute-1 groupadd[242236]: new group: name=ceilometer, GID=42405
Sep 30 17:59:40 compute-1 sudo[242231]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:40 compute-1 podman[242235]: 2025-09-30 17:59:40.918239287 +0000 UTC m=+0.093454942 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=multipathd)
Sep 30 17:59:41 compute-1 ceph-mon[75484]: pgmap v617: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:59:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:41 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:41 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:41 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003cf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:41 compute-1 sudo[242410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmiozlzdwfagyslhbclppebnbzgdlgqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255181.1105072-331-244272067686755/AnsiballZ_user.py'
Sep 30 17:59:41 compute-1 sudo[242410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 17:59:41 compute-1 python3.9[242412]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 17:59:42 compute-1 useradd[242415]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 17:59:42 compute-1 useradd[242415]: add 'ceilometer' to group 'libvirt'
Sep 30 17:59:42 compute-1 useradd[242415]: add 'ceilometer' to shadow group 'libvirt'
Sep 30 17:59:42 compute-1 ceph-mon[75484]: pgmap v618: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:42 compute-1 sudo[242410]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:42.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:42.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:42 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:43 compute-1 sshd-session[242446]: Invalid user admin from 84.51.43.58 port 35183
Sep 30 17:59:43 compute-1 sshd-session[242446]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:59:43 compute-1 sshd-session[242446]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 17:59:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:43 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:59:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:43 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:43 compute-1 python3.9[242574]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:59:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:43 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4001f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:44 compute-1 python3.9[242695]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759255183.0425515-383-209880030649728/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:44.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:44.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:44 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:44 compute-1 sshd-session[242773]: Invalid user user5 from 107.172.146.104 port 36026
Sep 30 17:59:44 compute-1 sshd-session[242773]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:59:44 compute-1 sshd-session[242773]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 17:59:44 compute-1 sshd-session[242446]: Failed password for invalid user admin from 84.51.43.58 port 35183 ssh2
Sep 30 17:59:44 compute-1 ceph-mon[75484]: pgmap v619: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:44 compute-1 python3.9[242849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:59:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:45 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:45 compute-1 sudo[242971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 17:59:45 compute-1 sudo[242971]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 17:59:45 compute-1 sudo[242971]: pam_unix(sudo:session): session closed for user root
Sep 30 17:59:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:45 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:45 compute-1 python3.9[242970]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759255184.3335016-383-162565263467147/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:45 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:46.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:46.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:46 compute-1 python3.9[243147]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:59:46 compute-1 sshd-session[242446]: Received disconnect from 84.51.43.58 port 35183:11: Bye Bye [preauth]
Sep 30 17:59:46 compute-1 sshd-session[242446]: Disconnected from invalid user admin 84.51.43.58 port 35183 [preauth]
Sep 30 17:59:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:46 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:46 compute-1 sshd-session[242773]: Failed password for invalid user user5 from 107.172.146.104 port 36026 ssh2
Sep 30 17:59:46 compute-1 ceph-mon[75484]: pgmap v620: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:46 compute-1 python3.9[243270]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759255185.667594-383-25242551125871/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:47 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:47 compute-1 sshd-session[242773]: Received disconnect from 107.172.146.104 port 36026:11: Bye Bye [preauth]
Sep 30 17:59:47 compute-1 sshd-session[242773]: Disconnected from invalid user user5 107.172.146.104 port 36026 [preauth]
Sep 30 17:59:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:47 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:47 compute-1 unix_chkpwd[243420]: password check failed for user (root)
Sep 30 17:59:47 compute-1 sshd-session[243095]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 17:59:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:47 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:47 compute-1 python3.9[243421]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:59:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 17:59:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:48.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 17:59:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:48.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:59:48 compute-1 python3.9[243574]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 17:59:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:48 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:48 compute-1 ceph-mon[75484]: pgmap v621: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:49 compute-1 python3.9[243727]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:59:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:49 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:49 compute-1 sshd-session[243095]: Failed password for root from 192.210.160.141 port 35568 ssh2
Sep 30 17:59:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:49 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:49 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003d30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:49 compute-1 python3.9[243848]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759255188.662641-501-236403320200607/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:50.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:50.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:50 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:50 compute-1 sshd-session[243095]: Connection closed by authenticating user root 192.210.160.141 port 35568 [preauth]
Sep 30 17:59:50 compute-1 python3.9[244000]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:59:50 compute-1 ceph-mon[75484]: pgmap v622: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 17:59:51 compute-1 python3.9[244077]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:51 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:51 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:51 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:51 compute-1 python3.9[244227]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:59:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:52.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:52.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:52 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:52 compute-1 python3.9[244349]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759255191.3221006-501-120808591609800/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=f3aeda92b1de7a4881364150abf82a5da4c708e0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:52 compute-1 ceph-mon[75484]: pgmap v623: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 17:59:53 compute-1 python3.9[244501]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:59:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:53 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:59:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:53 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:53 compute-1 sshd-session[243874]: Invalid user work from 113.249.93.94 port 2390
Sep 30 17:59:53 compute-1 sshd-session[243874]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:59:53 compute-1 sshd-session[243874]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=113.249.93.94
Sep 30 17:59:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:53 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:53 compute-1 podman[244614]: 2025-09-30 17:59:53.758473918 +0000 UTC m=+0.187451028 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930)
Sep 30 17:59:53 compute-1 python3.9[244625]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759255192.6526198-501-41074494101929/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:54.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 17:59:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:54.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 17:59:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:59:54.304 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 17:59:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:59:54.304 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 17:59:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 17:59:54.304 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 17:59:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:54 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:54 compute-1 python3.9[244802]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:59:54 compute-1 ceph-mon[75484]: pgmap v624: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:55 compute-1 python3.9[244924]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759255193.9743614-501-223359091987432/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:55 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0001d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:55 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:55 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:55 compute-1 sshd-session[243874]: Failed password for invalid user work from 113.249.93.94 port 2390 ssh2
Sep 30 17:59:55 compute-1 python3.9[245074]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:59:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:56.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:56.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:56 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:56 compute-1 python3.9[245196]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759255195.3845766-501-105929545325143/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:57 compute-1 ceph-mon[75484]: pgmap v625: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:57 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604008f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:57 compute-1 python3.9[245347]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:59:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:57 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:57 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:57 compute-1 sshd-session[243874]: Received disconnect from 113.249.93.94 port 2390:11: Bye Bye [preauth]
Sep 30 17:59:57 compute-1 sshd-session[243874]: Disconnected from invalid user work 113.249.93.94 port 2390 [preauth]
Sep 30 17:59:58 compute-1 python3.9[245470]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759255196.7475998-501-185123066386595/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:17:59:58.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 17:59:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 17:59:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:17:59:58.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 17:59:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 17:59:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:58 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:58 compute-1 python3.9[245621]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 17:59:58 compute-1 sshd-session[245395]: Invalid user administrator from 14.225.167.110 port 58760
Sep 30 17:59:58 compute-1 sshd-session[245395]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 17:59:58 compute-1 sshd-session[245395]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 17:59:59 compute-1 ceph-mon[75484]: pgmap v626: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 17:59:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:59 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0001d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 17:59:59 compute-1 python3.9[245743]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759255198.198657-501-207415009816326/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 17:59:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 17:59:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 17:59:59 2025: (VI_0) received an invalid passwd!
Sep 30 17:59:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 17:59:59 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:00 compute-1 ceph-mon[75484]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Sep 30 18:00:00 compute-1 python3.9[245894]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:00:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:00.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:00.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:00 compute-1 sshd-session[245395]: Failed password for invalid user administrator from 14.225.167.110 port 58760 ssh2
Sep 30 18:00:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:00 compute-1 python3.9[246015]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759255199.627867-501-152988800917381/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:00:00 compute-1 sshd-session[245395]: Received disconnect from 14.225.167.110 port 58760:11: Bye Bye [preauth]
Sep 30 18:00:00 compute-1 sshd-session[245395]: Disconnected from invalid user administrator 14.225.167.110 port 58760 [preauth]
Sep 30 18:00:01 compute-1 ceph-mon[75484]: pgmap v627: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:00:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:01 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604008f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:01 compute-1 podman[246167]: 2025-09-30 18:00:01.50754343 +0000 UTC m=+0.056898010 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Sep 30 18:00:01 compute-1 python3.9[246166]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:00:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:01 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:02 compute-1 ceph-mon[75484]: pgmap v628: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:02 compute-1 python3.9[246307]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759255201.0017922-501-35534236479333/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:00:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:02.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:02.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:02 compute-1 python3.9[246459]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:00:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:03 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0001d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:00:03 compute-1 python3.9[246580]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759255202.3189657-501-65802983623605/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:00:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:03 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:04.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:04.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:04 compute-1 python3.9[246731]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:00:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:04 compute-1 ceph-mon[75484]: pgmap v629: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:04 compute-1 python3.9[246808]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:00:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:05 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604008f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:05 compute-1 sudo[246947]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:00:05 compute-1 sudo[246947]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:00:05 compute-1 sudo[246947]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:05 compute-1 python3.9[246975]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:00:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:05 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:06 compute-1 python3.9[247059]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:00:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:06.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:06.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:06 compute-1 ceph-mon[75484]: pgmap v630: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:06 compute-1 python3.9[247211]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:00:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:07 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0001f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:07 compute-1 python3.9[247287]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:00:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:07 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:00:07 compute-1 sudo[247437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcgofrxapsgugnjczknyonfsoxpplcmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255207.605603-879-223522151814364/AnsiballZ_file.py'
Sep 30 18:00:07 compute-1 sudo[247437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:08 compute-1 python3.9[247439]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:00:08 compute-1 sudo[247437]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:08.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:08.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:00:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:08 compute-1 sudo[247612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evhhqroaxkibwpisgudtylnlleagxfie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255208.4303124-895-87306267903201/AnsiballZ_file.py'
Sep 30 18:00:08 compute-1 podman[247566]: 2025-09-30 18:00:08.789307572 +0000 UTC m=+0.065172853 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:00:08 compute-1 sudo[247612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:08 compute-1 ceph-mon[75484]: pgmap v631: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:08 compute-1 python3.9[247615]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:00:08 compute-1 sudo[247612]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:09 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604008f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:09 compute-1 sudo[247766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahuegzhsfzkkengcyuvcznkofqfujmoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255209.187839-911-255608646281833/AnsiballZ_file.py'
Sep 30 18:00:09 compute-1 sudo[247766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:09 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:09 compute-1 python3.9[247768]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 18:00:09 compute-1 sudo[247766]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:09 compute-1 sshd-session[247564]: Invalid user ftptest from 192.210.160.141 port 35922
Sep 30 18:00:09 compute-1 sshd-session[247564]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:00:09 compute-1 sshd-session[247564]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:00:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:10.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:10.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:10 compute-1 sudo[247919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnhbdtifasucytdfafixmanfejkgnqeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255210.061443-927-2100383919635/AnsiballZ_systemd_service.py'
Sep 30 18:00:10 compute-1 sudo[247919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:10 compute-1 ceph-mon[75484]: pgmap v632: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:00:10 compute-1 python3.9[247921]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 18:00:10 compute-1 systemd[1]: Reloading.
Sep 30 18:00:11 compute-1 systemd-rc-local-generator[247952]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 18:00:11 compute-1 systemd-sysv-generator[247956]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 18:00:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:11 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:11 compute-1 systemd[1]: Listening on Podman API Socket.
Sep 30 18:00:11 compute-1 sudo[247919]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:11 compute-1 podman[247963]: 2025-09-30 18:00:11.398018329 +0000 UTC m=+0.096764862 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 18:00:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:11 compute-1 sshd-session[247564]: Failed password for invalid user ftptest from 192.210.160.141 port 35922 ssh2
Sep 30 18:00:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:11 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:12 compute-1 sudo[248133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmeniqmpwlhcvzxwqvqattsblcsuexst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255211.6711922-945-68878285645890/AnsiballZ_stat.py'
Sep 30 18:00:12 compute-1 sudo[248133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:12.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:12.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:12 compute-1 python3.9[248135]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:00:12 compute-1 sudo[248133]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:12 compute-1 sudo[248257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kexjocffinswfpavbkihhiaghbmoeerm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255211.6711922-945-68878285645890/AnsiballZ_copy.py'
Sep 30 18:00:12 compute-1 sudo[248257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:12 compute-1 sshd-session[247564]: Connection closed by invalid user ftptest 192.210.160.141 port 35922 [preauth]
Sep 30 18:00:12 compute-1 ceph-mon[75484]: pgmap v633: 353 pgs: 353 active+clean; 458 KiB data, 103 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:12 compute-1 python3.9[248259]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759255211.6711922-945-68878285645890/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 18:00:12 compute-1 sudo[248257]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.059 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.059 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.060 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.060 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.060 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.060 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.061 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.061 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.061 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:00:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:13 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:00:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.579 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.579 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.580 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.580 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:00:13 compute-1 nova_compute[238822]: 2025-09-30 18:00:13.581 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:00:13 compute-1 sudo[248410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jigpzuustoeqoabyfhycftjpbkglnaqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255213.2941604-979-114885246115482/AnsiballZ_container_config_data.py'
Sep 30 18:00:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:13 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:13 compute-1 sudo[248410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:14 compute-1 python3.9[248419]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Sep 30 18:00:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:00:14 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1276608331' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:00:14 compute-1 sudo[248410]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:14 compute-1 nova_compute[238822]: 2025-09-30 18:00:14.042 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:00:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:14.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:14 compute-1 nova_compute[238822]: 2025-09-30 18:00:14.250 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:00:14 compute-1 nova_compute[238822]: 2025-09-30 18:00:14.252 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:00:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:14.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:14 compute-1 nova_compute[238822]: 2025-09-30 18:00:14.273 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:00:14 compute-1 nova_compute[238822]: 2025-09-30 18:00:14.274 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5169MB free_disk=39.9921875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:00:14 compute-1 nova_compute[238822]: 2025-09-30 18:00:14.274 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:00:14 compute-1 nova_compute[238822]: 2025-09-30 18:00:14.275 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:00:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:14 compute-1 sudo[248586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaoaromoqjdicthdfioonhasneztcash ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255214.2637627-997-213748699061450/AnsiballZ_container_config_hash.py'
Sep 30 18:00:14 compute-1 sudo[248586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:14 compute-1 ceph-mon[75484]: pgmap v634: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 72 KiB/s rd, 0 B/s wr, 119 op/s
Sep 30 18:00:14 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1276608331' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:00:14 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3527523402' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:00:14 compute-1 python3.9[248588]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 18:00:14 compute-1 sudo[248586]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:15 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:15 compute-1 nova_compute[238822]: 2025-09-30 18:00:15.321 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:00:15 compute-1 nova_compute[238822]: 2025-09-30 18:00:15.322 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:00:14 up  3:37,  0 user,  load average: 1.30, 1.45, 1.39\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:00:15 compute-1 nova_compute[238822]: 2025-09-30 18:00:15.338 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:00:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:15 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:15 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:00:15 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2491222015' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:00:15 compute-1 nova_compute[238822]: 2025-09-30 18:00:15.787 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:00:15 compute-1 nova_compute[238822]: 2025-09-30 18:00:15.793 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:00:15 compute-1 sudo[248760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqssaqsmcfkouiqxljpezbjrymnshhjc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759255215.3490064-1017-91678783972229/AnsiballZ_edpm_container_manage.py'
Sep 30 18:00:15 compute-1 sudo[248760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:15 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2491222015' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:00:16 compute-1 python3[248762]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 18:00:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:16.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:00:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:16.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:00:16 compute-1 nova_compute[238822]: 2025-09-30 18:00:16.301 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:00:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:16 compute-1 sshd[170789]: drop connection #2 from [14.103.129.43]:46466 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 18:00:16 compute-1 sshd-session[248764]: Invalid user foundry from 167.172.43.167 port 41308
Sep 30 18:00:16 compute-1 sshd-session[248764]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:00:16 compute-1 sshd-session[248764]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167
Sep 30 18:00:16 compute-1 nova_compute[238822]: 2025-09-30 18:00:16.811 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:00:16 compute-1 nova_compute[238822]: 2025-09-30 18:00:16.811 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.537s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:00:16 compute-1 ceph-mon[75484]: pgmap v635: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 72 KiB/s rd, 0 B/s wr, 119 op/s
Sep 30 18:00:16 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3651911007' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:00:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:17 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:17 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:18 compute-1 podman[248780]: 2025-09-30 18:00:18.164187334 +0000 UTC m=+1.940504980 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Sep 30 18:00:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:18.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:18.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:18 compute-1 podman[248882]: 2025-09-30 18:00:18.303394135 +0000 UTC m=+0.049978264 container create fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Sep 30 18:00:18 compute-1 podman[248882]: 2025-09-30 18:00:18.275500316 +0000 UTC m=+0.022084465 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Sep 30 18:00:18 compute-1 python3[248762]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Sep 30 18:00:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:00:18 compute-1 sudo[248760]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:18 compute-1 sshd-session[248764]: Failed password for invalid user foundry from 167.172.43.167 port 41308 ssh2
Sep 30 18:00:18 compute-1 ceph-mon[75484]: pgmap v636: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 72 KiB/s rd, 0 B/s wr, 119 op/s
Sep 30 18:00:19 compute-1 sudo[249073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccvfqqlqjopwhnaxnassoaeolcgkrtwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255218.6592908-1033-249726279856891/AnsiballZ_stat.py'
Sep 30 18:00:19 compute-1 sudo[249073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:19 compute-1 python3.9[249075]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 18:00:19 compute-1 sudo[249073]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:19 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:19 compute-1 sshd-session[248945]: Invalid user ss from 175.126.165.170 port 52120
Sep 30 18:00:19 compute-1 sshd-session[248945]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:00:19 compute-1 sshd-session[248945]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 18:00:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:19 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604008f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:19 compute-1 sudo[249227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejnwqwxanlabpjhyjfizbpikjiwjwgec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255219.5623932-1051-269921288581819/AnsiballZ_file.py'
Sep 30 18:00:19 compute-1 sudo[249227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:20 compute-1 python3.9[249229]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:00:20 compute-1 sudo[249227]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:20 compute-1 sshd-session[248764]: Received disconnect from 167.172.43.167 port 41308:11: Bye Bye [preauth]
Sep 30 18:00:20 compute-1 sshd-session[248764]: Disconnected from invalid user foundry 167.172.43.167 port 41308 [preauth]
Sep 30 18:00:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:20.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:20.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:20 compute-1 sudo[249380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsqeevpuzackggmbeybvhzfanecvzvuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255220.2373476-1051-112006037902675/AnsiballZ_copy.py'
Sep 30 18:00:20 compute-1 sudo[249380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:20 compute-1 ceph-mon[75484]: pgmap v637: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 72 KiB/s rd, 0 B/s wr, 119 op/s
Sep 30 18:00:20 compute-1 python3.9[249382]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759255220.2373476-1051-112006037902675/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:00:20 compute-1 sudo[249380]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:21 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:21 compute-1 sudo[249456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dewnctcamyfwfeukbycnqkhpocccecex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255220.2373476-1051-112006037902675/AnsiballZ_systemd.py'
Sep 30 18:00:21 compute-1 sudo[249456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:21 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:21 compute-1 python3.9[249458]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 18:00:21 compute-1 systemd[1]: Reloading.
Sep 30 18:00:21 compute-1 systemd-sysv-generator[249489]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 18:00:21 compute-1 systemd-rc-local-generator[249483]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 18:00:22 compute-1 sudo[249456]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:22.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:00:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:22.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:00:22 compute-1 sshd-session[248945]: Failed password for invalid user ss from 175.126.165.170 port 52120 ssh2
Sep 30 18:00:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:22 compute-1 sudo[249568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skozlddeoqgxddkpojofutyoerovpscl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255220.2373476-1051-112006037902675/AnsiballZ_systemd.py'
Sep 30 18:00:22 compute-1 sudo[249568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:22 compute-1 sshd-session[248945]: Received disconnect from 175.126.165.170 port 52120:11: Bye Bye [preauth]
Sep 30 18:00:22 compute-1 sshd-session[248945]: Disconnected from invalid user ss 175.126.165.170 port 52120 [preauth]
Sep 30 18:00:22 compute-1 python3.9[249570]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 18:00:22 compute-1 systemd[1]: Reloading.
Sep 30 18:00:22 compute-1 ceph-mon[75484]: pgmap v638: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 72 KiB/s rd, 0 B/s wr, 119 op/s
Sep 30 18:00:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:00:23 compute-1 systemd-sysv-generator[249605]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 18:00:23 compute-1 systemd-rc-local-generator[249599]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 18:00:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:23 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:23 compute-1 systemd[1]: Starting podman_exporter container...
Sep 30 18:00:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:00:23 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:00:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7048728cc2ea41af05ac98fd2367e6b8880771b09cd95eb90fb136684f905bf4/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 18:00:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7048728cc2ea41af05ac98fd2367e6b8880771b09cd95eb90fb136684f905bf4/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 18:00:23 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0.
Sep 30 18:00:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:23 compute-1 podman[249611]: 2025-09-30 18:00:23.485146681 +0000 UTC m=+0.163084213 container init fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:00:23 compute-1 podman_exporter[249627]: ts=2025-09-30T18:00:23.499Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Sep 30 18:00:23 compute-1 podman_exporter[249627]: ts=2025-09-30T18:00:23.499Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Sep 30 18:00:23 compute-1 podman_exporter[249627]: ts=2025-09-30T18:00:23.499Z caller=handler.go:94 level=info msg="enabled collectors"
Sep 30 18:00:23 compute-1 podman_exporter[249627]: ts=2025-09-30T18:00:23.499Z caller=handler.go:105 level=info collector=container
Sep 30 18:00:23 compute-1 podman[249611]: 2025-09-30 18:00:23.507926454 +0000 UTC m=+0.185863976 container start fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:00:23 compute-1 podman[249611]: podman_exporter
Sep 30 18:00:23 compute-1 systemd[1]: Starting Podman API Service...
Sep 30 18:00:23 compute-1 systemd[1]: Started Podman API Service.
Sep 30 18:00:23 compute-1 systemd[1]: Started podman_exporter container.
Sep 30 18:00:23 compute-1 podman[249638]: time="2025-09-30T18:00:23Z" level=info msg="/usr/bin/podman filtering at log level info"
Sep 30 18:00:23 compute-1 podman[249638]: time="2025-09-30T18:00:23Z" level=info msg="Setting parallel job count to 25"
Sep 30 18:00:23 compute-1 podman[249638]: time="2025-09-30T18:00:23Z" level=info msg="Using sqlite as database backend"
Sep 30 18:00:23 compute-1 podman[249638]: time="2025-09-30T18:00:23Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Sep 30 18:00:23 compute-1 podman[249638]: time="2025-09-30T18:00:23Z" level=info msg="Using systemd socket activation to determine API endpoint"
Sep 30 18:00:23 compute-1 podman[249638]: time="2025-09-30T18:00:23Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Sep 30 18:00:23 compute-1 sudo[249568]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:23 compute-1 podman[249638]: @ - - [30/Sep/2025:18:00:23 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Sep 30 18:00:23 compute-1 podman[249638]: time="2025-09-30T18:00:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:00:23 compute-1 podman[249636]: 2025-09-30 18:00:23.610005717 +0000 UTC m=+0.092496617 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:00:23 compute-1 systemd[1]: fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0-5cab44cb37dd6af5.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 18:00:23 compute-1 systemd[1]: fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0-5cab44cb37dd6af5.service: Failed with result 'exit-code'.
Sep 30 18:00:23 compute-1 podman[249638]: @ - - [30/Sep/2025:18:00:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 35880 "" "Go-http-client/1.1"
Sep 30 18:00:23 compute-1 podman_exporter[249627]: ts=2025-09-30T18:00:23.635Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Sep 30 18:00:23 compute-1 podman_exporter[249627]: ts=2025-09-30T18:00:23.635Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Sep 30 18:00:23 compute-1 podman_exporter[249627]: ts=2025-09-30T18:00:23.636Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Sep 30 18:00:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:23 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604008f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:23 compute-1 sshd[170789]: Timeout before authentication for connection from 14.103.129.43 to 38.102.83.102, pid = 232463
Sep 30 18:00:24 compute-1 podman[249746]: 2025-09-30 18:00:24.003884921 +0000 UTC m=+0.142394656 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:00:24 compute-1 sudo[249850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbktvtdqxmbxogsnqlrvbrdwfolvrlzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255223.7561958-1099-63102954361906/AnsiballZ_systemd.py'
Sep 30 18:00:24 compute-1 sudo[249850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:24.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:24.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:24 compute-1 python3.9[249852]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 18:00:24 compute-1 systemd[1]: Stopping podman_exporter container...
Sep 30 18:00:24 compute-1 podman[249638]: @ - - [30/Sep/2025:18:00:23 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 3444 "" "Go-http-client/1.1"
Sep 30 18:00:24 compute-1 systemd[1]: libpod-fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0.scope: Deactivated successfully.
Sep 30 18:00:24 compute-1 podman[249856]: 2025-09-30 18:00:24.616079744 +0000 UTC m=+0.053129829 container died fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:00:24 compute-1 systemd[1]: fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0-5cab44cb37dd6af5.timer: Deactivated successfully.
Sep 30 18:00:24 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0.
Sep 30 18:00:24 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0-userdata-shm.mount: Deactivated successfully.
Sep 30 18:00:24 compute-1 systemd[1]: var-lib-containers-storage-overlay-7048728cc2ea41af05ac98fd2367e6b8880771b09cd95eb90fb136684f905bf4-merged.mount: Deactivated successfully.
Sep 30 18:00:24 compute-1 podman[249856]: 2025-09-30 18:00:24.829925561 +0000 UTC m=+0.266975646 container cleanup fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:00:24 compute-1 podman[249856]: podman_exporter
Sep 30 18:00:24 compute-1 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Sep 30 18:00:24 compute-1 podman[249885]: podman_exporter
Sep 30 18:00:24 compute-1 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Sep 30 18:00:24 compute-1 systemd[1]: Stopped podman_exporter container.
Sep 30 18:00:24 compute-1 systemd[1]: Starting podman_exporter container...
Sep 30 18:00:24 compute-1 ceph-mon[75484]: pgmap v639: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 72 KiB/s rd, 0 B/s wr, 119 op/s
Sep 30 18:00:25 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:00:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7048728cc2ea41af05ac98fd2367e6b8880771b09cd95eb90fb136684f905bf4/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 18:00:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7048728cc2ea41af05ac98fd2367e6b8880771b09cd95eb90fb136684f905bf4/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 18:00:25 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0.
Sep 30 18:00:25 compute-1 podman[249898]: 2025-09-30 18:00:25.119283157 +0000 UTC m=+0.170397871 container init fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:00:25 compute-1 podman_exporter[249913]: ts=2025-09-30T18:00:25.137Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Sep 30 18:00:25 compute-1 podman_exporter[249913]: ts=2025-09-30T18:00:25.137Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Sep 30 18:00:25 compute-1 podman_exporter[249913]: ts=2025-09-30T18:00:25.138Z caller=handler.go:94 level=info msg="enabled collectors"
Sep 30 18:00:25 compute-1 podman_exporter[249913]: ts=2025-09-30T18:00:25.138Z caller=handler.go:105 level=info collector=container
Sep 30 18:00:25 compute-1 podman[249638]: @ - - [30/Sep/2025:18:00:25 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Sep 30 18:00:25 compute-1 podman[249638]: time="2025-09-30T18:00:25Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:00:25 compute-1 podman[249898]: 2025-09-30 18:00:25.151531464 +0000 UTC m=+0.202646148 container start fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:00:25 compute-1 podman[249898]: podman_exporter
Sep 30 18:00:25 compute-1 systemd[1]: Started podman_exporter container.
Sep 30 18:00:25 compute-1 podman[249638]: @ - - [30/Sep/2025:18:00:25 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 35882 "" "Go-http-client/1.1"
Sep 30 18:00:25 compute-1 sudo[249850]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:25 compute-1 podman_exporter[249913]: ts=2025-09-30T18:00:25.225Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Sep 30 18:00:25 compute-1 podman_exporter[249913]: ts=2025-09-30T18:00:25.225Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Sep 30 18:00:25 compute-1 podman_exporter[249913]: ts=2025-09-30T18:00:25.226Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Sep 30 18:00:25 compute-1 podman[249922]: 2025-09-30 18:00:25.248824508 +0000 UTC m=+0.089416464 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:00:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:25 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:25 compute-1 sudo[249950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:00:25 compute-1 sudo[249950]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:00:25 compute-1 sudo[249950]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:25 compute-1 sudo[249997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:00:25 compute-1 sudo[249997]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:00:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:25 compute-1 sudo[250058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:00:25 compute-1 sudo[250058]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:00:25 compute-1 sudo[250058]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:25 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:25 compute-1 sudo[250186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukddyljadmkehjxuuuxsmbnglcbpathz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255225.4178414-1115-123852152369425/AnsiballZ_stat.py'
Sep 30 18:00:25 compute-1 sudo[250186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:25 compute-1 python3.9[250188]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:00:25 compute-1 sudo[250186]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:26 compute-1 sudo[249997]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:26.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:26.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:26 compute-1 sudo[250327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehmjyjrkohksscczbspzjjrappmsmkfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255225.4178414-1115-123852152369425/AnsiballZ_copy.py'
Sep 30 18:00:26 compute-1 sudo[250327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:26 compute-1 python3.9[250329]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759255225.4178414-1115-123852152369425/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 18:00:26 compute-1 sudo[250327]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:26 compute-1 ceph-mon[75484]: pgmap v640: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:00:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:00:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:00:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:00:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:00:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:00:26 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:00:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:27 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:27 compute-1 sudo[250480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utealxluumrlmbohdijwfwefedvdpawf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255226.9397714-1149-107682255992041/AnsiballZ_container_config_data.py'
Sep 30 18:00:27 compute-1 sudo[250480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:27 compute-1 python3.9[250482]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Sep 30 18:00:27 compute-1 sudo[250480]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:27 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3604008f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:28 compute-1 sudo[250633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tboohfarqfabtzapyfawaznoujsyfbgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255227.8132915-1167-124464743188479/AnsiballZ_container_config_hash.py'
Sep 30 18:00:28 compute-1 sudo[250633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:28.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:00:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:28.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:00:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:00:28 compute-1 python3.9[250635]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 18:00:28 compute-1 sudo[250633]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:29 compute-1 ceph-mon[75484]: pgmap v641: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:29 compute-1 sudo[250786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clgrvvqilicoedyovyxhyjxpgmlxazem ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759255228.832629-1187-104288692052794/AnsiballZ_edpm_container_manage.py'
Sep 30 18:00:29 compute-1 sudo[250786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:29 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:29 compute-1 python3[250788]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 18:00:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:29 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:30.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:30.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:31 compute-1 ceph-mon[75484]: pgmap v642: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:00:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:00:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:00:31 compute-1 sudo[250845]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:00:31 compute-1 sudo[250845]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:00:31 compute-1 sudo[250845]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:31 compute-1 podman[250886]: 2025-09-30 18:00:31.918590682 +0000 UTC m=+0.158362336 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 18:00:32 compute-1 podman[250801]: 2025-09-30 18:00:32.018699573 +0000 UTC m=+2.447338071 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 18:00:32 compute-1 podman[250942]: 2025-09-30 18:00:32.212348207 +0000 UTC m=+0.062847610 container create 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Sep 30 18:00:32 compute-1 podman[250942]: 2025-09-30 18:00:32.181295323 +0000 UTC m=+0.031794716 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 18:00:32 compute-1 python3[250788]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 18:00:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:32.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:32.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:32 compute-1 sudo[250786]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:33 compute-1 ceph-mon[75484]: pgmap v643: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:33 compute-1 sudo[251133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfglqsnmgfbgrwtzrwxuslseidothwag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255232.767907-1203-280344380209022/AnsiballZ_stat.py'
Sep 30 18:00:33 compute-1 sudo[251133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:33 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:33 compute-1 python3.9[251135]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 18:00:33 compute-1 sudo[251133]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:00:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:33 compute-1 unix_chkpwd[251208]: password check failed for user (root)
Sep 30 18:00:33 compute-1 sshd-session[250919]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:00:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:33 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:34 compute-1 sudo[251289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbtixfjuhlaihnknnzfdbnqzicgbastc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255233.6837597-1221-2393124684282/AnsiballZ_file.py'
Sep 30 18:00:34 compute-1 sudo[251289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:34.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:34.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:34 compute-1 python3.9[251291]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:00:34 compute-1 sudo[251289]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:34 compute-1 sudo[251441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inkcqhccyrilqlumkqhgozgtwwybgvls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255234.4004548-1221-821041033518/AnsiballZ_copy.py'
Sep 30 18:00:34 compute-1 sudo[251441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:35 compute-1 ceph-mon[75484]: pgmap v644: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:35 compute-1 python3.9[251443]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759255234.4004548-1221-821041033518/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:00:35 compute-1 sudo[251441]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:35 compute-1 sshd-session[250919]: Failed password for root from 192.210.160.141 port 53906 ssh2
Sep 30 18:00:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:35 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:35 compute-1 sudo[251518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqwxojqlgaqbfikqvvdldknbptmsmwts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255234.4004548-1221-821041033518/AnsiballZ_systemd.py'
Sep 30 18:00:35 compute-1 sudo[251518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:35 compute-1 python3.9[251520]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 18:00:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:35 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:35 compute-1 systemd[1]: Reloading.
Sep 30 18:00:35 compute-1 systemd-rc-local-generator[251546]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 18:00:35 compute-1 systemd-sysv-generator[251551]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 18:00:36 compute-1 ceph-mon[75484]: pgmap v645: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:36 compute-1 sudo[251518]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:36.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:36.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:36 compute-1 sudo[251630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trslfdkewqkftmhqzctspkobvjzccryy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255234.4004548-1221-821041033518/AnsiballZ_systemd.py'
Sep 30 18:00:36 compute-1 sudo[251630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:36 compute-1 python3.9[251633]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 18:00:36 compute-1 systemd[1]: Reloading.
Sep 30 18:00:36 compute-1 sshd-session[250919]: Connection closed by authenticating user root 192.210.160.141 port 53906 [preauth]
Sep 30 18:00:37 compute-1 systemd-rc-local-generator[251664]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 18:00:37 compute-1 systemd-sysv-generator[251668]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 18:00:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:37 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:00:37 compute-1 systemd[1]: Starting openstack_network_exporter container...
Sep 30 18:00:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:37 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:00:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ebe93e81c0b10197f91be46f4fc8c70bd5cd142edfc56d3ac87f4e6dda07fe5/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 18:00:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ebe93e81c0b10197f91be46f4fc8c70bd5cd142edfc56d3ac87f4e6dda07fe5/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 18:00:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ebe93e81c0b10197f91be46f4fc8c70bd5cd142edfc56d3ac87f4e6dda07fe5/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 18:00:37 compute-1 sshd-session[251444]: Invalid user ca from 101.126.25.120 port 51802
Sep 30 18:00:37 compute-1 sshd-session[251444]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:00:37 compute-1 sshd-session[251444]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.126.25.120
Sep 30 18:00:37 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581.
Sep 30 18:00:37 compute-1 podman[251674]: 2025-09-30 18:00:37.564101161 +0000 UTC m=+0.197003348 container init 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, container_name=openstack_network_exporter, version=9.6, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Sep 30 18:00:37 compute-1 openstack_network_exporter[251690]: INFO    18:00:37 main.go:48: registering *bridge.Collector
Sep 30 18:00:37 compute-1 openstack_network_exporter[251690]: INFO    18:00:37 main.go:48: registering *coverage.Collector
Sep 30 18:00:37 compute-1 openstack_network_exporter[251690]: INFO    18:00:37 main.go:48: registering *datapath.Collector
Sep 30 18:00:37 compute-1 openstack_network_exporter[251690]: INFO    18:00:37 main.go:48: registering *iface.Collector
Sep 30 18:00:37 compute-1 openstack_network_exporter[251690]: INFO    18:00:37 main.go:48: registering *memory.Collector
Sep 30 18:00:37 compute-1 openstack_network_exporter[251690]: INFO    18:00:37 main.go:48: registering *ovnnorthd.Collector
Sep 30 18:00:37 compute-1 openstack_network_exporter[251690]: INFO    18:00:37 main.go:48: registering *ovn.Collector
Sep 30 18:00:37 compute-1 openstack_network_exporter[251690]: INFO    18:00:37 main.go:48: registering *ovsdbserver.Collector
Sep 30 18:00:37 compute-1 openstack_network_exporter[251690]: INFO    18:00:37 main.go:48: registering *pmd_perf.Collector
Sep 30 18:00:37 compute-1 openstack_network_exporter[251690]: INFO    18:00:37 main.go:48: registering *pmd_rxq.Collector
Sep 30 18:00:37 compute-1 openstack_network_exporter[251690]: INFO    18:00:37 main.go:48: registering *vswitch.Collector
Sep 30 18:00:37 compute-1 openstack_network_exporter[251690]: NOTICE  18:00:37 main.go:76: listening on https://:9105/metrics
Sep 30 18:00:37 compute-1 podman[251674]: 2025-09-30 18:00:37.604319519 +0000 UTC m=+0.237221716 container start 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:00:37 compute-1 podman[251674]: openstack_network_exporter
Sep 30 18:00:37 compute-1 systemd[1]: Started openstack_network_exporter container.
Sep 30 18:00:37 compute-1 sudo[251630]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:37 compute-1 podman[251700]: 2025-09-30 18:00:37.730987954 +0000 UTC m=+0.109095471 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git)
Sep 30 18:00:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:37 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d4003a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:38.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:38.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:38 compute-1 sudo[251872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doiyaxdmkxqdnffncgvzoyendhgfejff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255237.9327812-1269-219014944029781/AnsiballZ_systemd.py'
Sep 30 18:00:38 compute-1 ceph-mon[75484]: pgmap v646: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:38 compute-1 sudo[251872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:00:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:38 compute-1 python3.9[251874]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 18:00:38 compute-1 systemd[1]: Stopping openstack_network_exporter container...
Sep 30 18:00:38 compute-1 systemd[1]: libpod-53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581.scope: Deactivated successfully.
Sep 30 18:00:38 compute-1 podman[251879]: 2025-09-30 18:00:38.764850792 +0000 UTC m=+0.050146397 container died 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, version=9.6, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64)
Sep 30 18:00:38 compute-1 systemd[1]: 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581-6ca90b01f8eb4648.timer: Deactivated successfully.
Sep 30 18:00:38 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581.
Sep 30 18:00:38 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581-userdata-shm.mount: Deactivated successfully.
Sep 30 18:00:38 compute-1 systemd[1]: var-lib-containers-storage-overlay-3ebe93e81c0b10197f91be46f4fc8c70bd5cd142edfc56d3ac87f4e6dda07fe5-merged.mount: Deactivated successfully.
Sep 30 18:00:38 compute-1 podman[251907]: 2025-09-30 18:00:38.918966459 +0000 UTC m=+0.080939209 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:00:38 compute-1 unix_chkpwd[251928]: password check failed for user (root)
Sep 30 18:00:38 compute-1 sshd-session[251895]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104  user=root
Sep 30 18:00:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:39 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:39 compute-1 podman[251879]: 2025-09-30 18:00:39.553451038 +0000 UTC m=+0.838746643 container cleanup 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:00:39 compute-1 podman[251879]: openstack_network_exporter
Sep 30 18:00:39 compute-1 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Sep 30 18:00:39 compute-1 podman[251929]: openstack_network_exporter
Sep 30 18:00:39 compute-1 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Sep 30 18:00:39 compute-1 systemd[1]: Stopped openstack_network_exporter container.
Sep 30 18:00:39 compute-1 systemd[1]: Starting openstack_network_exporter container...
Sep 30 18:00:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:39 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:39 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:00:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ebe93e81c0b10197f91be46f4fc8c70bd5cd142edfc56d3ac87f4e6dda07fe5/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 18:00:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ebe93e81c0b10197f91be46f4fc8c70bd5cd142edfc56d3ac87f4e6dda07fe5/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 18:00:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ebe93e81c0b10197f91be46f4fc8c70bd5cd142edfc56d3ac87f4e6dda07fe5/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 18:00:39 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581.
Sep 30 18:00:39 compute-1 podman[251942]: 2025-09-30 18:00:39.849823393 +0000 UTC m=+0.158375924 container init 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 18:00:39 compute-1 sshd-session[251444]: Failed password for invalid user ca from 101.126.25.120 port 51802 ssh2
Sep 30 18:00:39 compute-1 openstack_network_exporter[251957]: INFO    18:00:39 main.go:48: registering *bridge.Collector
Sep 30 18:00:39 compute-1 openstack_network_exporter[251957]: INFO    18:00:39 main.go:48: registering *coverage.Collector
Sep 30 18:00:39 compute-1 openstack_network_exporter[251957]: INFO    18:00:39 main.go:48: registering *datapath.Collector
Sep 30 18:00:39 compute-1 openstack_network_exporter[251957]: INFO    18:00:39 main.go:48: registering *iface.Collector
Sep 30 18:00:39 compute-1 openstack_network_exporter[251957]: INFO    18:00:39 main.go:48: registering *memory.Collector
Sep 30 18:00:39 compute-1 openstack_network_exporter[251957]: INFO    18:00:39 main.go:48: registering *ovnnorthd.Collector
Sep 30 18:00:39 compute-1 openstack_network_exporter[251957]: INFO    18:00:39 main.go:48: registering *ovn.Collector
Sep 30 18:00:39 compute-1 openstack_network_exporter[251957]: INFO    18:00:39 main.go:48: registering *ovsdbserver.Collector
Sep 30 18:00:39 compute-1 openstack_network_exporter[251957]: INFO    18:00:39 main.go:48: registering *pmd_perf.Collector
Sep 30 18:00:39 compute-1 openstack_network_exporter[251957]: INFO    18:00:39 main.go:48: registering *pmd_rxq.Collector
Sep 30 18:00:39 compute-1 openstack_network_exporter[251957]: INFO    18:00:39 main.go:48: registering *vswitch.Collector
Sep 30 18:00:39 compute-1 openstack_network_exporter[251957]: NOTICE  18:00:39 main.go:76: listening on https://:9105/metrics
Sep 30 18:00:39 compute-1 podman[251942]: 2025-09-30 18:00:39.883718419 +0000 UTC m=+0.192270900 container start 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Sep 30 18:00:39 compute-1 podman[251942]: openstack_network_exporter
Sep 30 18:00:39 compute-1 systemd[1]: Started openstack_network_exporter container.
Sep 30 18:00:39 compute-1 sudo[251872]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:39 compute-1 podman[251968]: 2025-09-30 18:00:39.990033654 +0000 UTC m=+0.089665825 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, container_name=openstack_network_exporter)
Sep 30 18:00:40 compute-1 sshd-session[251444]: Received disconnect from 101.126.25.120 port 51802:11: Bye Bye [preauth]
Sep 30 18:00:40 compute-1 sshd-session[251444]: Disconnected from invalid user ca 101.126.25.120 port 51802 [preauth]
Sep 30 18:00:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:40.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:40.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:40 compute-1 sudo[252139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drhmdueamiaplkypshplfrhfrwrutksg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255240.1491203-1285-107952660079808/AnsiballZ_find.py'
Sep 30 18:00:40 compute-1 sudo[252139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:00:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:40 compute-1 python3.9[252141]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 18:00:40 compute-1 sudo[252139]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:40 compute-1 ceph-mon[75484]: pgmap v647: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:00:41 compute-1 sshd-session[251895]: Failed password for root from 107.172.146.104 port 38508 ssh2
Sep 30 18:00:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:41 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:41 compute-1 podman[252167]: 2025-09-30 18:00:41.532762884 +0000 UTC m=+0.073969641 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Sep 30 18:00:41 compute-1 sshd-session[251895]: Received disconnect from 107.172.146.104 port 38508:11: Bye Bye [preauth]
Sep 30 18:00:41 compute-1 sshd-session[251895]: Disconnected from authenticating user root 107.172.146.104 port 38508 [preauth]
Sep 30 18:00:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:41 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:42.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:42.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:42 compute-1 ceph-mon[75484]: pgmap v648: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:43 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:00:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:43 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:44 compute-1 sshd-session[252189]: Invalid user geoserver from 194.107.115.65 port 37002
Sep 30 18:00:44 compute-1 sshd-session[252189]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:00:44 compute-1 sshd-session[252189]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 18:00:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:44.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:44.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:44 compute-1 ceph-mon[75484]: pgmap v649: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:45 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:45 compute-1 sudo[252196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:00:45 compute-1 sudo[252196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:00:45 compute-1 sudo[252196]: pam_unix(sudo:session): session closed for user root
Sep 30 18:00:45 compute-1 sshd-session[252189]: Failed password for invalid user geoserver from 194.107.115.65 port 37002 ssh2
Sep 30 18:00:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:45 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:46.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:46.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:46 compute-1 ceph-mon[75484]: pgmap v650: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:47 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:47 compute-1 sshd-session[252189]: Received disconnect from 194.107.115.65 port 37002:11: Bye Bye [preauth]
Sep 30 18:00:47 compute-1 sshd-session[252189]: Disconnected from invalid user geoserver 194.107.115.65 port 37002 [preauth]
Sep 30 18:00:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:47 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:48.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:48.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:00:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:48 compute-1 ceph-mon[75484]: pgmap v651: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:49 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:49 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:50.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:50.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:50 compute-1 ceph-mon[75484]: pgmap v652: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:00:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:51 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:51 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:52.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:52.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:52 compute-1 ceph-mon[75484]: pgmap v653: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:00:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:53 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:00:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:53 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:54.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:00:54.307 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:00:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:00:54.311 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:00:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:00:54.311 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:00:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:00:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:54.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:00:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:54 compute-1 podman[252233]: 2025-09-30 18:00:54.579316773 +0000 UTC m=+0.121003963 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Sep 30 18:00:54 compute-1 sshd-session[252229]: Invalid user wifi from 103.153.190.105 port 56126
Sep 30 18:00:54 compute-1 sshd-session[252229]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:00:54 compute-1 sshd-session[252229]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:00:54 compute-1 ceph-mon[75484]: pgmap v654: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:55 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:55 compute-1 podman[252261]: 2025-09-30 18:00:55.498525491 +0000 UTC m=+0.053879018 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:00:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:55 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:56.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:56.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:56 compute-1 ceph-mon[75484]: pgmap v655: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:57 compute-1 sshd-session[252229]: Failed password for invalid user wifi from 103.153.190.105 port 56126 ssh2
Sep 30 18:00:57 compute-1 sshd-session[252229]: Received disconnect from 103.153.190.105 port 56126:11: Bye Bye [preauth]
Sep 30 18:00:57 compute-1 sshd-session[252229]: Disconnected from invalid user wifi 103.153.190.105 port 56126 [preauth]
Sep 30 18:00:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:57 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:57 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:57 compute-1 unix_chkpwd[252292]: password check failed for user (root)
Sep 30 18:00:57 compute-1 sshd-session[252289]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:00:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:00:58.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:00:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:00:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:00:58.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:00:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:00:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:58 compute-1 ceph-mon[75484]: pgmap v656: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:00:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:59 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:00:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:00:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:00:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:00:59 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:00:59 compute-1 sshd-session[252289]: Failed password for root from 192.210.160.141 port 60644 ssh2
Sep 30 18:01:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:00.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:00.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:00 compute-1 sshd-session[252295]: Invalid user superadmin from 84.51.43.58 port 38660
Sep 30 18:01:00 compute-1 sshd-session[252295]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:01:00 compute-1 sshd-session[252295]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:01:00 compute-1 ceph-mon[75484]: pgmap v657: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:01:01 compute-1 sshd-session[252289]: Connection closed by authenticating user root 192.210.160.141 port 60644 [preauth]
Sep 30 18:01:01 compute-1 CROND[252300]: (root) CMD (run-parts /etc/cron.hourly)
Sep 30 18:01:01 compute-1 run-parts[252303]: (/etc/cron.hourly) starting 0anacron
Sep 30 18:01:01 compute-1 run-parts[252309]: (/etc/cron.hourly) finished 0anacron
Sep 30 18:01:01 compute-1 CROND[252299]: (root) CMDEND (run-parts /etc/cron.hourly)
Sep 30 18:01:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:01 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:01 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:02.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:02.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:02 compute-1 sudo[252447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsmkxadfhkgplyuzuxiruqsyufvtfytz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255262.097448-1501-73443507624863/AnsiballZ_podman_container_info.py'
Sep 30 18:01:02 compute-1 sudo[252447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:02 compute-1 podman[252410]: 2025-09-30 18:01:02.575156681 +0000 UTC m=+0.128592618 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Sep 30 18:01:02 compute-1 sshd-session[252295]: Failed password for invalid user superadmin from 84.51.43.58 port 38660 ssh2
Sep 30 18:01:02 compute-1 python3.9[252458]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Sep 30 18:01:02 compute-1 sudo[252447]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:02 compute-1 ceph-mon[75484]: pgmap v658: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:03 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:01:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:03 compute-1 sudo[252625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hshacdjyjcsprkaeikkrexzmbvpiqqfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255263.1342623-1509-241958427104618/AnsiballZ_podman_container_exec.py'
Sep 30 18:01:03 compute-1 sudo[252625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:03 compute-1 python3.9[252627]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 18:01:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:03 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:03 compute-1 systemd[1]: Started libpod-conmon-93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc.scope.
Sep 30 18:01:03 compute-1 podman[252628]: 2025-09-30 18:01:03.919352511 +0000 UTC m=+0.125672640 container exec 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:01:03 compute-1 podman[252628]: 2025-09-30 18:01:03.96036616 +0000 UTC m=+0.166686219 container exec_died 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 18:01:04 compute-1 systemd[1]: libpod-conmon-93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc.scope: Deactivated successfully.
Sep 30 18:01:04 compute-1 sudo[252625]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:04.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:04.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:04 compute-1 sshd-session[252295]: Received disconnect from 84.51.43.58 port 38660:11: Bye Bye [preauth]
Sep 30 18:01:04 compute-1 sshd-session[252295]: Disconnected from invalid user superadmin 84.51.43.58 port 38660 [preauth]
Sep 30 18:01:04 compute-1 sudo[252811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gekbuxsgpgnysqwnbtjqbrjfejihrwta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255264.2573938-1517-211314687013317/AnsiballZ_podman_container_exec.py'
Sep 30 18:01:04 compute-1 sudo[252811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:04 compute-1 unix_chkpwd[252815]: password check failed for user (root)
Sep 30 18:01:04 compute-1 sshd-session[252599]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110  user=root
Sep 30 18:01:04 compute-1 python3.9[252813]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 18:01:04 compute-1 systemd[1]: Started libpod-conmon-93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc.scope.
Sep 30 18:01:04 compute-1 podman[252816]: 2025-09-30 18:01:04.953402665 +0000 UTC m=+0.109128092 container exec 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=watcher_latest)
Sep 30 18:01:04 compute-1 podman[252816]: 2025-09-30 18:01:04.990189929 +0000 UTC m=+0.145915386 container exec_died 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4)
Sep 30 18:01:05 compute-1 systemd[1]: libpod-conmon-93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc.scope: Deactivated successfully.
Sep 30 18:01:05 compute-1 sudo[252811]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:05 compute-1 ceph-mon[75484]: pgmap v659: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:05 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:05 compute-1 sudo[252998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txjknvojcbuxykhxdtyclvveqrnumqvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255265.2675674-1525-92395007218417/AnsiballZ_file.py'
Sep 30 18:01:05 compute-1 sudo[252998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:05 compute-1 sudo[253001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:01:05 compute-1 sudo[253001]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:01:05 compute-1 sudo[253001]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:05 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:05 compute-1 python3.9[253000]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:05 compute-1 sudo[252998]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:06 compute-1 ceph-mon[75484]: pgmap v660: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:06.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:06.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:06 compute-1 sshd-session[252287]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:01:06 compute-1 sshd-session[252287]: banner exchange: Connection from 113.249.93.94 port 16818: Connection timed out
Sep 30 18:01:06 compute-1 sudo[253176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrmnjvuonmnunqndxipuputhjcbizuhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255266.1301873-1534-4896306322159/AnsiballZ_podman_container_info.py'
Sep 30 18:01:06 compute-1 sudo[253176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:06 compute-1 sshd-session[252599]: Failed password for root from 14.225.167.110 port 37330 ssh2
Sep 30 18:01:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:06 compute-1 python3.9[253178]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Sep 30 18:01:06 compute-1 sudo[253176]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:07 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:07 compute-1 sudo[253342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjcszhwcqadjkwkafiqpfdbyiewaecyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255266.946128-1542-165111363123390/AnsiballZ_podman_container_exec.py'
Sep 30 18:01:07 compute-1 sudo[253342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:01:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:07 compute-1 python3.9[253344]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 18:01:07 compute-1 systemd[1]: Started libpod-conmon-64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09.scope.
Sep 30 18:01:07 compute-1 podman[253345]: 2025-09-30 18:01:07.637818837 +0000 UTC m=+0.105741150 container exec 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest)
Sep 30 18:01:07 compute-1 podman[253345]: 2025-09-30 18:01:07.673173133 +0000 UTC m=+0.141095396 container exec_died 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Sep 30 18:01:07 compute-1 sshd-session[252599]: Received disconnect from 14.225.167.110 port 37330:11: Bye Bye [preauth]
Sep 30 18:01:07 compute-1 sshd-session[252599]: Disconnected from authenticating user root 14.225.167.110 port 37330 [preauth]
Sep 30 18:01:07 compute-1 sudo[253342]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:07 compute-1 systemd[1]: libpod-conmon-64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09.scope: Deactivated successfully.
Sep 30 18:01:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:07 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:08 compute-1 sudo[253528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkikzwxztdshuqkgquylaunvohqbttdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255267.9256482-1550-73540062560606/AnsiballZ_podman_container_exec.py'
Sep 30 18:01:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:08.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:08 compute-1 sudo[253528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:08.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:08 compute-1 ceph-mon[75484]: pgmap v661: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:01:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:08 compute-1 python3.9[253530]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 18:01:08 compute-1 systemd[1]: Started libpod-conmon-64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09.scope.
Sep 30 18:01:08 compute-1 podman[253531]: 2025-09-30 18:01:08.688700856 +0000 UTC m=+0.101459755 container exec 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent)
Sep 30 18:01:08 compute-1 podman[253531]: 2025-09-30 18:01:08.721449382 +0000 UTC m=+0.134208281 container exec_died 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent)
Sep 30 18:01:08 compute-1 sudo[253528]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:08 compute-1 systemd[1]: libpod-conmon-64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09.scope: Deactivated successfully.
Sep 30 18:01:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:09 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:09 compute-1 sudo[253715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obenyxmtvjqqtwvbrwaxurmvqmbuxhry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255269.037888-1558-181027888285959/AnsiballZ_file.py'
Sep 30 18:01:09 compute-1 sudo[253715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:09 compute-1 python3.9[253717]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:09 compute-1 sudo[253715]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:09 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:10 compute-1 sudo[253899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xawjjqjnviyqjngnarfpobdppkzbrtux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255269.8927188-1567-281047804688777/AnsiballZ_podman_container_info.py'
Sep 30 18:01:10 compute-1 sudo[253899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:10 compute-1 podman[253842]: 2025-09-30 18:01:10.299471695 +0000 UTC m=+0.089428759 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 18:01:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:10.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:10 compute-1 podman[253843]: 2025-09-30 18:01:10.320252837 +0000 UTC m=+0.112640917 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Sep 30 18:01:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:10.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:10 compute-1 python3.9[253910]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Sep 30 18:01:10 compute-1 sudo[253899]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:10 compute-1 ceph-mon[75484]: pgmap v662: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:01:11 compute-1 sudo[254075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-somldcjdcqfsyacgecbquxbapnypiapp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255270.8145037-1575-272479041207278/AnsiballZ_podman_container_exec.py'
Sep 30 18:01:11 compute-1 sudo[254075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:11 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:11 compute-1 python3.9[254077]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 18:01:11 compute-1 systemd[1]: Started libpod-conmon-0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff.scope.
Sep 30 18:01:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:11 compute-1 podman[254078]: 2025-09-30 18:01:11.501702316 +0000 UTC m=+0.108717431 container exec 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:01:11 compute-1 podman[254078]: 2025-09-30 18:01:11.538512152 +0000 UTC m=+0.145527297 container exec_died 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:01:11 compute-1 systemd[1]: libpod-conmon-0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff.scope: Deactivated successfully.
Sep 30 18:01:11 compute-1 sudo[254075]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:11 compute-1 podman[254111]: 2025-09-30 18:01:11.710464452 +0000 UTC m=+0.088545026 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 18:01:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:11 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:12 compute-1 sudo[254281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wylwngpmdkwnouhusvguwtooetruveix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255271.7931647-1583-22474999035860/AnsiballZ_podman_container_exec.py'
Sep 30 18:01:12 compute-1 sudo[254281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:12.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:12 compute-1 python3.9[254283]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 18:01:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:12.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:12 compute-1 systemd[1]: Started libpod-conmon-0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff.scope.
Sep 30 18:01:12 compute-1 podman[254284]: 2025-09-30 18:01:12.459897908 +0000 UTC m=+0.099006328 container exec 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 18:01:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:12 compute-1 podman[254284]: 2025-09-30 18:01:12.499071588 +0000 UTC m=+0.138179938 container exec_died 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid)
Sep 30 18:01:12 compute-1 systemd[1]: libpod-conmon-0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff.scope: Deactivated successfully.
Sep 30 18:01:12 compute-1 sudo[254281]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:12 compute-1 ceph-mon[75484]: pgmap v663: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:13 compute-1 sudo[254465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tllospxlfpoesjpsjntvylewoqhznten ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255272.7402537-1591-181536678937901/AnsiballZ_file.py'
Sep 30 18:01:13 compute-1 sudo[254465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:13 compute-1 python3.9[254467]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:13 compute-1 sudo[254465]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:13 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f360400a880 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:01:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:13 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d0001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:13 compute-1 sudo[254618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbygjhctuwjhxuospxofhvzwczgjpkia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255273.498802-1600-241187833355433/AnsiballZ_podman_container_info.py'
Sep 30 18:01:13 compute-1 sudo[254618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:14 compute-1 python3.9[254620]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Sep 30 18:01:14 compute-1 sudo[254618]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:14.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:14.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:14 compute-1 sudo[254785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvzchldmkzifonubxkxrsmiipdmfrcwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255274.4607148-1608-197062166338058/AnsiballZ_podman_container_exec.py'
Sep 30 18:01:14 compute-1 sudo[254785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:14 compute-1 ceph-mon[75484]: pgmap v664: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:14 compute-1 python3.9[254787]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 18:01:15 compute-1 systemd[1]: Started libpod-conmon-84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7.scope.
Sep 30 18:01:15 compute-1 podman[254788]: 2025-09-30 18:01:15.119925232 +0000 UTC m=+0.108045123 container exec 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 18:01:15 compute-1 podman[254808]: 2025-09-30 18:01:15.212017422 +0000 UTC m=+0.068906224 container exec_died 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.build-date=20250930, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 18:01:15 compute-1 podman[254788]: 2025-09-30 18:01:15.219127174 +0000 UTC m=+0.207247055 container exec_died 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, tcib_managed=true)
Sep 30 18:01:15 compute-1 systemd[1]: libpod-conmon-84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7.scope: Deactivated successfully.
Sep 30 18:01:15 compute-1 sudo[254785]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:15 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:15 compute-1 sudo[254973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxusbpilsirgfgfhfqhjmxcqyxzjvqlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255275.453167-1616-253164770310146/AnsiballZ_podman_container_exec.py'
Sep 30 18:01:15 compute-1 sudo[254973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:15 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:16 compute-1 python3.9[254975]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 18:01:16 compute-1 systemd[1]: Started libpod-conmon-84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7.scope.
Sep 30 18:01:16 compute-1 podman[254977]: 2025-09-30 18:01:16.136908104 +0000 UTC m=+0.105163145 container exec 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 18:01:16 compute-1 podman[254977]: 2025-09-30 18:01:16.176164095 +0000 UTC m=+0.144419116 container exec_died 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 18:01:16 compute-1 systemd[1]: libpod-conmon-84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7.scope: Deactivated successfully.
Sep 30 18:01:16 compute-1 sudo[254973]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:16.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:16.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:16 compute-1 nova_compute[238822]: 2025-09-30 18:01:16.805 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:01:16 compute-1 nova_compute[238822]: 2025-09-30 18:01:16.805 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:01:16 compute-1 ceph-mon[75484]: pgmap v665: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:16 compute-1 sudo[255159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qipuzwetopfuvnjcgpdzcbmnawkgyqrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255276.4549377-1624-50900383276778/AnsiballZ_file.py'
Sep 30 18:01:16 compute-1 sudo[255159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:17 compute-1 python3.9[255161]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:17 compute-1 sudo[255159]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:17 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:17 compute-1 nova_compute[238822]: 2025-09-30 18:01:17.321 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:01:17 compute-1 nova_compute[238822]: 2025-09-30 18:01:17.322 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:01:17 compute-1 nova_compute[238822]: 2025-09-30 18:01:17.322 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:01:17 compute-1 nova_compute[238822]: 2025-09-30 18:01:17.322 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:01:17 compute-1 nova_compute[238822]: 2025-09-30 18:01:17.322 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:01:17 compute-1 nova_compute[238822]: 2025-09-30 18:01:17.323 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:01:17 compute-1 nova_compute[238822]: 2025-09-30 18:01:17.323 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:01:17 compute-1 nova_compute[238822]: 2025-09-30 18:01:17.324 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:01:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:17 compute-1 sudo[255311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifydlmieuxgaqpjtjabnisspwiabcyez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255277.3573108-1633-153774486314031/AnsiballZ_podman_container_info.py'
Sep 30 18:01:17 compute-1 sudo[255311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:17 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d00045b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:17 compute-1 nova_compute[238822]: 2025-09-30 18:01:17.838 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:01:17 compute-1 nova_compute[238822]: 2025-09-30 18:01:17.839 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:01:17 compute-1 nova_compute[238822]: 2025-09-30 18:01:17.839 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:01:17 compute-1 nova_compute[238822]: 2025-09-30 18:01:17.840 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:01:17 compute-1 nova_compute[238822]: 2025-09-30 18:01:17.840 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:01:17 compute-1 sshd-session[254468]: Connection closed by 101.126.25.120 port 42186 [preauth]
Sep 30 18:01:17 compute-1 python3.9[255313]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Sep 30 18:01:18 compute-1 sudo[255311]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:18.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:01:18 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1893667914' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:01:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:18.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:18 compute-1 nova_compute[238822]: 2025-09-30 18:01:18.360 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:01:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:01:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:18 compute-1 nova_compute[238822]: 2025-09-30 18:01:18.591 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:01:18 compute-1 nova_compute[238822]: 2025-09-30 18:01:18.593 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:01:18 compute-1 nova_compute[238822]: 2025-09-30 18:01:18.625 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:01:18 compute-1 nova_compute[238822]: 2025-09-30 18:01:18.626 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5132MB free_disk=39.9921875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:01:18 compute-1 nova_compute[238822]: 2025-09-30 18:01:18.627 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:01:18 compute-1 nova_compute[238822]: 2025-09-30 18:01:18.628 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:01:18 compute-1 sudo[255500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhdzzpwaprbksqwjvpcggwmuzmugtoyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255278.3168721-1641-121696752584132/AnsiballZ_podman_container_exec.py'
Sep 30 18:01:18 compute-1 sudo[255500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:18 compute-1 ceph-mon[75484]: pgmap v666: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:18 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1893667914' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:01:18 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4281996859' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:01:18 compute-1 python3.9[255502]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 18:01:19 compute-1 systemd[1]: Started libpod-conmon-fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0.scope.
Sep 30 18:01:19 compute-1 podman[255503]: 2025-09-30 18:01:19.03114997 +0000 UTC m=+0.115365450 container exec fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:01:19 compute-1 podman[255503]: 2025-09-30 18:01:19.06923574 +0000 UTC m=+0.153451160 container exec_died fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:01:19 compute-1 systemd[1]: libpod-conmon-fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0.scope: Deactivated successfully.
Sep 30 18:01:19 compute-1 sudo[255500]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:19 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:19 compute-1 nova_compute[238822]: 2025-09-30 18:01:19.685 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:01:19 compute-1 nova_compute[238822]: 2025-09-30 18:01:19.685 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:01:18 up  3:38,  0 user,  load average: 0.95, 1.34, 1.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:01:19 compute-1 nova_compute[238822]: 2025-09-30 18:01:19.775 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:01:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:19 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:19 compute-1 sudo[255684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aybxmqplpjizhmrmqmxhoqcxximroomb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255279.3286927-1649-103470448986837/AnsiballZ_podman_container_exec.py'
Sep 30 18:01:19 compute-1 sudo[255684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:20 compute-1 python3.9[255686]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 18:01:20 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:01:20 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1151663774' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:01:20 compute-1 nova_compute[238822]: 2025-09-30 18:01:20.273 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:01:20 compute-1 nova_compute[238822]: 2025-09-30 18:01:20.281 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:01:20 compute-1 systemd[1]: Started libpod-conmon-fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0.scope.
Sep 30 18:01:20 compute-1 podman[255707]: 2025-09-30 18:01:20.298372569 +0000 UTC m=+0.152136295 container exec fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:01:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:20.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:20.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:20 compute-1 podman[255730]: 2025-09-30 18:01:20.379862443 +0000 UTC m=+0.070669092 container exec_died fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:01:20 compute-1 podman[255707]: 2025-09-30 18:01:20.43078973 +0000 UTC m=+0.284553536 container exec_died fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:01:20 compute-1 systemd[1]: libpod-conmon-fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0.scope: Deactivated successfully.
Sep 30 18:01:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:20 compute-1 sudo[255684]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:20 compute-1 nova_compute[238822]: 2025-09-30 18:01:20.790 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:01:20 compute-1 ceph-mon[75484]: pgmap v667: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:01:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1151663774' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:01:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3336069450' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:01:21 compute-1 sudo[255893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kutbnfhcnfzzttcjhpvlwhispztamxiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255280.7727184-1657-5372197529635/AnsiballZ_file.py'
Sep 30 18:01:21 compute-1 sudo[255893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:21 compute-1 unix_chkpwd[255896]: password check failed for user (root)
Sep 30 18:01:21 compute-1 sshd-session[255632]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:01:21 compute-1 nova_compute[238822]: 2025-09-30 18:01:21.301 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:01:21 compute-1 nova_compute[238822]: 2025-09-30 18:01:21.301 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.673s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:01:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:21 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:21 compute-1 python3.9[255895]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:21 compute-1 sudo[255893]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:21 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d00045b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:21 compute-1 sudo[256046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgixvpxwgwjfofqlqqptckxmkcyhhwpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255281.6091087-1666-125787715173331/AnsiballZ_podman_container_info.py'
Sep 30 18:01:21 compute-1 sudo[256046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:22 compute-1 python3.9[256048]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Sep 30 18:01:22 compute-1 sudo[256046]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:22.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:22.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:22 compute-1 sudo[256214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vivkvxhhsfnjhvpmthwtynjdtapcjcot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255282.521431-1674-24240919035201/AnsiballZ_podman_container_exec.py'
Sep 30 18:01:22 compute-1 sudo[256214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:22 compute-1 ceph-mon[75484]: pgmap v668: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:01:23 compute-1 python3.9[256216]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 18:01:23 compute-1 systemd[1]: Started libpod-conmon-53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581.scope.
Sep 30 18:01:23 compute-1 podman[256217]: 2025-09-30 18:01:23.217330045 +0000 UTC m=+0.100101318 container exec 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:01:23 compute-1 podman[256217]: 2025-09-30 18:01:23.250492051 +0000 UTC m=+0.133263284 container exec_died 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Sep 30 18:01:23 compute-1 systemd[1]: libpod-conmon-53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581.scope: Deactivated successfully.
Sep 30 18:01:23 compute-1 sudo[256214]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:23 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:01:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:23 compute-1 sshd-session[255632]: Failed password for root from 192.210.160.141 port 41258 ssh2
Sep 30 18:01:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:23 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:23 compute-1 sudo[256398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsrzuuxnfurbqnfhwlpxnedxgfqtztuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255283.5087667-1682-224413069314352/AnsiballZ_podman_container_exec.py'
Sep 30 18:01:23 compute-1 sudo[256398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:24 compute-1 python3.9[256400]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 18:01:24 compute-1 systemd[1]: Started libpod-conmon-53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581.scope.
Sep 30 18:01:24 compute-1 podman[256402]: 2025-09-30 18:01:24.230571815 +0000 UTC m=+0.108370821 container exec 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Sep 30 18:01:24 compute-1 podman[256402]: 2025-09-30 18:01:24.265834019 +0000 UTC m=+0.143632935 container exec_died 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 18:01:24 compute-1 sshd-session[255632]: Connection closed by authenticating user root 192.210.160.141 port 41258 [preauth]
Sep 30 18:01:24 compute-1 systemd[1]: libpod-conmon-53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581.scope: Deactivated successfully.
Sep 30 18:01:24 compute-1 sudo[256398]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:24.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:24.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:24 compute-1 ceph-mon[75484]: pgmap v669: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:25 compute-1 sudo[256597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvtngxvpgqgjxbydsujeikaesjboekjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255284.618638-1690-65055189413784/AnsiballZ_file.py'
Sep 30 18:01:25 compute-1 sudo[256597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:25 compute-1 podman[256558]: 2025-09-30 18:01:25.091952089 +0000 UTC m=+0.164376157 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 18:01:25 compute-1 python3.9[256609]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:25 compute-1 sudo[256597]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:25 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:25 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d00045b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:25 compute-1 sudo[256714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:01:25 compute-1 sudo[256714]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:01:25 compute-1 sudo[256714]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:25 compute-1 sudo[256800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqwodyzrpcymsnfkudizlyjmfonrkhup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255285.5686162-1701-236259007455659/AnsiballZ_file.py'
Sep 30 18:01:25 compute-1 sudo[256800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:25 compute-1 podman[256762]: 2025-09-30 18:01:25.927522835 +0000 UTC m=+0.079585904 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:01:26 compute-1 python3.9[256813]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:26 compute-1 sudo[256800]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:26.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:26.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:26 compute-1 sudo[256965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jejfvqglexgonuxwvqddyotkzbqxrhmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255286.3594744-1717-240919138784765/AnsiballZ_stat.py'
Sep 30 18:01:26 compute-1 sudo[256965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:26 compute-1 ceph-mon[75484]: pgmap v670: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:26 compute-1 python3.9[256967]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:01:27 compute-1 sudo[256965]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:27 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35f80043a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:27 compute-1 sudo[257090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxhtofzocielccbwuopharucplozbzqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255286.3594744-1717-240919138784765/AnsiballZ_copy.py'
Sep 30 18:01:27 compute-1 sudo[257090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:27 compute-1 python3.9[257092]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759255286.3594744-1717-240919138784765/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:27 compute-1 sudo[257090]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:27 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:28.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:28 compute-1 sshd-session[257015]: Invalid user titu from 175.126.165.170 port 37670
Sep 30 18:01:28 compute-1 sshd-session[257015]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:01:28 compute-1 sshd-session[257015]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 18:01:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:28.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:01:28 compute-1 sudo[257243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-malapowqrfrieelkexpttayxbvjznnmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255288.0830271-1749-123289441462855/AnsiballZ_file.py'
Sep 30 18:01:28 compute-1 sudo[257243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:28 compute-1 python3.9[257245]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:28 compute-1 sudo[257243]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:28 compute-1 ceph-mon[75484]: pgmap v671: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:29 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:29 compute-1 sudo[257396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zytiuuzkxjmuqxotdvfzqssvbcyfemys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255288.9564466-1765-80921437648301/AnsiballZ_stat.py'
Sep 30 18:01:29 compute-1 sudo[257396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:29 compute-1 python3.9[257398]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:01:29 compute-1 sudo[257396]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:29 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d00045b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:29 compute-1 sudo[257474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdisxorknlbbvvulboidubiugnkmcmwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255288.9564466-1765-80921437648301/AnsiballZ_file.py'
Sep 30 18:01:29 compute-1 sudo[257474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:30 compute-1 sshd-session[257015]: Failed password for invalid user titu from 175.126.165.170 port 37670 ssh2
Sep 30 18:01:30 compute-1 python3.9[257476]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:30 compute-1 sudo[257474]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:30.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:30.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:30 compute-1 sudo[257628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikjedkawnmdtlcnewbekdeaismkhertv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255290.5313675-1791-222606270397459/AnsiballZ_stat.py'
Sep 30 18:01:30 compute-1 sudo[257628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:31 compute-1 ceph-mon[75484]: pgmap v672: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:01:31 compute-1 python3.9[257630]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:01:31 compute-1 sudo[257628]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:31 compute-1 sshd-session[257015]: Received disconnect from 175.126.165.170 port 37670:11: Bye Bye [preauth]
Sep 30 18:01:31 compute-1 sshd-session[257015]: Disconnected from invalid user titu 175.126.165.170 port 37670 [preauth]
Sep 30 18:01:31 compute-1 sudo[257633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:01:31 compute-1 sudo[257633]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:01:31 compute-1 sudo[257633]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:31 compute-1 sudo[257681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 18:01:31 compute-1 sudo[257681]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:01:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d00045b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:31 compute-1 sudo[257756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkxtsefdsvsbynjlztxbgzaclspwftln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255290.5313675-1791-222606270397459/AnsiballZ_file.py'
Sep 30 18:01:31 compute-1 sudo[257756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:31 compute-1 python3.9[257758]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.9f31ponv recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:31 compute-1 sudo[257756]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:31 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3600001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:32 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Sep 30 18:01:32 compute-1 podman[257883]: 2025-09-30 18:01:32.055281204 +0000 UTC m=+0.115582127 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 18:01:32 compute-1 podman[257883]: 2025-09-30 18:01:32.261262434 +0000 UTC m=+0.321563307 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 18:01:32 compute-1 sudo[258012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjspbgkaxxlffwepylezfoskuenqzdir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255291.890013-1813-90364598823465/AnsiballZ_stat.py'
Sep 30 18:01:32 compute-1 sudo[258012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:32.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:32.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:32 compute-1 python3.9[258016]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:01:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:32 compute-1 sudo[258012]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:32 compute-1 podman[258066]: 2025-09-30 18:01:32.70189149 +0000 UTC m=+0.085712959 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 18:01:32 compute-1 sudo[258181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdvltpbdzjtatczsxqgupkpaxqfgijxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255291.890013-1813-90364598823465/AnsiballZ_file.py'
Sep 30 18:01:32 compute-1 sudo[258181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:32 compute-1 podman[258196]: 2025-09-30 18:01:32.986002673 +0000 UTC m=+0.100811507 container exec 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 18:01:33 compute-1 podman[258196]: 2025-09-30 18:01:33.023135407 +0000 UTC m=+0.137944251 container exec_died 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 18:01:33 compute-1 ceph-mon[75484]: pgmap v673: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:33 compute-1 python3.9[258192]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:33 compute-1 sudo[258181]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:33 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35e0000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:01:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:33 compute-1 podman[258365]: 2025-09-30 18:01:33.554565688 +0000 UTC m=+0.089688726 container exec 75b1efab8f0d7c03b4b95c7c54d4f3b6e4f899f15c2cf2a4f305e0e7dd21f9dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 18:01:33 compute-1 podman[258365]: 2025-09-30 18:01:33.582889354 +0000 UTC m=+0.118012392 container exec_died 75b1efab8f0d7c03b4b95c7c54d4f3b6e4f899f15c2cf2a4f305e0e7dd21f9dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Sep 30 18:01:33 compute-1 sudo[258477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbpptgtjxpetloddeozfdxgysyxtrhfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255293.328797-1839-264108703940571/AnsiballZ_command.py'
Sep 30 18:01:33 compute-1 sudo[258477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:33 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d00045b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:01:33 compute-1 python3.9[258489]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 18:01:33 compute-1 podman[258507]: 2025-09-30 18:01:33.95252006 +0000 UTC m=+0.096547462 container exec 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 18:01:33 compute-1 podman[258507]: 2025-09-30 18:01:33.987106595 +0000 UTC m=+0.131133997 container exec_died 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 18:01:33 compute-1 sudo[258477]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:34 compute-1 podman[258606]: 2025-09-30 18:01:34.309804922 +0000 UTC m=+0.094544248 container exec 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, io.openshift.expose-services=, name=keepalived, release=1793, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, distribution-scope=public, io.buildah.version=1.28.2, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64)
Sep 30 18:01:34 compute-1 podman[258606]: 2025-09-30 18:01:34.332225498 +0000 UTC m=+0.116964874 container exec_died 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, name=keepalived, release=1793, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Sep 30 18:01:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:34.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:34.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:34 compute-1 sshd[170789]: drop connection #0 from [14.103.129.43]:35008 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 18:01:34 compute-1 sudo[257681]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:34 compute-1 sudo[258770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:01:34 compute-1 sudo[258770]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:01:34 compute-1 sudo[258770]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:34 compute-1 sudo[258818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqnyplswrqsipwrjapeeqhqxwzslksvk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759255294.2505748-1855-84417813795964/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 18:01:34 compute-1 sudo[258818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:34 compute-1 sudo[258822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:01:34 compute-1 sudo[258822]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:01:35 compute-1 python3[258826]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 18:01:35 compute-1 ceph-mon[75484]: pgmap v674: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:35 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:01:35 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:01:35 compute-1 sudo[258818]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:35 compute-1 kernel: ganesha.nfsd[252191]: segfault at 50 ip 00007f36b1f6132e sp 00007f3680ff8210 error 4 in libntirpc.so.5.8[7f36b1f46000+2c000] likely on CPU 4 (core 0, socket 4)
Sep 30 18:01:35 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 18:01:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[215214]: 30/09/2025 18:01:35 : epoch 68dc19cf : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f35d00045b0 fd 38 proxy ignored for local
Sep 30 18:01:35 compute-1 systemd[1]: Started Process Core Dump (PID 258916/UID 0).
Sep 30 18:01:35 compute-1 sudo[258822]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:35 compute-1 sshd-session[258930]: Invalid user debian from 107.172.146.104 port 48342
Sep 30 18:01:35 compute-1 sshd-session[258930]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:01:35 compute-1 sshd-session[258930]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 18:01:35 compute-1 sudo[259034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ionsxzxkbzrnjeojshtbbvunhlfwrnrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255295.3105078-1871-270853384067263/AnsiballZ_stat.py'
Sep 30 18:01:35 compute-1 sudo[259034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:35 compute-1 python3.9[259036]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:01:35 compute-1 sudo[259034]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Sep 30 18:01:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:01:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:01:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:01:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:01:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:01:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:01:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:01:36 compute-1 sudo[259113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egsjtlpxjcsrptxsqiwdrdmdopjuiwui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255295.3105078-1871-270853384067263/AnsiballZ_file.py'
Sep 30 18:01:36 compute-1 sudo[259113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:36.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:36.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:36 compute-1 python3.9[259115]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:36 compute-1 sudo[259113]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:36 compute-1 systemd-coredump[258925]: Process 215241 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 70:
                                                    #0  0x00007f36b1f6132e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Sep 30 18:01:36 compute-1 systemd[1]: systemd-coredump@8-258916-0.service: Deactivated successfully.
Sep 30 18:01:36 compute-1 systemd[1]: systemd-coredump@8-258916-0.service: Consumed 1.233s CPU time.
Sep 30 18:01:36 compute-1 podman[259168]: 2025-09-30 18:01:36.736889866 +0000 UTC m=+0.053165519 container died 75b1efab8f0d7c03b4b95c7c54d4f3b6e4f899f15c2cf2a4f305e0e7dd21f9dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 18:01:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-e0d211df535c77d24b6b9bb14c26b41ba241a3840c5342297aecc0c2460f1702-merged.mount: Deactivated successfully.
Sep 30 18:01:36 compute-1 podman[259168]: 2025-09-30 18:01:36.791302837 +0000 UTC m=+0.107578440 container remove 75b1efab8f0d7c03b4b95c7c54d4f3b6e4f899f15c2cf2a4f305e0e7dd21f9dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Sep 30 18:01:36 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 18:01:36 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 18:01:36 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 2.274s CPU time.
Sep 30 18:01:37 compute-1 sudo[259311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ficdalubisjbqyifatkpbvgrhwzxeqsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255296.6737432-1895-277092645961461/AnsiballZ_stat.py'
Sep 30 18:01:37 compute-1 sudo[259311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:37 compute-1 ceph-mon[75484]: pgmap v675: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:37 compute-1 python3.9[259313]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:01:37 compute-1 sudo[259311]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:37 compute-1 sshd-session[258930]: Failed password for invalid user debian from 107.172.146.104 port 48342 ssh2
Sep 30 18:01:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:37 compute-1 sudo[259389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srdzroviwvmootczxrlhwytytisaxggu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255296.6737432-1895-277092645961461/AnsiballZ_file.py'
Sep 30 18:01:37 compute-1 sudo[259389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:37 compute-1 sshd-session[258930]: Received disconnect from 107.172.146.104 port 48342:11: Bye Bye [preauth]
Sep 30 18:01:37 compute-1 sshd-session[258930]: Disconnected from invalid user debian 107.172.146.104 port 48342 [preauth]
Sep 30 18:01:37 compute-1 python3.9[259391]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:37 compute-1 sudo[259389]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:01:38 compute-1 ceph-mon[75484]: pgmap v676: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:38.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:38.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:38 compute-1 sudo[259542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bghhaoccreytgtccbsfdpknnpkbetkkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255298.0475717-1919-22521450468226/AnsiballZ_stat.py'
Sep 30 18:01:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:01:38 compute-1 sudo[259542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:38 compute-1 python3.9[259544]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:01:38 compute-1 sudo[259542]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:38 compute-1 sudo[259621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmkdopqixokvpyizxhsxgrhvonznsajv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255298.0475717-1919-22521450468226/AnsiballZ_file.py'
Sep 30 18:01:38 compute-1 sudo[259621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:39 compute-1 python3.9[259623]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:39 compute-1 sudo[259621]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:39 compute-1 sudo[259773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbafmcbskbqafiqhtejzgyxworlhlxif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255299.4245546-1943-99046944792443/AnsiballZ_stat.py'
Sep 30 18:01:39 compute-1 sudo[259773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:40 compute-1 python3.9[259775]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:01:40 compute-1 sudo[259773]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:40.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:40 compute-1 sudo[259852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guvvbgnrsuhuuzscmpzhsivymzvzhjgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255299.4245546-1943-99046944792443/AnsiballZ_file.py'
Sep 30 18:01:40 compute-1 sudo[259852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:40.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:40 compute-1 podman[259854]: 2025-09-30 18:01:40.454594552 +0000 UTC m=+0.073468828 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, container_name=iscsid, managed_by=edpm_ansible)
Sep 30 18:01:40 compute-1 podman[259855]: 2025-09-30 18:01:40.474307215 +0000 UTC m=+0.095263007 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, name=ubi9-minimal, architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41)
Sep 30 18:01:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:40 compute-1 python3.9[259856]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:40 compute-1 sudo[259852]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:40 compute-1 ceph-mon[75484]: pgmap v677: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:01:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:01:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:01:40 compute-1 sudo[259940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:01:40 compute-1 sudo[259940]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:01:40 compute-1 sudo[259940]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:41 compute-1 sudo[260067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sevujovtrpkjbyptjpzyygtqzaykktqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255300.81711-1967-200701347686638/AnsiballZ_stat.py'
Sep 30 18:01:41 compute-1 sudo[260067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/180141 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 18:01:41 compute-1 python3.9[260069]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 18:01:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:41 compute-1 sudo[260067]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:42 compute-1 sudo[260206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbiiijxzfuunozqcaboowhjlhmxvrpgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255300.81711-1967-200701347686638/AnsiballZ_copy.py'
Sep 30 18:01:42 compute-1 sudo[260206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:42 compute-1 podman[260166]: 2025-09-30 18:01:42.090143862 +0000 UTC m=+0.102883024 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 18:01:42 compute-1 python3.9[260213]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759255300.81711-1967-200701347686638/.source.nft follow=False _original_basename=ruleset.j2 checksum=bc835bd485c96b4ac7465e87d3a790a8d097f2aa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:42 compute-1 sudo[260206]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:42.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:42.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:42 compute-1 ceph-mon[75484]: pgmap v678: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:42 compute-1 sudo[260365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bruzurjmkhhwtgyawxyqumghvgzdfkuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255302.5365605-1997-252810897660588/AnsiballZ_file.py'
Sep 30 18:01:42 compute-1 sudo[260365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:43 compute-1 python3.9[260367]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:43 compute-1 sudo[260365]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:01:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:43 compute-1 sudo[260518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-menhinudyqcofkyxhsxuowiarinddqfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255303.3996563-2013-199802994792/AnsiballZ_command.py'
Sep 30 18:01:43 compute-1 sudo[260518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:44 compute-1 python3.9[260520]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 18:01:44 compute-1 sudo[260518]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:44.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:44.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:44 compute-1 sudo[260676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owcsisbykbgpitnshwmzetnuutozeoic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255304.3704548-2029-164691262213892/AnsiballZ_blockinfile.py'
Sep 30 18:01:44 compute-1 sudo[260676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:44 compute-1 ceph-mon[75484]: pgmap v679: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:44 compute-1 python3.9[260678]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:45 compute-1 sudo[260676]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:45 compute-1 sshd-session[260418]: Invalid user teste from 192.210.160.141 port 59784
Sep 30 18:01:45 compute-1 sshd-session[260418]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:01:45 compute-1 sshd-session[260418]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:01:45 compute-1 sudo[260828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxvuthmejukytoyzyaizblvqaolvkmva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255305.3494325-2047-154171642489601/AnsiballZ_command.py'
Sep 30 18:01:45 compute-1 sudo[260828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:45 compute-1 sudo[260831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:01:45 compute-1 sudo[260831]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:01:45 compute-1 python3.9[260830]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 18:01:45 compute-1 sudo[260831]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:45 compute-1 sudo[260828]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:46.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:46.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:46 compute-1 sudo[261007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iksdewpmfvugnimrpnqobinrvwgtkvfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255306.2145364-2063-237135221625887/AnsiballZ_stat.py'
Sep 30 18:01:46 compute-1 sudo[261007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:46 compute-1 python3.9[261009]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 18:01:46 compute-1 sudo[261007]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:46 compute-1 ceph-mon[75484]: pgmap v680: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:47 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 9.
Sep 30 18:01:47 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 18:01:47 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 2.274s CPU time.
Sep 30 18:01:47 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 18:01:47 compute-1 sudo[261204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usuojeimmhowrxydpwdduzmcgoszpnmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255307.0853825-2079-82150427445050/AnsiballZ_command.py'
Sep 30 18:01:47 compute-1 sudo[261204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:47 compute-1 podman[261212]: 2025-09-30 18:01:47.597435011 +0000 UTC m=+0.110000486 container create 544d3e4cf8eee5d5b356b8db22e591fa2c81f7ff2fab8bc8af8a5d6d164b8de2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 18:01:47 compute-1 podman[261212]: 2025-09-30 18:01:47.529998047 +0000 UTC m=+0.042563552 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 18:01:47 compute-1 sshd-session[260418]: Failed password for invalid user teste from 192.210.160.141 port 59784 ssh2
Sep 30 18:01:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/effca0547c373f0bf2290f0cdca7cf7751ca3a3e6b6e45e5e101d8918052ea5a/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 18:01:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/effca0547c373f0bf2290f0cdca7cf7751ca3a3e6b6e45e5e101d8918052ea5a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 18:01:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/effca0547c373f0bf2290f0cdca7cf7751ca3a3e6b6e45e5e101d8918052ea5a/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 18:01:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/effca0547c373f0bf2290f0cdca7cf7751ca3a3e6b6e45e5e101d8918052ea5a/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 18:01:47 compute-1 python3.9[261211]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 18:01:47 compute-1 podman[261212]: 2025-09-30 18:01:47.683426856 +0000 UTC m=+0.195992371 container init 544d3e4cf8eee5d5b356b8db22e591fa2c81f7ff2fab8bc8af8a5d6d164b8de2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Sep 30 18:01:47 compute-1 podman[261212]: 2025-09-30 18:01:47.699146591 +0000 UTC m=+0.211712066 container start 544d3e4cf8eee5d5b356b8db22e591fa2c81f7ff2fab8bc8af8a5d6d164b8de2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Sep 30 18:01:47 compute-1 bash[261212]: 544d3e4cf8eee5d5b356b8db22e591fa2c81f7ff2fab8bc8af8a5d6d164b8de2
Sep 30 18:01:47 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 18:01:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 18:01:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 18:01:47 compute-1 sudo[261204]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 18:01:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 18:01:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 18:01:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 18:01:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 18:01:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 18:01:48 compute-1 sudo[261422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smypirlpmbqmhmdlucuohqowanuwscft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759255307.9577732-2095-27003181841323/AnsiballZ_file.py'
Sep 30 18:01:48 compute-1 sudo[261422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:01:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:48.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:48.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:01:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:48 compute-1 python3.9[261424]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 18:01:48 compute-1 sudo[261422]: pam_unix(sudo:session): session closed for user root
Sep 30 18:01:48 compute-1 ceph-mon[75484]: pgmap v681: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:01:49 compute-1 sshd-session[239200]: Connection closed by 192.168.122.30 port 46296
Sep 30 18:01:49 compute-1 sshd-session[239192]: pam_unix(sshd:session): session closed for user zuul
Sep 30 18:01:49 compute-1 systemd[1]: session-57.scope: Deactivated successfully.
Sep 30 18:01:49 compute-1 systemd[1]: session-57.scope: Consumed 1min 44.540s CPU time.
Sep 30 18:01:49 compute-1 systemd-logind[789]: Session 57 logged out. Waiting for processes to exit.
Sep 30 18:01:49 compute-1 systemd-logind[789]: Removed session 57.
Sep 30 18:01:49 compute-1 openstack_network_exporter[251957]: ERROR   18:01:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:01:49 compute-1 openstack_network_exporter[251957]: ERROR   18:01:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:01:49 compute-1 openstack_network_exporter[251957]: ERROR   18:01:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:01:49 compute-1 openstack_network_exporter[251957]: ERROR   18:01:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:01:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:01:49 compute-1 openstack_network_exporter[251957]: ERROR   18:01:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:01:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:01:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:50 compute-1 sshd-session[260418]: Connection closed by invalid user teste 192.210.160.141 port 59784 [preauth]
Sep 30 18:01:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:50.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:50.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:50 compute-1 ceph-mon[75484]: pgmap v682: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:01:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:52.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:52.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:52 compute-1 ceph-mon[75484]: pgmap v683: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:01:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:01:52 compute-1 unix_chkpwd[261462]: password check failed for user (root)
Sep 30 18:01:52 compute-1 sshd-session[261459]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167  user=root
Sep 30 18:01:53 compute-1 unix_chkpwd[261463]: password check failed for user (root)
Sep 30 18:01:53 compute-1 sshd-session[261457]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65  user=root
Sep 30 18:01:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:01:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:53 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 18:01:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:53 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 18:01:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:01:54.312 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:01:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:01:54.314 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:01:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:01:54.314 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:01:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:54.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:54.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:54 compute-1 sshd-session[261459]: Failed password for root from 167.172.43.167 port 41940 ssh2
Sep 30 18:01:54 compute-1 ceph-mon[75484]: pgmap v684: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:01:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:55 compute-1 sshd-session[261457]: Failed password for root from 194.107.115.65 port 61470 ssh2
Sep 30 18:01:55 compute-1 podman[261467]: 2025-09-30 18:01:55.633551886 +0000 UTC m=+0.172266459 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 18:01:55 compute-1 sshd-session[261459]: Received disconnect from 167.172.43.167 port 41940:11: Bye Bye [preauth]
Sep 30 18:01:55 compute-1 sshd-session[261459]: Disconnected from authenticating user root 167.172.43.167 port 41940 [preauth]
Sep 30 18:01:56 compute-1 sshd-session[261457]: Received disconnect from 194.107.115.65 port 61470:11: Bye Bye [preauth]
Sep 30 18:01:56 compute-1 sshd-session[261457]: Disconnected from authenticating user root 194.107.115.65 port 61470 [preauth]
Sep 30 18:01:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:56.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:56.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:56 compute-1 podman[261494]: 2025-09-30 18:01:56.545336353 +0000 UTC m=+0.082016939 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:01:56 compute-1 ceph-mon[75484]: pgmap v685: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:01:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:01:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:01:58.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:01:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:01:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:01:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:01:58.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:01:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:01:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:59 compute-1 ceph-mon[75484]: pgmap v686: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:01:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:01:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 18:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:01:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 18:02:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:00.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:00.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:01 compute-1 ceph-mon[75484]: pgmap v687: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 1.5 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 18:02:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:01 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:01 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c000fb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:02.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:02.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:03 compute-1 ceph-mon[75484]: pgmap v688: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 1 op/s
Sep 30 18:02:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/180203 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 18:02:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:03 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c000fb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:02:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:03 compute-1 podman[261541]: 2025-09-30 18:02:03.543831509 +0000 UTC m=+0.079964504 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 18:02:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:03 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:04.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:04.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:05 compute-1 ceph-mon[75484]: pgmap v689: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:02:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:05 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:05 compute-1 podman[249638]: time="2025-09-30T18:02:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:02:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:02:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:02:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:02:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8752 "" "Go-http-client/1.1"
Sep 30 18:02:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:05 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564000d00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:06 compute-1 sudo[261564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:02:06 compute-1 sudo[261564]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:02:06 compute-1 sudo[261564]: pam_unix(sudo:session): session closed for user root
Sep 30 18:02:06 compute-1 ceph-mon[75484]: pgmap v690: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:02:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:06.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:06.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:02:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:07 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002080 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:07 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5600016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:08 compute-1 ceph-mon[75484]: pgmap v691: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:02:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:08.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:08.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:02:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:09 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:09 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:10.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:10.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:10 compute-1 unix_chkpwd[261596]: password check failed for user (root)
Sep 30 18:02:10 compute-1 sshd-session[261593]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:02:10 compute-1 ceph-mon[75484]: pgmap v692: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:02:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:11 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002080 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:11 compute-1 podman[261600]: 2025-09-30 18:02:11.535806209 +0000 UTC m=+0.084263370 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 18:02:11 compute-1 podman[261601]: 2025-09-30 18:02:11.54323271 +0000 UTC m=+0.080304023 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:02:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:11 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5600016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:12.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:12.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:12 compute-1 podman[261637]: 2025-09-30 18:02:12.543503 +0000 UTC m=+0.085794651 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 18:02:12 compute-1 sshd-session[261598]: Invalid user jerry from 14.225.167.110 port 39726
Sep 30 18:02:12 compute-1 sshd-session[261598]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:02:12 compute-1 sshd-session[261598]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:02:12 compute-1 ceph-mon[75484]: pgmap v693: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:02:12 compute-1 sshd-session[261593]: Failed password for root from 192.210.160.141 port 38312 ssh2
Sep 30 18:02:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:13 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:02:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:13 compute-1 sshd-session[261593]: Connection closed by authenticating user root 192.210.160.141 port 38312 [preauth]
Sep 30 18:02:13 compute-1 sshd-session[261658]: Invalid user seekcy from 216.10.242.161 port 39094
Sep 30 18:02:13 compute-1 sshd-session[261658]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:02:13 compute-1 sshd-session[261658]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:02:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:13 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:14.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:14.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:14 compute-1 ceph-mon[75484]: pgmap v694: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:02:15 compute-1 sshd-session[261598]: Failed password for invalid user jerry from 14.225.167.110 port 39726 ssh2
Sep 30 18:02:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:15 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002080 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:15 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5600016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:16 compute-1 sshd-session[261658]: Failed password for invalid user seekcy from 216.10.242.161 port 39094 ssh2
Sep 30 18:02:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:16.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:16.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:16 compute-1 ceph-mon[75484]: pgmap v695: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:16 compute-1 sshd-session[261598]: Received disconnect from 14.225.167.110 port 39726:11: Bye Bye [preauth]
Sep 30 18:02:16 compute-1 sshd-session[261598]: Disconnected from invalid user jerry 14.225.167.110 port 39726 [preauth]
Sep 30 18:02:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:17 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:17 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:18 compute-1 sshd-session[261658]: Received disconnect from 216.10.242.161 port 39094:11: Bye Bye [preauth]
Sep 30 18:02:18 compute-1 sshd-session[261658]: Disconnected from invalid user seekcy 216.10.242.161 port 39094 [preauth]
Sep 30 18:02:18 compute-1 ceph-mon[75484]: pgmap v696: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:18.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:02:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:18.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:19 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002080 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:19 compute-1 openstack_network_exporter[251957]: ERROR   18:02:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:02:19 compute-1 openstack_network_exporter[251957]: ERROR   18:02:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:02:19 compute-1 openstack_network_exporter[251957]: ERROR   18:02:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:02:19 compute-1 openstack_network_exporter[251957]: ERROR   18:02:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:02:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:02:19 compute-1 openstack_network_exporter[251957]: ERROR   18:02:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:02:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:02:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:19 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:20.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:20.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:20 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Sep 30 18:02:20 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2360249032' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:02:20 compute-1 unix_chkpwd[261671]: password check failed for user (root)
Sep 30 18:02:20 compute-1 sshd-session[261667]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58  user=root
Sep 30 18:02:20 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Sep 30 18:02:20 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2360249032' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:02:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:20 compute-1 ceph-mon[75484]: pgmap v697: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:02:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/482529736' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:02:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/482529736' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:02:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/476006598' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:02:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/476006598' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:02:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2360249032' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:02:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2360249032' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:02:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1291920182' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:02:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1291920182' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.303 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.303 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.303 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.303 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.304 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.304 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.304 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.304 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.304 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:02:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:21 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640028c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.824 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.824 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.825 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.825 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:02:21 compute-1 nova_compute[238822]: 2025-09-30 18:02:21.825 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:02:21 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4025706531' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:02:21 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4025706531' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:02:21 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1764238396' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:02:21 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1764238396' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:02:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:21 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c003340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:22 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:02:22 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/278088296' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:02:22 compute-1 sshd-session[261667]: Failed password for root from 84.51.43.58 port 51044 ssh2
Sep 30 18:02:22 compute-1 nova_compute[238822]: 2025-09-30 18:02:22.316 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:02:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:22.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:22 compute-1 nova_compute[238822]: 2025-09-30 18:02:22.464 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:02:22 compute-1 nova_compute[238822]: 2025-09-30 18:02:22.465 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:02:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:22.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:22 compute-1 nova_compute[238822]: 2025-09-30 18:02:22.485 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:02:22 compute-1 nova_compute[238822]: 2025-09-30 18:02:22.486 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5173MB free_disk=39.9921875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:02:22 compute-1 nova_compute[238822]: 2025-09-30 18:02:22.486 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:02:22 compute-1 nova_compute[238822]: 2025-09-30 18:02:22.486 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:02:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:22 compute-1 ceph-mon[75484]: pgmap v698: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:02:22 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/278088296' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:02:22 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3258341083' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:02:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:23 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002080 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:02:23 compute-1 sshd-session[261667]: Received disconnect from 84.51.43.58 port 51044:11: Bye Bye [preauth]
Sep 30 18:02:23 compute-1 sshd-session[261667]: Disconnected from authenticating user root 84.51.43.58 port 51044 [preauth]
Sep 30 18:02:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:23 compute-1 nova_compute[238822]: 2025-09-30 18:02:23.607 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:02:23 compute-1 nova_compute[238822]: 2025-09-30 18:02:23.607 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:02:22 up  3:39,  0 user,  load average: 2.24, 1.80, 1.52\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:02:23 compute-1 nova_compute[238822]: 2025-09-30 18:02:23.625 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:02:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:23 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:02:24 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3346942751' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:02:24 compute-1 nova_compute[238822]: 2025-09-30 18:02:24.132 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:02:24 compute-1 nova_compute[238822]: 2025-09-30 18:02:24.139 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:02:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:24.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:24.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:24 compute-1 nova_compute[238822]: 2025-09-30 18:02:24.680 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:02:24 compute-1 ceph-mon[75484]: pgmap v699: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:02:24 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3346942751' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:02:24 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/234598981' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:02:25 compute-1 nova_compute[238822]: 2025-09-30 18:02:25.283 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:02:25 compute-1 nova_compute[238822]: 2025-09-30 18:02:25.284 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.798s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:02:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:25 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640028c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:25 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c003340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:26 compute-1 sudo[261723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:02:26 compute-1 sudo[261723]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:02:26 compute-1 sudo[261723]: pam_unix(sudo:session): session closed for user root
Sep 30 18:02:26 compute-1 podman[261747]: 2025-09-30 18:02:26.255958768 +0000 UTC m=+0.115954617 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 18:02:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:26.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:26.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:26 compute-1 ceph-mon[75484]: pgmap v700: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:27 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002080 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:27 compute-1 podman[261776]: 2025-09-30 18:02:27.51331091 +0000 UTC m=+0.061724531 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:02:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:27 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:28 compute-1 PackageKit[173819]: daemon quit
Sep 30 18:02:28 compute-1 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 18:02:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:02:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:28.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:02:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:02:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:28.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:28 compute-1 ceph-mon[75484]: pgmap v701: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:29 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640028c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:29 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c003340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:02:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:30.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:02:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:30.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:30 compute-1 ceph-mon[75484]: pgmap v702: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:02:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:31 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002080 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:31 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:32.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:32.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:32 compute-1 ceph-mon[75484]: pgmap v703: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:33 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640039c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:02:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:33 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:34 compute-1 unix_chkpwd[261811]: password check failed for user (root)
Sep 30 18:02:34 compute-1 sshd-session[261805]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:02:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:34.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:34.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:34 compute-1 podman[261812]: 2025-09-30 18:02:34.516167104 +0000 UTC m=+0.061113113 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 18:02:34 compute-1 ceph-mon[75484]: pgmap v704: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:02:35 compute-1 sshd-session[261832]: Invalid user laravel from 107.172.146.104 port 44450
Sep 30 18:02:35 compute-1 sshd-session[261832]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:02:35 compute-1 sshd-session[261832]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 18:02:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:35 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590002010 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:35 compute-1 podman[249638]: time="2025-09-30T18:02:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:02:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:02:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:02:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:02:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8754 "" "Go-http-client/1.1"
Sep 30 18:02:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:35 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:36.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:36 compute-1 sshd-session[261805]: Failed password for root from 192.210.160.141 port 35208 ssh2
Sep 30 18:02:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:36.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:36 compute-1 sshd-session[261805]: Connection closed by authenticating user root 192.210.160.141 port 35208 [preauth]
Sep 30 18:02:37 compute-1 ceph-mon[75484]: pgmap v705: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:37 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640039c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:37 compute-1 sshd-session[261832]: Failed password for invalid user laravel from 107.172.146.104 port 44450 ssh2
Sep 30 18:02:37 compute-1 sshd-session[261832]: Received disconnect from 107.172.146.104 port 44450:11: Bye Bye [preauth]
Sep 30 18:02:37 compute-1 sshd-session[261832]: Disconnected from invalid user laravel 107.172.146.104 port 44450 [preauth]
Sep 30 18:02:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:37 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:02:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:38.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:02:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:38.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:39 compute-1 ceph-mon[75484]: pgmap v706: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:39 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590002010 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:39 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:40 compute-1 ceph-mon[75484]: pgmap v707: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:02:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:40.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:40.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:40 compute-1 sshd-session[261839]: Invalid user old from 175.126.165.170 port 40610
Sep 30 18:02:40 compute-1 sshd-session[261839]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:02:40 compute-1 sshd-session[261839]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 18:02:41 compute-1 sudo[261843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:02:41 compute-1 sudo[261843]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:02:41 compute-1 sudo[261843]: pam_unix(sudo:session): session closed for user root
Sep 30 18:02:41 compute-1 sudo[261868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:02:41 compute-1 sudo[261868]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:02:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:41 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640039c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:41 compute-1 sudo[261868]: pam_unix(sudo:session): session closed for user root
Sep 30 18:02:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:41 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:42.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:42.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:42 compute-1 podman[261925]: 2025-09-30 18:02:42.559194568 +0000 UTC m=+0.093073378 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 18:02:42 compute-1 podman[261926]: 2025-09-30 18:02:42.572847077 +0000 UTC m=+0.113588642 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, version=9.6, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 18:02:42 compute-1 podman[261968]: 2025-09-30 18:02:42.677854937 +0000 UTC m=+0.079508001 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 18:02:42 compute-1 sshd-session[261839]: Failed password for invalid user old from 175.126.165.170 port 40610 ssh2
Sep 30 18:02:42 compute-1 ceph-mon[75484]: pgmap v708: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:43 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590002010 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:02:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:43 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:44.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:44 compute-1 sshd-session[261839]: Received disconnect from 175.126.165.170 port 40610:11: Bye Bye [preauth]
Sep 30 18:02:44 compute-1 sshd-session[261839]: Disconnected from invalid user old 175.126.165.170 port 40610 [preauth]
Sep 30 18:02:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:44.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:44 compute-1 ceph-mon[75484]: pgmap v709: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:02:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:02:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:02:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:45 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640039c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:02:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:02:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:02:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:02:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:02:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:02:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:02:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:45 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:46 compute-1 sudo[261993]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:02:46 compute-1 sudo[261993]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:02:46 compute-1 sudo[261993]: pam_unix(sudo:session): session closed for user root
Sep 30 18:02:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:46.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:46.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:46 compute-1 ceph-mon[75484]: pgmap v710: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590002010 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:48.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:02:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:48.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:48 compute-1 ceph-mon[75484]: pgmap v711: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:49 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640039c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:49 compute-1 openstack_network_exporter[251957]: ERROR   18:02:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:02:49 compute-1 openstack_network_exporter[251957]: ERROR   18:02:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:02:49 compute-1 openstack_network_exporter[251957]: ERROR   18:02:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:02:49 compute-1 openstack_network_exporter[251957]: ERROR   18:02:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:02:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:02:49 compute-1 openstack_network_exporter[251957]: ERROR   18:02:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:02:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:02:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:49 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:50.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:50.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:50 compute-1 sudo[262022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:02:50 compute-1 sudo[262022]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:02:50 compute-1 sudo[262022]: pam_unix(sudo:session): session closed for user root
Sep 30 18:02:50 compute-1 ceph-mon[75484]: pgmap v712: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:02:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:02:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:02:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:51 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900091b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:51 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:52.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:52.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:52 compute-1 ceph-mon[75484]: pgmap v713: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:02:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:53 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640039c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:02:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:53 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640039c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:02:54.315 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:02:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:02:54.315 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:02:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:02:54.315 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:02:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:54.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:54.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:54 compute-1 ceph-mon[75484]: pgmap v714: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:02:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:55 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900091b0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:55 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:02:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:56.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:02:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:56.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:56 compute-1 podman[262055]: 2025-09-30 18:02:56.606942003 +0000 UTC m=+0.145823224 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930)
Sep 30 18:02:56 compute-1 ceph-mon[75484]: pgmap v715: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:57 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:57 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:57.967606) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255377967715, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2349, "num_deletes": 251, "total_data_size": 6138690, "memory_usage": 6196592, "flush_reason": "Manual Compaction"}
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255377991077, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 3963846, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20880, "largest_seqno": 23224, "table_properties": {"data_size": 3954413, "index_size": 5927, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19243, "raw_average_key_size": 20, "raw_value_size": 3935509, "raw_average_value_size": 4108, "num_data_blocks": 264, "num_entries": 958, "num_filter_entries": 958, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759255161, "oldest_key_time": 1759255161, "file_creation_time": 1759255377, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 23533 microseconds, and 12717 cpu microseconds.
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:57.991166) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 3963846 bytes OK
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:57.991202) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:57.993506) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:57.993530) EVENT_LOG_v1 {"time_micros": 1759255377993522, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:57.993575) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6128284, prev total WAL file size 6128284, number of live WAL files 2.
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:57.996529) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(3870KB)], [39(11MB)]
Sep 30 18:02:57 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255377996823, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 15544656, "oldest_snapshot_seqno": -1}
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5213 keys, 13470384 bytes, temperature: kUnknown
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255378083122, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 13470384, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13433004, "index_size": 23254, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13061, "raw_key_size": 130827, "raw_average_key_size": 25, "raw_value_size": 13335947, "raw_average_value_size": 2558, "num_data_blocks": 969, "num_entries": 5213, "num_filter_entries": 5213, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759255377, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:58.083466) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 13470384 bytes
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:58.085031) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 180.0 rd, 156.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 11.0 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(7.3) write-amplify(3.4) OK, records in: 5731, records dropped: 518 output_compression: NoCompression
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:58.085063) EVENT_LOG_v1 {"time_micros": 1759255378085048, "job": 22, "event": "compaction_finished", "compaction_time_micros": 86360, "compaction_time_cpu_micros": 49838, "output_level": 6, "num_output_files": 1, "total_output_size": 13470384, "num_input_records": 5731, "num_output_records": 5213, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255378086686, "job": 22, "event": "table_file_deletion", "file_number": 41}
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255378090609, "job": 22, "event": "table_file_deletion", "file_number": 39}
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:57.996358) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:58.090771) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:58.090777) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:58.090780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:58.090782) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:02:58 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:02:58.090784) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:02:58 compute-1 sshd-session[262054]: Invalid user ftpuser from 192.210.160.141 port 58698
Sep 30 18:02:58 compute-1 sshd-session[262054]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:02:58 compute-1 sshd-session[262054]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:02:58 compute-1 podman[262086]: 2025-09-30 18:02:58.431078946 +0000 UTC m=+0.084994459 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:02:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:02:58.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:02:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:02:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:02:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:02:58.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:02:58 compute-1 ceph-mon[75484]: pgmap v716: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:02:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590009ec0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:02:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:02:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:02:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:02:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:02:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:00 compute-1 sshd-session[262054]: Failed password for invalid user ftpuser from 192.210.160.141 port 58698 ssh2
Sep 30 18:03:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:00.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:00.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:00 compute-1 ceph-mon[75484]: pgmap v717: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:03:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:01 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:01 compute-1 sshd-session[262054]: Connection closed by invalid user ftpuser 192.210.160.141 port 58698 [preauth]
Sep 30 18:03:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:01 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:02.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:02.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:02 compute-1 sshd-session[262114]: Invalid user ty from 194.107.115.65 port 29442
Sep 30 18:03:02 compute-1 sshd-session[262114]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:03:02 compute-1 sshd-session[262114]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 18:03:03 compute-1 ceph-mon[75484]: pgmap v718: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:03 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590009ec0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:03:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:03 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590009ec0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:04.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:04.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:04 compute-1 sshd-session[262114]: Failed password for invalid user ty from 194.107.115.65 port 29442 ssh2
Sep 30 18:03:05 compute-1 ceph-mon[75484]: pgmap v719: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:03:05 compute-1 sshd-session[262114]: Received disconnect from 194.107.115.65 port 29442:11: Bye Bye [preauth]
Sep 30 18:03:05 compute-1 sshd-session[262114]: Disconnected from invalid user ty 194.107.115.65 port 29442 [preauth]
Sep 30 18:03:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:05 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0008d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:05 compute-1 podman[262122]: 2025-09-30 18:03:05.534009254 +0000 UTC m=+0.078132054 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:03:05 compute-1 podman[249638]: time="2025-09-30T18:03:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:03:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:03:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:03:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:03:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8745 "" "Go-http-client/1.1"
Sep 30 18:03:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:05 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:06 compute-1 sudo[262142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:03:06 compute-1 sudo[262142]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:03:06 compute-1 sudo[262142]: pam_unix(sudo:session): session closed for user root
Sep 30 18:03:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:06.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:06.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:07 compute-1 ceph-mon[75484]: pgmap v720: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:07 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:07 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590009ec0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:03:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:03:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:08.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:08.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:09 compute-1 ceph-mon[75484]: pgmap v721: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:09 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0019d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:09 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.455191) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255390455315, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 361, "num_deletes": 250, "total_data_size": 315768, "memory_usage": 323072, "flush_reason": "Manual Compaction"}
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255390459466, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 207045, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23229, "largest_seqno": 23585, "table_properties": {"data_size": 204880, "index_size": 329, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5649, "raw_average_key_size": 19, "raw_value_size": 200636, "raw_average_value_size": 687, "num_data_blocks": 15, "num_entries": 292, "num_filter_entries": 292, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759255378, "oldest_key_time": 1759255378, "file_creation_time": 1759255390, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 4315 microseconds, and 1932 cpu microseconds.
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.459518) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 207045 bytes OK
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.459538) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.461751) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.461771) EVENT_LOG_v1 {"time_micros": 1759255390461765, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.461792) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 313339, prev total WAL file size 313339, number of live WAL files 2.
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.462313) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353032' seq:72057594037927935, type:22 .. '6D67727374617400373533' seq:0, type:0; will stop at (end)
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(202KB)], [42(12MB)]
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255390462379, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 13677429, "oldest_snapshot_seqno": -1}
Sep 30 18:03:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:10.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 4997 keys, 9855971 bytes, temperature: kUnknown
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255390515867, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 9855971, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9824499, "index_size": 17858, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12549, "raw_key_size": 126756, "raw_average_key_size": 25, "raw_value_size": 9735658, "raw_average_value_size": 1948, "num_data_blocks": 733, "num_entries": 4997, "num_filter_entries": 4997, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759255390, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.516235) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 9855971 bytes
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.517944) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 255.1 rd, 183.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 12.8 +0.0 blob) out(9.4 +0.0 blob), read-write-amplify(113.7) write-amplify(47.6) OK, records in: 5505, records dropped: 508 output_compression: NoCompression
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.518003) EVENT_LOG_v1 {"time_micros": 1759255390517960, "job": 24, "event": "compaction_finished", "compaction_time_micros": 53620, "compaction_time_cpu_micros": 23921, "output_level": 6, "num_output_files": 1, "total_output_size": 9855971, "num_input_records": 5505, "num_output_records": 4997, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255390518220, "job": 24, "event": "table_file_deletion", "file_number": 44}
Sep 30 18:03:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255390523427, "job": 24, "event": "table_file_deletion", "file_number": 42}
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.462222) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.523518) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.523545) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.523549) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.523551) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:03:10 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:10.523554) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:03:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:10.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:11 compute-1 ceph-mon[75484]: pgmap v722: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:03:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:11 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5580016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:11 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590009ec0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:12.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:12.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:13 compute-1 ceph-mon[75484]: pgmap v723: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:13 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0019d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:03:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:13 compute-1 podman[262176]: 2025-09-30 18:03:13.526199449 +0000 UTC m=+0.070761595 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 18:03:13 compute-1 podman[262175]: 2025-09-30 18:03:13.537729091 +0000 UTC m=+0.077284491 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4)
Sep 30 18:03:13 compute-1 podman[262177]: 2025-09-30 18:03:13.558354959 +0000 UTC m=+0.094595730 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 18:03:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:13 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:14 compute-1 ceph-mon[75484]: pgmap v724: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:03:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:14.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:14.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:15 compute-1 unix_chkpwd[262241]: password check failed for user (root)
Sep 30 18:03:15 compute-1 sshd-session[262236]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105  user=root
Sep 30 18:03:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:15 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:15 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590009ec0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:15 compute-1 sshd-session[262172]: Invalid user titu from 113.249.93.94 port 45660
Sep 30 18:03:15 compute-1 sshd-session[262172]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:03:15 compute-1 sshd-session[262172]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=113.249.93.94
Sep 30 18:03:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:16.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:16.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:16 compute-1 ceph-mon[75484]: pgmap v725: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:17 compute-1 nova_compute[238822]: 2025-09-30 18:03:17.034 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:03:17 compute-1 sshd-session[262236]: Failed password for root from 103.153.190.105 port 58716 ssh2
Sep 30 18:03:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:17 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0019d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:17 compute-1 nova_compute[238822]: 2025-09-30 18:03:17.547 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:03:17 compute-1 nova_compute[238822]: 2025-09-30 18:03:17.548 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:03:17 compute-1 nova_compute[238822]: 2025-09-30 18:03:17.548 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:03:17 compute-1 nova_compute[238822]: 2025-09-30 18:03:17.548 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:03:17 compute-1 nova_compute[238822]: 2025-09-30 18:03:17.549 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:03:17 compute-1 nova_compute[238822]: 2025-09-30 18:03:17.549 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:03:17 compute-1 nova_compute[238822]: 2025-09-30 18:03:17.549 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:03:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:17 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:17 compute-1 sshd-session[262172]: Failed password for invalid user titu from 113.249.93.94 port 45660 ssh2
Sep 30 18:03:18 compute-1 nova_compute[238822]: 2025-09-30 18:03:18.065 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:03:18 compute-1 nova_compute[238822]: 2025-09-30 18:03:18.065 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:03:18 compute-1 nova_compute[238822]: 2025-09-30 18:03:18.065 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:03:18 compute-1 nova_compute[238822]: 2025-09-30 18:03:18.066 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:03:18 compute-1 nova_compute[238822]: 2025-09-30 18:03:18.066 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:03:18 compute-1 sshd-session[262236]: Received disconnect from 103.153.190.105 port 58716:11: Bye Bye [preauth]
Sep 30 18:03:18 compute-1 sshd-session[262236]: Disconnected from authenticating user root 103.153.190.105 port 58716 [preauth]
Sep 30 18:03:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:03:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:18.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:03:18 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4228144495' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:03:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:18.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:18 compute-1 nova_compute[238822]: 2025-09-30 18:03:18.571 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:03:18 compute-1 sshd-session[262244]: Invalid user scpuser from 216.10.242.161 port 54306
Sep 30 18:03:18 compute-1 sshd-session[262244]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:03:18 compute-1 sshd-session[262244]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:03:18 compute-1 nova_compute[238822]: 2025-09-30 18:03:18.788 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:03:18 compute-1 nova_compute[238822]: 2025-09-30 18:03:18.790 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:03:18 compute-1 nova_compute[238822]: 2025-09-30 18:03:18.819 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:03:18 compute-1 nova_compute[238822]: 2025-09-30 18:03:18.820 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5202MB free_disk=39.9921875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:03:18 compute-1 nova_compute[238822]: 2025-09-30 18:03:18.820 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:03:18 compute-1 nova_compute[238822]: 2025-09-30 18:03:18.821 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:03:18 compute-1 ceph-mon[75484]: pgmap v726: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:18 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4228144495' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:03:19 compute-1 sshd-session[262172]: Received disconnect from 113.249.93.94 port 45660:11: Bye Bye [preauth]
Sep 30 18:03:19 compute-1 sshd-session[262172]: Disconnected from invalid user titu 113.249.93.94 port 45660 [preauth]
Sep 30 18:03:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:19 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558001fc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:19 compute-1 openstack_network_exporter[251957]: ERROR   18:03:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:03:19 compute-1 openstack_network_exporter[251957]: ERROR   18:03:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:03:19 compute-1 openstack_network_exporter[251957]: ERROR   18:03:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:03:19 compute-1 openstack_network_exporter[251957]: ERROR   18:03:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:03:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:03:19 compute-1 openstack_network_exporter[251957]: ERROR   18:03:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:03:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:03:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:19 compute-1 nova_compute[238822]: 2025-09-30 18:03:19.872 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:03:19 compute-1 nova_compute[238822]: 2025-09-30 18:03:19.873 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:03:18 up  3:40,  0 user,  load average: 0.97, 1.53, 1.44\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:03:19 compute-1 nova_compute[238822]: 2025-09-30 18:03:19.888 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:03:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:19 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590009ec0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:20 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:03:20 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1538901174' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:03:20 compute-1 nova_compute[238822]: 2025-09-30 18:03:20.359 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:03:20 compute-1 nova_compute[238822]: 2025-09-30 18:03:20.367 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:03:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:20.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:20.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:20 compute-1 ceph-mon[75484]: pgmap v727: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:03:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1538901174' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:03:20 compute-1 nova_compute[238822]: 2025-09-30 18:03:20.877 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:03:21 compute-1 sshd-session[262244]: Failed password for invalid user scpuser from 216.10.242.161 port 54306 ssh2
Sep 30 18:03:21 compute-1 sshd-session[262272]: Invalid user pt from 14.225.167.110 port 44320
Sep 30 18:03:21 compute-1 sshd-session[262272]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:03:21 compute-1 sshd-session[262272]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:03:21 compute-1 nova_compute[238822]: 2025-09-30 18:03:21.390 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:03:21 compute-1 nova_compute[238822]: 2025-09-30 18:03:21.390 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.570s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:03:21 compute-1 nova_compute[238822]: 2025-09-30 18:03:21.409 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:03:21 compute-1 nova_compute[238822]: 2025-09-30 18:03:21.410 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:03:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:21 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002ad0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:21 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:22 compute-1 unix_chkpwd[262301]: password check failed for user (root)
Sep 30 18:03:22 compute-1 sshd-session[262297]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:03:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:22.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:22 compute-1 sshd-session[262244]: Received disconnect from 216.10.242.161 port 54306:11: Bye Bye [preauth]
Sep 30 18:03:22 compute-1 sshd-session[262244]: Disconnected from invalid user scpuser 216.10.242.161 port 54306 [preauth]
Sep 30 18:03:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:22.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:22 compute-1 sshd-session[262272]: Failed password for invalid user pt from 14.225.167.110 port 44320 ssh2
Sep 30 18:03:22 compute-1 ceph-mon[75484]: pgmap v728: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:03:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:23 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558001fc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:03:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:23 compute-1 sshd-session[262297]: Failed password for root from 192.210.160.141 port 44172 ssh2
Sep 30 18:03:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:23 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590009ec0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:23 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/756163064' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:03:24 compute-1 sshd-session[262272]: Received disconnect from 14.225.167.110 port 44320:11: Bye Bye [preauth]
Sep 30 18:03:24 compute-1 sshd-session[262272]: Disconnected from invalid user pt 14.225.167.110 port 44320 [preauth]
Sep 30 18:03:24 compute-1 sshd-session[262304]: Invalid user sol from 45.148.10.240 port 33954
Sep 30 18:03:24 compute-1 sshd-session[262304]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:03:24 compute-1 sshd-session[262304]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.148.10.240
Sep 30 18:03:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:24.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:24.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:24 compute-1 ceph-mon[75484]: pgmap v729: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:03:24 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3459063351' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:03:25 compute-1 sshd-session[262297]: Connection closed by authenticating user root 192.210.160.141 port 44172 [preauth]
Sep 30 18:03:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:25 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002ad0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:25 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:26 compute-1 sshd-session[262304]: Failed password for invalid user sol from 45.148.10.240 port 33954 ssh2
Sep 30 18:03:26 compute-1 sudo[262308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:03:26 compute-1 sudo[262308]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:03:26 compute-1 sudo[262308]: pam_unix(sudo:session): session closed for user root
Sep 30 18:03:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:26.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:26.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:26 compute-1 sshd-session[262304]: Connection closed by invalid user sol 45.148.10.240 port 33954 [preauth]
Sep 30 18:03:27 compute-1 ceph-mon[75484]: pgmap v730: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:27 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558001fc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:27 compute-1 podman[262334]: 2025-09-30 18:03:27.586566947 +0000 UTC m=+0.134063747 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Sep 30 18:03:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:27 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590009ec0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:03:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:28.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:28.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:29 compute-1 ceph-mon[75484]: pgmap v731: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:29 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002ad0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:29 compute-1 podman[262361]: 2025-09-30 18:03:29.517062657 +0000 UTC m=+0.065487123 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:03:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:29 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:30.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:30.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:31 compute-1 ceph-mon[75484]: pgmap v732: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:03:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:31 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5580032f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:31 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5580032f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:32 compute-1 unix_chkpwd[262392]: password check failed for user (root)
Sep 30 18:03:32 compute-1 sshd-session[262388]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104  user=root
Sep 30 18:03:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:32.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:32.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:33 compute-1 ceph-mon[75484]: pgmap v733: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:33 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5580032f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:03:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:33 compute-1 sshd-session[262388]: Failed password for root from 107.172.146.104 port 34648 ssh2
Sep 30 18:03:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:33 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590009ec0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:34.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:34.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:34 compute-1 sshd-session[262388]: Received disconnect from 107.172.146.104 port 34648:11: Bye Bye [preauth]
Sep 30 18:03:34 compute-1 sshd-session[262388]: Disconnected from authenticating user root 107.172.146.104 port 34648 [preauth]
Sep 30 18:03:35 compute-1 ceph-mon[75484]: pgmap v734: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:03:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:35 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001230 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:35 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:36.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:36 compute-1 podman[262397]: 2025-09-30 18:03:36.578166933 +0000 UTC m=+0.109139693 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 18:03:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:36.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:37 compute-1 ceph-mon[75484]: pgmap v735: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1978917613' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:03:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1978917613' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:03:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:37 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5580032f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:37 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590009ec0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:03:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:03:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:38.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:38 compute-1 sshd-session[262417]: Invalid user azureuser from 84.51.43.58 port 64172
Sep 30 18:03:38 compute-1 sshd-session[262417]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:03:38 compute-1 sshd-session[262417]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:03:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:38.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:39 compute-1 ceph-mon[75484]: pgmap v736: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:39 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564002140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:39 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:40 compute-1 ceph-mon[75484]: pgmap v737: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:03:40 compute-1 sshd-session[262417]: Failed password for invalid user azureuser from 84.51.43.58 port 64172 ssh2
Sep 30 18:03:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:40.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:40.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:40 compute-1 sshd-session[262417]: Received disconnect from 84.51.43.58 port 64172:11: Bye Bye [preauth]
Sep 30 18:03:40 compute-1 sshd-session[262417]: Disconnected from invalid user azureuser 84.51.43.58 port 64172 [preauth]
Sep 30 18:03:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:41 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558004000 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:41 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:42.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:42.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:42 compute-1 ceph-mon[75484]: pgmap v738: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:43 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564002140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:03:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:43 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:44.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:44 compute-1 podman[262426]: 2025-09-30 18:03:44.547744587 +0000 UTC m=+0.080057446 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 18:03:44 compute-1 podman[262428]: 2025-09-30 18:03:44.573016211 +0000 UTC m=+0.101590599 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 18:03:44 compute-1 podman[262427]: 2025-09-30 18:03:44.580638967 +0000 UTC m=+0.106932833 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6)
Sep 30 18:03:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:44.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:44 compute-1 ceph-mon[75484]: pgmap v739: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:03:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:45 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558004000 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:45 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:46 compute-1 unix_chkpwd[262488]: password check failed for user (root)
Sep 30 18:03:46 compute-1 sshd-session[262444]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:03:46 compute-1 sudo[262489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:03:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:46.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:46 compute-1 sudo[262489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:03:46 compute-1 sudo[262489]: pam_unix(sudo:session): session closed for user root
Sep 30 18:03:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:46.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:47 compute-1 ceph-mon[75484]: pgmap v740: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564002140 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:48 compute-1 sshd-session[262444]: Failed password for root from 192.210.160.141 port 51928 ssh2
Sep 30 18:03:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:03:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:48.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:48.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:49 compute-1 ceph-mon[75484]: pgmap v741: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:49 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558004000 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:49 compute-1 sshd-session[262444]: Connection closed by authenticating user root 192.210.160.141 port 51928 [preauth]
Sep 30 18:03:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:49 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:50 compute-1 ceph-mon[75484]: pgmap v742: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:03:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:50.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:50.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:50 compute-1 sudo[262519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:03:50 compute-1 sudo[262519]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:03:50 compute-1 sudo[262519]: pam_unix(sudo:session): session closed for user root
Sep 30 18:03:50 compute-1 sudo[262544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:03:50 compute-1 sudo[262544]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:03:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:51 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640033e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:51 compute-1 sudo[262544]: pam_unix(sudo:session): session closed for user root
Sep 30 18:03:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:03:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:03:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:03:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:03:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:03:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:03:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:03:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:51 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:52.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:52 compute-1 ceph-mon[75484]: pgmap v743: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:03:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:52.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:53 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558004000 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:03:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:53 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:03:54.317 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:03:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:03:54.318 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:03:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:03:54.318 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:03:54 compute-1 sshd-session[262601]: Invalid user agent from 175.126.165.170 port 42650
Sep 30 18:03:54 compute-1 sshd-session[262601]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:03:54 compute-1 sshd-session[262601]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 18:03:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:54.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:54.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:54 compute-1 ceph-mon[75484]: pgmap v744: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:03:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:55 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640033e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.470264) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255435470341, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 680, "num_deletes": 256, "total_data_size": 1122938, "memory_usage": 1140168, "flush_reason": "Manual Compaction"}
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255435478804, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 737998, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23590, "largest_seqno": 24265, "table_properties": {"data_size": 734763, "index_size": 1143, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7347, "raw_average_key_size": 18, "raw_value_size": 728110, "raw_average_value_size": 1788, "num_data_blocks": 53, "num_entries": 407, "num_filter_entries": 407, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759255391, "oldest_key_time": 1759255391, "file_creation_time": 1759255435, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 8604 microseconds, and 5073 cpu microseconds.
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.478874) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 737998 bytes OK
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.478902) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.480782) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.480804) EVENT_LOG_v1 {"time_micros": 1759255435480797, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.480825) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 1119206, prev total WAL file size 1119206, number of live WAL files 2.
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.481690) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323531' seq:72057594037927935, type:22 .. '6C6F676D00353033' seq:0, type:0; will stop at (end)
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(720KB)], [45(9624KB)]
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255435481785, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 10593969, "oldest_snapshot_seqno": -1}
Sep 30 18:03:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 4881 keys, 10472435 bytes, temperature: kUnknown
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255435546403, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 10472435, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10440646, "index_size": 18495, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 125518, "raw_average_key_size": 25, "raw_value_size": 10352726, "raw_average_value_size": 2121, "num_data_blocks": 756, "num_entries": 4881, "num_filter_entries": 4881, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759255435, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.546785) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 10472435 bytes
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.548346) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 163.7 rd, 161.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 9.4 +0.0 blob) out(10.0 +0.0 blob), read-write-amplify(28.5) write-amplify(14.2) OK, records in: 5404, records dropped: 523 output_compression: NoCompression
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.548376) EVENT_LOG_v1 {"time_micros": 1759255435548362, "job": 26, "event": "compaction_finished", "compaction_time_micros": 64712, "compaction_time_cpu_micros": 44224, "output_level": 6, "num_output_files": 1, "total_output_size": 10472435, "num_input_records": 5404, "num_output_records": 4881, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255435548875, "job": 26, "event": "table_file_deletion", "file_number": 47}
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255435552942, "job": 26, "event": "table_file_deletion", "file_number": 45}
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.481549) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.553097) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.553104) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.553106) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.553108) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:03:55 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:03:55.553110) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:03:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:55 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:56 compute-1 sshd-session[262601]: Failed password for invalid user agent from 175.126.165.170 port 42650 ssh2
Sep 30 18:03:56 compute-1 sudo[262607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:03:56 compute-1 sudo[262607]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:03:56 compute-1 sudo[262607]: pam_unix(sudo:session): session closed for user root
Sep 30 18:03:56 compute-1 ceph-mon[75484]: pgmap v745: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:03:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:03:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:03:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:56.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:03:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:03:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:56.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:03:57 compute-1 sshd-session[262601]: Received disconnect from 175.126.165.170 port 42650:11: Bye Bye [preauth]
Sep 30 18:03:57 compute-1 sshd-session[262601]: Disconnected from invalid user agent 175.126.165.170 port 42650 [preauth]
Sep 30 18:03:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:57 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558004000 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Sep 30 18:03:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/888285193' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:03:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Sep 30 18:03:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/888285193' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:03:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/888285193' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:03:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/888285193' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:03:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:57 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:03:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:03:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:03:58.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:03:58 compute-1 ceph-mon[75484]: pgmap v746: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:03:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:58 compute-1 podman[262634]: 2025-09-30 18:03:58.577284832 +0000 UTC m=+0.121470837 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 18:03:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:03:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:03:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:03:58.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:03:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640033e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:03:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:03:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:03:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:03:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:03:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:00.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:00 compute-1 podman[262663]: 2025-09-30 18:04:00.52218479 +0000 UTC m=+0.072403839 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:04:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:00.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:00 compute-1 ceph-mon[75484]: pgmap v747: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:04:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:01 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558004000 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:01 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:02 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:02.308 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:04:02 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:02.309 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:04:02 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:02.311 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:04:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:02.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:02.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:02 compute-1 ceph-mon[75484]: pgmap v748: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:04:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:03 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640033e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:04:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:04 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:04.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:04.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:04 compute-1 ceph-mon[75484]: pgmap v749: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:04:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:05 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560002690 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:05 compute-1 podman[249638]: time="2025-09-30T18:04:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:04:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:04:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:04:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:04:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8757 "" "Go-http-client/1.1"
Sep 30 18:04:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:06 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:04:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:06.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:04:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:06 compute-1 sudo[262697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:04:06 compute-1 sudo[262697]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:04:06 compute-1 sudo[262697]: pam_unix(sudo:session): session closed for user root
Sep 30 18:04:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:06.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:06 compute-1 podman[262721]: 2025-09-30 18:04:06.70636496 +0000 UTC m=+0.067851556 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:04:06 compute-1 ceph-mon[75484]: pgmap v750: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:04:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:07 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:04:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:08 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0015e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:04:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:08.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:08.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/180408 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 18:04:08 compute-1 ceph-mon[75484]: pgmap v751: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:04:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:09 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560002690 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:09 compute-1 unix_chkpwd[262748]: password check failed for user (root)
Sep 30 18:04:09 compute-1 sshd-session[262745]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65  user=root
Sep 30 18:04:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:10 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:10.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:10.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:10 compute-1 unix_chkpwd[262751]: password check failed for user (root)
Sep 30 18:04:10 compute-1 sshd-session[262744]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:04:10 compute-1 ceph-mon[75484]: pgmap v752: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:04:11 compute-1 sshd-session[262745]: Failed password for root from 194.107.115.65 port 53914 ssh2
Sep 30 18:04:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:11 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:12 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0015e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:12 compute-1 sshd-session[262688]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:04:12 compute-1 sshd-session[262688]: banner exchange: Connection from 110.42.70.108 port 34602: Connection timed out
Sep 30 18:04:12 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:12.132 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:19:4a 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-54e2622d-59e1-4c7e-bc2b-c69118964b19', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54e2622d-59e1-4c7e-bc2b-c69118964b19', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4e2dde567e5c4b1c9802c64cfc281b6d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7703b222-e0b2-4b5b-938b-dfe8edf1b48b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=dd3e3f69-f69e-4073-bbf8-546ab98d1097) old=Port_Binding(mac=['fa:16:3e:8f:19:4a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-54e2622d-59e1-4c7e-bc2b-c69118964b19', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54e2622d-59e1-4c7e-bc2b-c69118964b19', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4e2dde567e5c4b1c9802c64cfc281b6d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:04:12 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:12.133 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port dd3e3f69-f69e-4073-bbf8-546ab98d1097 in datapath 54e2622d-59e1-4c7e-bc2b-c69118964b19 updated
Sep 30 18:04:12 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:12.135 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 54e2622d-59e1-4c7e-bc2b-c69118964b19, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:04:12 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:12.135 144543 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpk8h8yacv/privsep.sock']
Sep 30 18:04:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:12.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:12 compute-1 sshd-session[262745]: Received disconnect from 194.107.115.65 port 53914:11: Bye Bye [preauth]
Sep 30 18:04:12 compute-1 sshd-session[262745]: Disconnected from authenticating user root 194.107.115.65 port 53914 [preauth]
Sep 30 18:04:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:12.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:12 compute-1 sshd-session[262744]: Failed password for root from 192.210.160.141 port 37970 ssh2
Sep 30 18:04:13 compute-1 ceph-mon[75484]: pgmap v753: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:04:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:13.012 144543 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 18:04:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:13.013 144543 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpk8h8yacv/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Sep 30 18:04:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:12.851 262759 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 18:04:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:12.858 262759 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 18:04:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:12.863 262759 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Sep 30 18:04:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:12.865 262759 INFO oslo.privsep.daemon [-] privsep daemon running as pid 262759
Sep 30 18:04:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:13.015 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[45342913-f138-4aae-87f5-50cb1a5c2f16]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:04:13 compute-1 nova_compute[238822]: 2025-09-30 18:04:13.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:04:13 compute-1 nova_compute[238822]: 2025-09-30 18:04:13.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 18:04:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:13 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560002690 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:04:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:13 compute-1 nova_compute[238822]: 2025-09-30 18:04:13.566 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 18:04:13 compute-1 nova_compute[238822]: 2025-09-30 18:04:13.568 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:04:13 compute-1 nova_compute[238822]: 2025-09-30 18:04:13.569 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 18:04:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:13.947 262759 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:04:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:13.947 262759 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:04:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:13.947 262759 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:04:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:14 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:14 compute-1 nova_compute[238822]: 2025-09-30 18:04:14.076 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:04:14 compute-1 sshd-session[262744]: Connection closed by authenticating user root 192.210.160.141 port 37970 [preauth]
Sep 30 18:04:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:14.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:14.630 262759 INFO oslo_service.backend [-] Loading backend: eventlet
Sep 30 18:04:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:14.642 262759 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Sep 30 18:04:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:14.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:14.722 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e9d45583-f392-421c-9f8e-0f86bb9c7ff4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:04:15 compute-1 ceph-mon[75484]: pgmap v754: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:04:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:15 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:15 compute-1 podman[262766]: 2025-09-30 18:04:15.543189111 +0000 UTC m=+0.077991310 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930)
Sep 30 18:04:15 compute-1 nova_compute[238822]: 2025-09-30 18:04:15.584 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:04:15 compute-1 nova_compute[238822]: 2025-09-30 18:04:15.584 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:04:15 compute-1 podman[262768]: 2025-09-30 18:04:15.586868432 +0000 UTC m=+0.101641960 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 18:04:15 compute-1 podman[262767]: 2025-09-30 18:04:15.588079825 +0000 UTC m=+0.106724697 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, version=9.6)
Sep 30 18:04:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:16 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0015e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:16 compute-1 nova_compute[238822]: 2025-09-30 18:04:16.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:04:16 compute-1 nova_compute[238822]: 2025-09-30 18:04:16.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:04:16 compute-1 nova_compute[238822]: 2025-09-30 18:04:16.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:04:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:16.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:16.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:17 compute-1 ceph-mon[75484]: pgmap v755: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:04:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:17 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560002690 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:17 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 18:04:17 compute-1 sshd[170789]: Timeout before authentication for connection from 110.42.70.108 to 38.102.83.102, pid = 261662
Sep 30 18:04:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:18 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:18 compute-1 nova_compute[238822]: 2025-09-30 18:04:18.054 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:04:18 compute-1 nova_compute[238822]: 2025-09-30 18:04:18.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:04:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:04:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:18.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:18.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:19 compute-1 nova_compute[238822]: 2025-09-30 18:04:19.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:04:19 compute-1 ceph-mon[75484]: pgmap v756: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:04:19 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1529259135' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:04:19 compute-1 openstack_network_exporter[251957]: ERROR   18:04:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:04:19 compute-1 openstack_network_exporter[251957]: ERROR   18:04:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:04:19 compute-1 openstack_network_exporter[251957]: ERROR   18:04:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:04:19 compute-1 openstack_network_exporter[251957]: ERROR   18:04:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:04:19 compute-1 openstack_network_exporter[251957]: ERROR   18:04:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:04:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:19 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:19 compute-1 nova_compute[238822]: 2025-09-30 18:04:19.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:04:19 compute-1 nova_compute[238822]: 2025-09-30 18:04:19.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:04:19 compute-1 nova_compute[238822]: 2025-09-30 18:04:19.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:04:19 compute-1 nova_compute[238822]: 2025-09-30 18:04:19.574 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:04:19 compute-1 nova_compute[238822]: 2025-09-30 18:04:19.574 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:04:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:20 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:20 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:04:20 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3575243734' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:04:20 compute-1 nova_compute[238822]: 2025-09-30 18:04:20.081 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:04:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3575243734' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:04:20 compute-1 nova_compute[238822]: 2025-09-30 18:04:20.343 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:04:20 compute-1 nova_compute[238822]: 2025-09-30 18:04:20.345 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:04:20 compute-1 nova_compute[238822]: 2025-09-30 18:04:20.373 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:04:20 compute-1 nova_compute[238822]: 2025-09-30 18:04:20.374 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5051MB free_disk=39.9921875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:04:20 compute-1 nova_compute[238822]: 2025-09-30 18:04:20.375 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:04:20 compute-1 nova_compute[238822]: 2025-09-30 18:04:20.375 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:04:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:20.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:20 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 18:04:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:20 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 18:04:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:20.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:21 compute-1 ceph-mon[75484]: pgmap v757: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:04:21 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3199141060' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:04:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:21 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560002690 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:21 compute-1 nova_compute[238822]: 2025-09-30 18:04:21.575 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:04:21 compute-1 nova_compute[238822]: 2025-09-30 18:04:21.576 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:04:20 up  3:41,  0 user,  load average: 0.54, 1.29, 1.37\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:04:21 compute-1 nova_compute[238822]: 2025-09-30 18:04:21.723 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:04:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:22 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:22 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:04:22 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/577583544' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:04:22 compute-1 ceph-mon[75484]: pgmap v758: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:04:22 compute-1 nova_compute[238822]: 2025-09-30 18:04:22.165 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:04:22 compute-1 nova_compute[238822]: 2025-09-30 18:04:22.175 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:04:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:22.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:22 compute-1 nova_compute[238822]: 2025-09-30 18:04:22.682 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:04:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:22.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:22 compute-1 sshd-session[262856]: Invalid user seekcy from 216.10.242.161 port 41620
Sep 30 18:04:22 compute-1 sshd-session[262856]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:04:22 compute-1 sshd-session[262856]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:04:23 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/577583544' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:04:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:04:23 compute-1 nova_compute[238822]: 2025-09-30 18:04:23.193 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:04:23 compute-1 nova_compute[238822]: 2025-09-30 18:04:23.193 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.818s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:04:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:23 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:04:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:23 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 18:04:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:24 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:24 compute-1 ceph-mon[75484]: pgmap v759: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:04:24 compute-1 nova_compute[238822]: 2025-09-30 18:04:24.194 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:04:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:24.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:24.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:24 compute-1 sshd-session[262856]: Failed password for invalid user seekcy from 216.10.242.161 port 41620 ssh2
Sep 30 18:04:25 compute-1 sshd-session[262856]: Received disconnect from 216.10.242.161 port 41620:11: Bye Bye [preauth]
Sep 30 18:04:25 compute-1 sshd-session[262856]: Disconnected from invalid user seekcy 216.10.242.161 port 41620 [preauth]
Sep 30 18:04:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:25 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5600037d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:26 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:26.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:26.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:26 compute-1 sudo[262885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:04:26 compute-1 sudo[262885]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:04:26 compute-1 sudo[262885]: pam_unix(sudo:session): session closed for user root
Sep 30 18:04:26 compute-1 ceph-mon[75484]: pgmap v760: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 1.3 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:04:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:27 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c001da0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:27 compute-1 sshd-session[262912]: Invalid user admin from 78.128.112.74 port 41798
Sep 30 18:04:27 compute-1 sshd-session[262912]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:04:27 compute-1 sshd-session[262912]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=78.128.112.74
Sep 30 18:04:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:28 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:28 compute-1 sshd-session[262910]: Invalid user open from 14.225.167.110 port 54164
Sep 30 18:04:28 compute-1 sshd-session[262910]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:04:28 compute-1 sshd-session[262910]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:04:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:04:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:28.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:28.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:28 compute-1 unix_chkpwd[262918]: password check failed for user (root)
Sep 30 18:04:28 compute-1 sshd-session[262915]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104  user=root
Sep 30 18:04:28 compute-1 sshd-session[262828]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:04:28 compute-1 sshd-session[262828]: banner exchange: Connection from 113.249.93.94 port 60112: Connection timed out
Sep 30 18:04:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/180428 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 18:04:28 compute-1 ceph-mon[75484]: pgmap v761: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 1.3 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:04:29 compute-1 sshd-session[262912]: Failed password for invalid user admin from 78.128.112.74 port 41798 ssh2
Sep 30 18:04:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:29 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5600037d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:29 compute-1 podman[262919]: 2025-09-30 18:04:29.618828152 +0000 UTC m=+0.151897089 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible)
Sep 30 18:04:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:30 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:30 compute-1 sshd-session[262910]: Failed password for invalid user open from 14.225.167.110 port 54164 ssh2
Sep 30 18:04:30 compute-1 sshd-session[262915]: Failed password for root from 107.172.146.104 port 45964 ssh2
Sep 30 18:04:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:30 compute-1 sshd-session[262910]: Received disconnect from 14.225.167.110 port 54164:11: Bye Bye [preauth]
Sep 30 18:04:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:30.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:30 compute-1 sshd-session[262910]: Disconnected from invalid user open 14.225.167.110 port 54164 [preauth]
Sep 30 18:04:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:30.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:30 compute-1 ceph-mon[75484]: pgmap v762: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 1.6 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 18:04:30 compute-1 sshd-session[262912]: Connection closed by invalid user admin 78.128.112.74 port 41798 [preauth]
Sep 30 18:04:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:31 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:31 compute-1 sshd-session[262915]: Received disconnect from 107.172.146.104 port 45964:11: Bye Bye [preauth]
Sep 30 18:04:31 compute-1 sshd-session[262915]: Disconnected from authenticating user root 107.172.146.104 port 45964 [preauth]
Sep 30 18:04:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:31 compute-1 podman[262948]: 2025-09-30 18:04:31.554373568 +0000 UTC m=+0.095249697 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:04:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:32 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a800 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:32.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:32.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:32 compute-1 ceph-mon[75484]: pgmap v763: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:04:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:04:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:33 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5600037d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:34 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:04:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:34.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:04:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:34.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:34 compute-1 sshd-session[262975]: Invalid user ubuntu from 192.210.160.141 port 48978
Sep 30 18:04:35 compute-1 sshd-session[262975]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:04:35 compute-1 sshd-session[262975]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:04:35 compute-1 ceph-mon[75484]: pgmap v764: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:04:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:35 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:35 compute-1 podman[249638]: time="2025-09-30T18:04:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:04:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:04:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:04:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:04:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8766 "" "Go-http-client/1.1"
Sep 30 18:04:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:36 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:36.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:36.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:37 compute-1 ceph-mon[75484]: pgmap v765: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:04:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/502183397' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:04:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/502183397' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:04:37 compute-1 sshd-session[262975]: Failed password for invalid user ubuntu from 192.210.160.141 port 48978 ssh2
Sep 30 18:04:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:37 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558001bd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:37 compute-1 podman[262984]: 2025-09-30 18:04:37.539246878 +0000 UTC m=+0.080740865 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:04:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:38 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558001bd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:04:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:04:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:38.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:38.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:38 compute-1 sshd-session[262975]: Connection closed by invalid user ubuntu 192.210.160.141 port 48978 [preauth]
Sep 30 18:04:39 compute-1 ceph-mon[75484]: pgmap v766: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:04:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:39 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:40 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:40 compute-1 ceph-mon[75484]: pgmap v767: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:04:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:40.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:40.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:41 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:42 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:42.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:42.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:42 compute-1 ceph-mon[75484]: pgmap v768: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:04:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:04:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:43 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:44 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:44.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:44.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:44 compute-1 ceph-mon[75484]: pgmap v769: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:04:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:45 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:46 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:46 compute-1 podman[263013]: 2025-09-30 18:04:46.560191926 +0000 UTC m=+0.096013347 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 18:04:46 compute-1 podman[263015]: 2025-09-30 18:04:46.565753237 +0000 UTC m=+0.102264807 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 18:04:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:46.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:46 compute-1 podman[263014]: 2025-09-30 18:04:46.582857419 +0000 UTC m=+0.111321091 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, config_id=edpm, maintainer=Red Hat, Inc.)
Sep 30 18:04:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:46.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:46 compute-1 sudo[263076]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:04:46 compute-1 sudo[263076]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:04:46 compute-1 sudo[263076]: pam_unix(sudo:session): session closed for user root
Sep 30 18:04:46 compute-1 ceph-mon[75484]: pgmap v770: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:04:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:48 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5600037d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:04:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:48.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:48.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:48 compute-1 ceph-mon[75484]: pgmap v771: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:04:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.003000081s ======
Sep 30 18:04:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - - [30/Sep/2025:18:04:49.281 +0000] "GET /swift/info HTTP/1.1" 200 539 - "python-urllib3/1.26.19" - latency=0.003000081s
Sep 30 18:04:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - - [30/Sep/2025:18:04:49.294 +0000] "GET /swift/healthcheck HTTP/1.1" 200 0 - "python-urllib3/1.26.19" - latency=0.001000027s
Sep 30 18:04:49 compute-1 openstack_network_exporter[251957]: ERROR   18:04:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:04:49 compute-1 openstack_network_exporter[251957]: ERROR   18:04:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:04:49 compute-1 openstack_network_exporter[251957]: ERROR   18:04:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:04:49 compute-1 openstack_network_exporter[251957]: ERROR   18:04:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:04:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:04:49 compute-1 openstack_network_exporter[251957]: ERROR   18:04:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:04:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:04:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:49 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:50 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:50.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:50.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:50 compute-1 ceph-mon[75484]: pgmap v772: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:04:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:51 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:52 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5600037d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:52.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:52.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:52 compute-1 unix_chkpwd[263109]: password check failed for user (openvswitch)
Sep 30 18:04:52 compute-1 sshd-session[263107]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.71.248.239  user=openvswitch
Sep 30 18:04:52 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e130 e130: 2 total, 2 up, 2 in
Sep 30 18:04:52 compute-1 ceph-mon[75484]: pgmap v773: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:04:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:04:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:04:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:53 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:53 compute-1 ceph-mon[75484]: osdmap e130: 2 total, 2 up, 2 in
Sep 30 18:04:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e131 e131: 2 total, 2 up, 2 in
Sep 30 18:04:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:54 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:54.324 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:04:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:54.333 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.010s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:04:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:04:54.334 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:04:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:54.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:54.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:54 compute-1 sshd-session[263107]: Failed password for openvswitch from 167.71.248.239 port 52088 ssh2
Sep 30 18:04:54 compute-1 sshd-session[263107]: Connection closed by authenticating user openvswitch 167.71.248.239 port 52088 [preauth]
Sep 30 18:04:55 compute-1 ceph-mon[75484]: pgmap v775: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 204 B/s rd, 102 B/s wr, 0 op/s
Sep 30 18:04:55 compute-1 ceph-mon[75484]: osdmap e131: 2 total, 2 up, 2 in
Sep 30 18:04:55 compute-1 sshd-session[263111]: Invalid user app from 84.51.43.58 port 37741
Sep 30 18:04:55 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e132 e132: 2 total, 2 up, 2 in
Sep 30 18:04:55 compute-1 sshd-session[263111]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:04:55 compute-1 sshd-session[263111]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:04:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:55 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558001bd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:56 compute-1 ceph-mon[75484]: osdmap e132: 2 total, 2 up, 2 in
Sep 30 18:04:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:56 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5600037d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:56 compute-1 sudo[263117]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:04:56 compute-1 sudo[263117]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:04:56 compute-1 sudo[263117]: pam_unix(sudo:session): session closed for user root
Sep 30 18:04:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:56 compute-1 sudo[263142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:04:56 compute-1 sudo[263142]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:04:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:56.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:04:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:56.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:04:56 compute-1 sshd-session[263111]: Failed password for invalid user app from 84.51.43.58 port 37741 ssh2
Sep 30 18:04:57 compute-1 ceph-mon[75484]: pgmap v778: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 170 B/s wr, 0 op/s
Sep 30 18:04:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e133 e133: 2 total, 2 up, 2 in
Sep 30 18:04:57 compute-1 sshd-session[263111]: Received disconnect from 84.51.43.58 port 37741:11: Bye Bye [preauth]
Sep 30 18:04:57 compute-1 sshd-session[263111]: Disconnected from invalid user app 84.51.43.58 port 37741 [preauth]
Sep 30 18:04:57 compute-1 sudo[263142]: pam_unix(sudo:session): session closed for user root
Sep 30 18:04:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:57 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:58 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:58 compute-1 ceph-mon[75484]: osdmap e133: 2 total, 2 up, 2 in
Sep 30 18:04:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:04:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:04:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:04:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:04:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:04:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:04:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:04:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2190557213' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:04:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2190557213' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:04:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:04:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:04:58.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:04:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:04:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:04:58.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:04:59 compute-1 ceph-mon[75484]: pgmap v780: 353 pgs: 353 active+clean; 458 KiB data, 107 MiB used, 40 GiB / 40 GiB avail; 211 B/s wr, 0 op/s
Sep 30 18:04:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:04:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:04:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:04:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:04:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:04:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:00 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5600037d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:00 compute-1 ceph-mon[75484]: pgmap v781: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 44 KiB/s rd, 6.8 MiB/s wr, 63 op/s
Sep 30 18:05:00 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 e134: 2 total, 2 up, 2 in
Sep 30 18:05:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:00.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:00 compute-1 podman[263206]: 2025-09-30 18:05:00.657516196 +0000 UTC m=+0.187406229 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:05:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:00.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:01 compute-1 unix_chkpwd[263235]: password check failed for user (root)
Sep 30 18:05:01 compute-1 sshd-session[263202]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:05:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:01 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:01 compute-1 ceph-mon[75484]: osdmap e134: 2 total, 2 up, 2 in
Sep 30 18:05:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:02 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:02 compute-1 ceph-mon[75484]: pgmap v783: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 39 KiB/s rd, 6.0 MiB/s wr, 56 op/s
Sep 30 18:05:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:05:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:05:02 compute-1 podman[263237]: 2025-09-30 18:05:02.542939777 +0000 UTC m=+0.079940133 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:05:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:02.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:02 compute-1 sudo[263261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:05:02 compute-1 sudo[263261]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:05:02 compute-1 sudo[263261]: pam_unix(sudo:session): session closed for user root
Sep 30 18:05:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:02.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:05:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:03 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:03 compute-1 sshd-session[263202]: Failed password for root from 192.210.160.141 port 55914 ssh2
Sep 30 18:05:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:04 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5600037d0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:04 compute-1 sshd-session[263202]: Connection closed by authenticating user root 192.210.160.141 port 55914 [preauth]
Sep 30 18:05:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:04.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:04.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:04 compute-1 ceph-mon[75484]: pgmap v784: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 47 op/s
Sep 30 18:05:05 compute-1 sshd-session[263114]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:05:05 compute-1 sshd-session[263114]: banner exchange: Connection from 14.103.129.43 port 48702: Connection timed out
Sep 30 18:05:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:05 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558002e70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:05 compute-1 podman[249638]: time="2025-09-30T18:05:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:05:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:05:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:05:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:05:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8769 "" "Go-http-client/1.1"
Sep 30 18:05:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:06 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:06 compute-1 sshd-session[263292]: Invalid user user5 from 167.172.43.167 port 41180
Sep 30 18:05:06 compute-1 sshd-session[263292]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:05:06 compute-1 sshd-session[263292]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167
Sep 30 18:05:06 compute-1 unix_chkpwd[263295]: password check failed for user (root)
Sep 30 18:05:06 compute-1 sshd-session[263289]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:05:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:06.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:06.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:06 compute-1 ceph-mon[75484]: pgmap v785: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 30 KiB/s rd, 4.7 MiB/s wr, 43 op/s
Sep 30 18:05:06 compute-1 sudo[263297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:05:06 compute-1 sudo[263297]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:05:06 compute-1 sudo[263297]: pam_unix(sudo:session): session closed for user root
Sep 30 18:05:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:07 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c0014d0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:05:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:08 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:08 compute-1 sshd-session[263292]: Failed password for invalid user user5 from 167.172.43.167 port 41180 ssh2
Sep 30 18:05:08 compute-1 sshd-session[263289]: Failed password for root from 175.126.165.170 port 34720 ssh2
Sep 30 18:05:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:05:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:08 compute-1 podman[263324]: 2025-09-30 18:05:08.564191929 +0000 UTC m=+0.099291496 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930)
Sep 30 18:05:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:08.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:08.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:08 compute-1 ceph-mon[75484]: pgmap v786: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 26 KiB/s rd, 4.1 MiB/s wr, 38 op/s
Sep 30 18:05:09 compute-1 sshd-session[263292]: Received disconnect from 167.172.43.167 port 41180:11: Bye Bye [preauth]
Sep 30 18:05:09 compute-1 sshd-session[263292]: Disconnected from invalid user user5 167.172.43.167 port 41180 [preauth]
Sep 30 18:05:09 compute-1 sshd-session[263289]: Received disconnect from 175.126.165.170 port 34720:11: Bye Bye [preauth]
Sep 30 18:05:09 compute-1 sshd-session[263289]: Disconnected from authenticating user root 175.126.165.170 port 34720 [preauth]
Sep 30 18:05:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:09 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa59000a1b0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:10 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:10.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:10.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:10 compute-1 ceph-mon[75484]: pgmap v787: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 204 B/s rd, 0 op/s
Sep 30 18:05:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:11 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c0014d0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:12 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:12.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:12.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:12 compute-1 ceph-mon[75484]: pgmap v788: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 180 B/s rd, 0 op/s
Sep 30 18:05:13 compute-1 nova_compute[238822]: 2025-09-30 18:05:13.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:05:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:05:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:13 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:14 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:14.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:14.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:14 compute-1 ceph-mon[75484]: pgmap v789: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:15 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c001670 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:16 compute-1 nova_compute[238822]: 2025-09-30 18:05:16.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:05:16 compute-1 nova_compute[238822]: 2025-09-30 18:05:16.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:05:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:16 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:16.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:16.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:16 compute-1 sshd-session[263350]: Invalid user localuser from 194.107.115.65 port 21880
Sep 30 18:05:16 compute-1 sshd-session[263350]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:05:16 compute-1 sshd-session[263350]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 18:05:16 compute-1 podman[263356]: 2025-09-30 18:05:16.949184359 +0000 UTC m=+0.112971986 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 18:05:16 compute-1 podman[263354]: 2025-09-30 18:05:16.951161222 +0000 UTC m=+0.129712689 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible)
Sep 30 18:05:16 compute-1 podman[263355]: 2025-09-30 18:05:16.955290124 +0000 UTC m=+0.122957496 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=edpm, distribution-scope=public)
Sep 30 18:05:16 compute-1 ceph-mon[75484]: pgmap v790: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:17 compute-1 nova_compute[238822]: 2025-09-30 18:05:17.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:05:17 compute-1 nova_compute[238822]: 2025-09-30 18:05:17.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:05:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:17 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590008dc0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:18 compute-1 nova_compute[238822]: 2025-09-30 18:05:18.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:05:18 compute-1 nova_compute[238822]: 2025-09-30 18:05:18.055 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:05:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:18 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:05:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:18.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.004000108s ======
Sep 30 18:05:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:18.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000108s
Sep 30 18:05:18 compute-1 sshd-session[263350]: Failed password for invalid user localuser from 194.107.115.65 port 21880 ssh2
Sep 30 18:05:19 compute-1 ceph-mon[75484]: pgmap v791: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:19 compute-1 nova_compute[238822]: 2025-09-30 18:05:19.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:05:19 compute-1 nova_compute[238822]: 2025-09-30 18:05:19.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:05:19 compute-1 openstack_network_exporter[251957]: ERROR   18:05:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:05:19 compute-1 openstack_network_exporter[251957]: ERROR   18:05:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:05:19 compute-1 openstack_network_exporter[251957]: ERROR   18:05:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:05:19 compute-1 openstack_network_exporter[251957]: ERROR   18:05:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:05:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:05:19 compute-1 openstack_network_exporter[251957]: ERROR   18:05:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:05:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:19 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c002050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:19 compute-1 sshd-session[263350]: Received disconnect from 194.107.115.65 port 21880:11: Bye Bye [preauth]
Sep 30 18:05:19 compute-1 sshd-session[263350]: Disconnected from invalid user localuser 194.107.115.65 port 21880 [preauth]
Sep 30 18:05:19 compute-1 nova_compute[238822]: 2025-09-30 18:05:19.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:05:19 compute-1 nova_compute[238822]: 2025-09-30 18:05:19.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:05:19 compute-1 nova_compute[238822]: 2025-09-30 18:05:19.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:05:19 compute-1 nova_compute[238822]: 2025-09-30 18:05:19.574 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:05:19 compute-1 nova_compute[238822]: 2025-09-30 18:05:19.574 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:05:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2351808412' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:05:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:20 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:20 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:05:20 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3501653568' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:05:20 compute-1 nova_compute[238822]: 2025-09-30 18:05:20.141 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.567s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:05:20 compute-1 nova_compute[238822]: 2025-09-30 18:05:20.421 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:05:20 compute-1 nova_compute[238822]: 2025-09-30 18:05:20.423 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:05:20 compute-1 nova_compute[238822]: 2025-09-30 18:05:20.449 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:05:20 compute-1 nova_compute[238822]: 2025-09-30 18:05:20.450 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5060MB free_disk=39.9921875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:05:20 compute-1 nova_compute[238822]: 2025-09-30 18:05:20.450 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:05:20 compute-1 nova_compute[238822]: 2025-09-30 18:05:20.451 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:05:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:20.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:20.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:21 compute-1 ceph-mon[75484]: pgmap v792: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:05:21 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3501653568' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:05:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:21 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590008dc0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:21 compute-1 nova_compute[238822]: 2025-09-30 18:05:21.540 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:05:21 compute-1 nova_compute[238822]: 2025-09-30 18:05:21.541 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:05:20 up  3:42,  0 user,  load average: 0.37, 1.10, 1.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:05:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:21 compute-1 nova_compute[238822]: 2025-09-30 18:05:21.621 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing inventories for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 18:05:21 compute-1 nova_compute[238822]: 2025-09-30 18:05:21.669 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating ProviderTree inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 18:05:21 compute-1 nova_compute[238822]: 2025-09-30 18:05:21.670 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 18:05:21 compute-1 nova_compute[238822]: 2025-09-30 18:05:21.689 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing aggregate associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 18:05:21 compute-1 nova_compute[238822]: 2025-09-30 18:05:21.719 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing trait associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE41,HW_CPU_X86_ABM,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_TIS,HW_ARCH_X86_64
,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 18:05:21 compute-1 nova_compute[238822]: 2025-09-30 18:05:21.736 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:05:22 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/910263476' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:05:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:22 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:22 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:05:22 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/31460186' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:05:22 compute-1 nova_compute[238822]: 2025-09-30 18:05:22.245 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:05:22 compute-1 nova_compute[238822]: 2025-09-30 18:05:22.252 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:05:22 compute-1 unix_chkpwd[263467]: password check failed for user (root)
Sep 30 18:05:22 compute-1 sshd-session[263465]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104  user=root
Sep 30 18:05:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:22.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:22 compute-1 nova_compute[238822]: 2025-09-30 18:05:22.763 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:05:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:22.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:23 compute-1 ceph-mon[75484]: pgmap v793: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:23 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/31460186' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:05:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:05:23 compute-1 nova_compute[238822]: 2025-09-30 18:05:23.277 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:05:23 compute-1 nova_compute[238822]: 2025-09-30 18:05:23.277 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.826s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:05:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:05:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:23 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c002050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:23 compute-1 sshd-session[263468]: Invalid user pt from 216.10.242.161 port 57258
Sep 30 18:05:23 compute-1 sshd-session[263468]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:05:23 compute-1 sshd-session[263468]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:05:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:24 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:24 compute-1 sshd-session[263465]: Failed password for root from 107.172.146.104 port 43790 ssh2
Sep 30 18:05:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:05:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:24.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:05:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:24.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:24 compute-1 unix_chkpwd[263475]: password check failed for user (root)
Sep 30 18:05:24 compute-1 sshd-session[263471]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:05:25 compute-1 ceph-mon[75484]: pgmap v794: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:25 compute-1 sshd-session[263465]: Received disconnect from 107.172.146.104 port 43790:11: Bye Bye [preauth]
Sep 30 18:05:25 compute-1 sshd-session[263465]: Disconnected from authenticating user root 107.172.146.104 port 43790 [preauth]
Sep 30 18:05:25 compute-1 nova_compute[238822]: 2025-09-30 18:05:25.277 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:05:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:25 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:25 compute-1 sshd-session[263468]: Failed password for invalid user pt from 216.10.242.161 port 57258 ssh2
Sep 30 18:05:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:26 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590008de0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:26 compute-1 sshd-session[263468]: Received disconnect from 216.10.242.161 port 57258:11: Bye Bye [preauth]
Sep 30 18:05:26 compute-1 sshd-session[263468]: Disconnected from invalid user pt 216.10.242.161 port 57258 [preauth]
Sep 30 18:05:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:26.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:26.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:27 compute-1 sshd-session[263471]: Failed password for root from 192.210.160.141 port 55392 ssh2
Sep 30 18:05:27 compute-1 sudo[263478]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:05:27 compute-1 sudo[263478]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:05:27 compute-1 sudo[263478]: pam_unix(sudo:session): session closed for user root
Sep 30 18:05:27 compute-1 ceph-mon[75484]: pgmap v795: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:27 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c002050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:27 compute-1 sshd-session[263471]: Connection closed by authenticating user root 192.210.160.141 port 55392 [preauth]
Sep 30 18:05:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:28 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:05:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:28.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:28.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:29 compute-1 ceph-mon[75484]: pgmap v796: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:29 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:30 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590000df0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:30.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:30.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:31 compute-1 ceph-mon[75484]: pgmap v797: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:05:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:31 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c0021f0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:31 compute-1 podman[263508]: 2025-09-30 18:05:31.586452691 +0000 UTC m=+0.125679680 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 18:05:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:32 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:32 compute-1 ceph-mon[75484]: pgmap v798: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:32.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:32.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:05:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:33 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:33 compute-1 podman[263536]: 2025-09-30 18:05:33.553211411 +0000 UTC m=+0.095954696 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:05:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:34 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590000df0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:34.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:34.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:34 compute-1 ceph-mon[75484]: pgmap v799: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:35 compute-1 sshd-session[263561]: Invalid user ftpuser from 14.225.167.110 port 49884
Sep 30 18:05:35 compute-1 sshd-session[263561]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:05:35 compute-1 sshd-session[263561]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:05:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:35 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c0021f0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:35 compute-1 podman[249638]: time="2025-09-30T18:05:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:05:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:05:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:05:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:05:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8777 "" "Go-http-client/1.1"
Sep 30 18:05:35 compute-1 sshd-session[263563]: Invalid user vas from 103.153.190.105 port 36678
Sep 30 18:05:35 compute-1 sshd-session[263563]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:05:35 compute-1 sshd-session[263563]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:05:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:36 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Sep 30 18:05:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4252940435' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:05:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Sep 30 18:05:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4252940435' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:05:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:36.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:36.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:36 compute-1 ceph-mon[75484]: pgmap v800: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4252940435' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:05:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4252940435' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:05:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:37 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa564001d50 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:37 compute-1 sshd-session[263561]: Failed password for invalid user ftpuser from 14.225.167.110 port 49884 ssh2
Sep 30 18:05:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:05:37 compute-1 sshd-session[263563]: Failed password for invalid user vas from 103.153.190.105 port 36678 ssh2
Sep 30 18:05:38 compute-1 sshd-session[263503]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:05:38 compute-1 sshd-session[263503]: banner exchange: Connection from 113.249.93.94 port 10022: Connection timed out
Sep 30 18:05:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:38 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa590000df0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:38 compute-1 sshd-session[263563]: Received disconnect from 103.153.190.105 port 36678:11: Bye Bye [preauth]
Sep 30 18:05:38 compute-1 sshd-session[263563]: Disconnected from invalid user vas 103.153.190.105 port 36678 [preauth]
Sep 30 18:05:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:05:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:38.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:38 compute-1 sshd-session[263561]: Received disconnect from 14.225.167.110 port 49884:11: Bye Bye [preauth]
Sep 30 18:05:38 compute-1 sshd-session[263561]: Disconnected from invalid user ftpuser 14.225.167.110 port 49884 [preauth]
Sep 30 18:05:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:38.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:38 compute-1 ceph-mon[75484]: pgmap v801: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:39 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:39 compute-1 podman[263571]: 2025-09-30 18:05:39.56177886 +0000 UTC m=+0.102196036 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:05:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:40 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:40.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:40.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:41 compute-1 ceph-mon[75484]: pgmap v802: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:05:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:41 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:42 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560001090 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:42.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:42.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:43 compute-1 ceph-mon[75484]: pgmap v803: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:05:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:43 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:44 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:44.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:44.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:45 compute-1 ceph-mon[75484]: pgmap v804: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:45 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:46 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560001090 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:46.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:46.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:47 compute-1 ceph-mon[75484]: pgmap v805: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:47 compute-1 sudo[263601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:05:47 compute-1 sudo[263601]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:05:47 compute-1 sudo[263601]: pam_unix(sudo:session): session closed for user root
Sep 30 18:05:47 compute-1 podman[263625]: 2025-09-30 18:05:47.29857523 +0000 UTC m=+0.081770583 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 18:05:47 compute-1 podman[263626]: 2025-09-30 18:05:47.323991487 +0000 UTC m=+0.096532272 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Sep 30 18:05:47 compute-1 podman[263627]: 2025-09-30 18:05:47.3289074 +0000 UTC m=+0.094047164 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 18:05:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560001090 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:48 compute-1 sshd-session[263598]: Invalid user guest from 192.210.160.141 port 48870
Sep 30 18:05:48 compute-1 sshd-session[263598]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:05:48 compute-1 sshd-session[263598]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:05:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:48 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:05:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:48.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:48.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:49 compute-1 ceph-mon[75484]: pgmap v806: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:49 compute-1 openstack_network_exporter[251957]: ERROR   18:05:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:05:49 compute-1 openstack_network_exporter[251957]: ERROR   18:05:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:05:49 compute-1 openstack_network_exporter[251957]: ERROR   18:05:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:05:49 compute-1 openstack_network_exporter[251957]: ERROR   18:05:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:05:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:05:49 compute-1 openstack_network_exporter[251957]: ERROR   18:05:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:05:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:05:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:49 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:50 compute-1 sshd-session[263598]: Failed password for invalid user guest from 192.210.160.141 port 48870 ssh2
Sep 30 18:05:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:50 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560001090 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:50.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:50.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:51 compute-1 ceph-mon[75484]: pgmap v807: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:05:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:51 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:52 compute-1 sshd-session[263598]: Connection closed by invalid user guest 192.210.160.141 port 48870 [preauth]
Sep 30 18:05:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:52 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:52.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:52.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:53 compute-1 ceph-mon[75484]: pgmap v808: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:05:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:05:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:53 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:54 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560001090 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:05:54.337 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:05:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:05:54.339 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:05:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:05:54.339 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:05:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:54.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:54.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:55 compute-1 ceph-mon[75484]: pgmap v809: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:55 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:56 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:56 compute-1 ceph-mon[75484]: pgmap v810: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:05:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:56.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:05:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:56.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:57 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2696418884' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:05:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2696418884' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:05:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:58 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560001090 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:05:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:58 compute-1 ceph-mon[75484]: pgmap v811: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:05:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:05:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:05:58.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:05:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:05:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:05:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:05:58.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:05:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:05:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:05:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:05:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:05:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:05:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:00 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:00.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:00.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:00 compute-1 ceph-mon[75484]: pgmap v812: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:06:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:01 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:02 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560001090 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:02 compute-1 podman[263704]: 2025-09-30 18:06:02.586732804 +0000 UTC m=+0.126699288 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Sep 30 18:06:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:02.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:02 compute-1 sudo[263732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:06:02 compute-1 sudo[263732]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:06:02 compute-1 sudo[263732]: pam_unix(sudo:session): session closed for user root
Sep 30 18:06:02 compute-1 sudo[263757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:06:02 compute-1 sudo[263757]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:06:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:02.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:02 compute-1 ceph-mon[75484]: pgmap v813: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:03 compute-1 sudo[263757]: pam_unix(sudo:session): session closed for user root
Sep 30 18:06:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:06:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:03 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa560001090 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:06:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:06:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:06:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:06:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:06:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:06:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:06:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:04 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c0026e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:04 compute-1 podman[263814]: 2025-09-30 18:06:04.523798881 +0000 UTC m=+0.069685376 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:06:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:04.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:04.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:04 compute-1 ceph-mon[75484]: pgmap v814: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:05 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:05 compute-1 podman[249638]: time="2025-09-30T18:06:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:06:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:06:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:06:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:06:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8764 "" "Go-http-client/1.1"
Sep 30 18:06:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:06 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:06.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:06.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:06 compute-1 ceph-mon[75484]: pgmap v815: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:07 compute-1 sudo[263841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:06:07 compute-1 sudo[263841]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:06:07 compute-1 sudo[263841]: pam_unix(sudo:session): session closed for user root
Sep 30 18:06:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:07 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:06:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:08 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:06:08 compute-1 sudo[263869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:06:08 compute-1 sudo[263869]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:06:08 compute-1 sudo[263869]: pam_unix(sudo:session): session closed for user root
Sep 30 18:06:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:08.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:08.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:09 compute-1 ceph-mon[75484]: pgmap v816: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:09 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:06:09 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:06:09 compute-1 sshd-session[263867]: Invalid user ldap from 84.51.43.58 port 53446
Sep 30 18:06:09 compute-1 sshd-session[263867]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:06:09 compute-1 sshd-session[263867]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:06:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:09 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:10 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c004550 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:10 compute-1 podman[263897]: 2025-09-30 18:06:10.56153191 +0000 UTC m=+0.094183539 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 18:06:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:10.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:10.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:11 compute-1 ceph-mon[75484]: pgmap v817: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:06:11 compute-1 sshd-session[263867]: Failed password for invalid user ldap from 84.51.43.58 port 53446 ssh2
Sep 30 18:06:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:11 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900096e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:12 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558003c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:12.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:12.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:13 compute-1 unix_chkpwd[263923]: password check failed for user (root)
Sep 30 18:06:13 compute-1 sshd-session[263918]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:06:13 compute-1 ceph-mon[75484]: pgmap v818: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:06:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:13 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c001c60 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:13 compute-1 sshd-session[263867]: Received disconnect from 84.51.43.58 port 53446:11: Bye Bye [preauth]
Sep 30 18:06:13 compute-1 sshd-session[263867]: Disconnected from invalid user ldap 84.51.43.58 port 53446 [preauth]
Sep 30 18:06:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:14 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c004550 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:14.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:14.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:15 compute-1 ceph-mon[75484]: pgmap v819: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:15 compute-1 sshd-session[263918]: Failed password for root from 192.210.160.141 port 51620 ssh2
Sep 30 18:06:15 compute-1 unix_chkpwd[263928]: password check failed for user (root)
Sep 30 18:06:15 compute-1 sshd-session[263925]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:06:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:15 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900096e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:16 compute-1 nova_compute[238822]: 2025-09-30 18:06:16.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:06:16 compute-1 sshd-session[263918]: Connection closed by authenticating user root 192.210.160.141 port 51620 [preauth]
Sep 30 18:06:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:16 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558003c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:16.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:16.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:17 compute-1 sshd-session[263925]: Failed password for root from 175.126.165.170 port 48710 ssh2
Sep 30 18:06:17 compute-1 nova_compute[238822]: 2025-09-30 18:06:17.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:06:17 compute-1 nova_compute[238822]: 2025-09-30 18:06:17.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:06:17 compute-1 nova_compute[238822]: 2025-09-30 18:06:17.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:06:17 compute-1 ceph-mon[75484]: pgmap v820: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:17 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c001c60 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:17 compute-1 podman[263931]: 2025-09-30 18:06:17.567124653 +0000 UTC m=+0.101389063 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:06:17 compute-1 podman[263933]: 2025-09-30 18:06:17.576223899 +0000 UTC m=+0.097530269 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 18:06:17 compute-1 podman[263932]: 2025-09-30 18:06:17.588553163 +0000 UTC m=+0.113743198 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, version=9.6, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 18:06:17 compute-1 sshd-session[263992]: Invalid user dani from 107.172.146.104 port 58088
Sep 30 18:06:17 compute-1 sshd-session[263992]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:06:17 compute-1 sshd-session[263992]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 18:06:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:18 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c004550 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:18 compute-1 sshd-session[263925]: Received disconnect from 175.126.165.170 port 48710:11: Bye Bye [preauth]
Sep 30 18:06:18 compute-1 sshd-session[263925]: Disconnected from authenticating user root 175.126.165.170 port 48710 [preauth]
Sep 30 18:06:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:06:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:18.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:18.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:19 compute-1 nova_compute[238822]: 2025-09-30 18:06:19.054 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:06:19 compute-1 nova_compute[238822]: 2025-09-30 18:06:19.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:06:19 compute-1 ceph-mon[75484]: pgmap v821: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:19 compute-1 openstack_network_exporter[251957]: ERROR   18:06:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:06:19 compute-1 openstack_network_exporter[251957]: ERROR   18:06:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:06:19 compute-1 openstack_network_exporter[251957]: ERROR   18:06:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:06:19 compute-1 openstack_network_exporter[251957]: ERROR   18:06:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:06:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:06:19 compute-1 openstack_network_exporter[251957]: ERROR   18:06:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:06:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:06:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:19 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c001c60 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:19 compute-1 sshd-session[263992]: Failed password for invalid user dani from 107.172.146.104 port 58088 ssh2
Sep 30 18:06:19 compute-1 sshd-session[263992]: Received disconnect from 107.172.146.104 port 58088:11: Bye Bye [preauth]
Sep 30 18:06:19 compute-1 sshd-session[263992]: Disconnected from invalid user dani 107.172.146.104 port 58088 [preauth]
Sep 30 18:06:20 compute-1 nova_compute[238822]: 2025-09-30 18:06:20.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:06:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4195776977' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:06:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:20 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900096e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:20.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:20.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:21 compute-1 nova_compute[238822]: 2025-09-30 18:06:21.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:06:21 compute-1 ceph-mon[75484]: pgmap v822: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:06:21 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1238874090' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:06:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:21 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558003c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:21 compute-1 nova_compute[238822]: 2025-09-30 18:06:21.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:06:21 compute-1 nova_compute[238822]: 2025-09-30 18:06:21.575 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:06:21 compute-1 nova_compute[238822]: 2025-09-30 18:06:21.575 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:06:21 compute-1 nova_compute[238822]: 2025-09-30 18:06:21.576 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:06:21 compute-1 nova_compute[238822]: 2025-09-30 18:06:21.576 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:06:22 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:06:22 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1712915588' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:06:22 compute-1 nova_compute[238822]: 2025-09-30 18:06:22.095 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:06:22 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1712915588' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:06:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:22 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c004550 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:22 compute-1 nova_compute[238822]: 2025-09-30 18:06:22.276 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:06:22 compute-1 nova_compute[238822]: 2025-09-30 18:06:22.277 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:06:22 compute-1 unix_chkpwd[264024]: password check failed for user (root)
Sep 30 18:06:22 compute-1 sshd-session[263998]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65  user=root
Sep 30 18:06:22 compute-1 nova_compute[238822]: 2025-09-30 18:06:22.301 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:06:22 compute-1 nova_compute[238822]: 2025-09-30 18:06:22.302 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5052MB free_disk=39.9921875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:06:22 compute-1 nova_compute[238822]: 2025-09-30 18:06:22.302 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:06:22 compute-1 nova_compute[238822]: 2025-09-30 18:06:22.302 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:06:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:22.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:22.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:23 compute-1 ceph-mon[75484]: pgmap v823: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:06:23 compute-1 nova_compute[238822]: 2025-09-30 18:06:23.349 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:06:23 compute-1 nova_compute[238822]: 2025-09-30 18:06:23.350 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:06:22 up  3:43,  0 user,  load average: 0.12, 0.88, 1.20\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:06:23 compute-1 nova_compute[238822]: 2025-09-30 18:06:23.365 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:06:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:06:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:23 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c004550 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:06:23 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2948922853' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:06:23 compute-1 nova_compute[238822]: 2025-09-30 18:06:23.839 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:06:23 compute-1 nova_compute[238822]: 2025-09-30 18:06:23.847 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:06:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:24 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c001c60 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:24 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2948922853' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:06:24 compute-1 ceph-mon[75484]: pgmap v824: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:24 compute-1 nova_compute[238822]: 2025-09-30 18:06:24.357 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:06:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:24 compute-1 sshd-session[263998]: Failed password for root from 194.107.115.65 port 46352 ssh2
Sep 30 18:06:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:24.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:24 compute-1 nova_compute[238822]: 2025-09-30 18:06:24.871 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:06:24 compute-1 nova_compute[238822]: 2025-09-30 18:06:24.872 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.570s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:06:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:24.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:25 compute-1 sshd-session[263998]: Received disconnect from 194.107.115.65 port 46352:11: Bye Bye [preauth]
Sep 30 18:06:25 compute-1 sshd-session[263998]: Disconnected from authenticating user root 194.107.115.65 port 46352 [preauth]
Sep 30 18:06:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:25 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c001c60 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:25 compute-1 nova_compute[238822]: 2025-09-30 18:06:25.873 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:06:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:26 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c004550 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:26 compute-1 sshd-session[264050]: Invalid user seekcy from 216.10.242.161 port 60772
Sep 30 18:06:26 compute-1 sshd-session[264050]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:06:26 compute-1 sshd-session[264050]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:06:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:26.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:26.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:26 compute-1 ceph-mon[75484]: pgmap v825: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:27 compute-1 sudo[264054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:06:27 compute-1 sudo[264054]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:06:27 compute-1 sudo[264054]: pam_unix(sudo:session): session closed for user root
Sep 30 18:06:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:27 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:27.948487) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255587948559, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1815, "num_deletes": 251, "total_data_size": 4344657, "memory_usage": 4403168, "flush_reason": "Manual Compaction"}
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255587967516, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 2819351, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24270, "largest_seqno": 26080, "table_properties": {"data_size": 2811986, "index_size": 4308, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15738, "raw_average_key_size": 20, "raw_value_size": 2796952, "raw_average_value_size": 3572, "num_data_blocks": 193, "num_entries": 783, "num_filter_entries": 783, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759255436, "oldest_key_time": 1759255436, "file_creation_time": 1759255587, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 19168 microseconds, and 10886 cpu microseconds.
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:27.967597) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 2819351 bytes OK
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:27.967684) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:27.969697) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:27.969722) EVENT_LOG_v1 {"time_micros": 1759255587969714, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:27.969749) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 4336405, prev total WAL file size 4336405, number of live WAL files 2.
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:27.971825) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(2753KB)], [48(10226KB)]
Sep 30 18:06:27 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255587971906, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 13291786, "oldest_snapshot_seqno": -1}
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5144 keys, 11231409 bytes, temperature: kUnknown
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255588062015, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 11231409, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11197326, "index_size": 20149, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12869, "raw_key_size": 131636, "raw_average_key_size": 25, "raw_value_size": 11104179, "raw_average_value_size": 2158, "num_data_blocks": 824, "num_entries": 5144, "num_filter_entries": 5144, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759255587, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:28.062382) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 11231409 bytes
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:28.063778) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 147.3 rd, 124.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 10.0 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(8.7) write-amplify(4.0) OK, records in: 5664, records dropped: 520 output_compression: NoCompression
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:28.063811) EVENT_LOG_v1 {"time_micros": 1759255588063795, "job": 28, "event": "compaction_finished", "compaction_time_micros": 90218, "compaction_time_cpu_micros": 43531, "output_level": 6, "num_output_files": 1, "total_output_size": 11231409, "num_input_records": 5664, "num_output_records": 5144, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255588064983, "job": 28, "event": "table_file_deletion", "file_number": 50}
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255588068718, "job": 28, "event": "table_file_deletion", "file_number": 48}
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:27.971689) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:28.068811) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:28.068820) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:28.068824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:28.068827) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:06:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:06:28.068830) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:06:28 compute-1 sshd-session[264050]: Failed password for invalid user seekcy from 216.10.242.161 port 60772 ssh2
Sep 30 18:06:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:28 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:28 compute-1 sshd-session[264050]: Received disconnect from 216.10.242.161 port 60772:11: Bye Bye [preauth]
Sep 30 18:06:28 compute-1 sshd-session[264050]: Disconnected from invalid user seekcy 216.10.242.161 port 60772 [preauth]
Sep 30 18:06:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:06:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:28.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:28.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:28 compute-1 ceph-mon[75484]: pgmap v826: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:29 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:30 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c004550 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:30.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:30.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:31 compute-1 ceph-mon[75484]: pgmap v827: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:06:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:31 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:32 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:32.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:32.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:33 compute-1 ceph-mon[75484]: pgmap v828: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:06:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:33 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:33 compute-1 podman[264087]: 2025-09-30 18:06:33.609524523 +0000 UTC m=+0.145963479 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:06:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:34 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa588001110 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:34.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:34.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:35 compute-1 ceph-mon[75484]: pgmap v829: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:35 compute-1 podman[264117]: 2025-09-30 18:06:35.535739017 +0000 UTC m=+0.070889408 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:06:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:35 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:35 compute-1 podman[249638]: time="2025-09-30T18:06:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:06:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:06:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:06:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:06:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8775 "" "Go-http-client/1.1"
Sep 30 18:06:36 compute-1 unix_chkpwd[264143]: password check failed for user (root)
Sep 30 18:06:36 compute-1 sshd-session[264114]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:06:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:36 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:36.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:36.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:37 compute-1 ceph-mon[75484]: pgmap v830: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2465583842' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:06:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2465583842' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:06:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:37 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558003c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:37 compute-1 sshd-session[264114]: Failed password for root from 192.210.160.141 port 47924 ssh2
Sep 30 18:06:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:06:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:38 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa588001110 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:06:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:38.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:38.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:39 compute-1 ceph-mon[75484]: pgmap v831: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:39 compute-1 sshd-session[264114]: Connection closed by authenticating user root 192.210.160.141 port 47924 [preauth]
Sep 30 18:06:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:39 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:40 compute-1 ceph-mon[75484]: pgmap v832: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:06:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:40 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:40.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:40.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:41 compute-1 podman[264151]: 2025-09-30 18:06:41.562767519 +0000 UTC m=+0.097589891 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4)
Sep 30 18:06:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:41 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558003c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:42 compute-1 unix_chkpwd[264171]: password check failed for user (root)
Sep 30 18:06:42 compute-1 sshd-session[264149]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110  user=root
Sep 30 18:06:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:42 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5880022a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:42.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:42.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:42 compute-1 ceph-mon[75484]: pgmap v833: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:06:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:43 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:43 compute-1 sshd-session[264149]: Failed password for root from 14.225.167.110 port 45814 ssh2
Sep 30 18:06:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:44 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:44.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:44.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:44 compute-1 ceph-mon[75484]: pgmap v834: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:45 compute-1 sshd-session[264149]: Received disconnect from 14.225.167.110 port 45814:11: Bye Bye [preauth]
Sep 30 18:06:45 compute-1 sshd-session[264149]: Disconnected from authenticating user root 14.225.167.110 port 45814 [preauth]
Sep 30 18:06:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:45 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558003c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:46 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5880022a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:46.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:46.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:46 compute-1 ceph-mon[75484]: pgmap v835: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:47 compute-1 sudo[264177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:06:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:47 compute-1 sudo[264177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:06:47 compute-1 sudo[264177]: pam_unix(sudo:session): session closed for user root
Sep 30 18:06:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:48 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:06:48 compute-1 podman[264204]: 2025-09-30 18:06:48.521915147 +0000 UTC m=+0.057086065 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 18:06:48 compute-1 podman[264205]: 2025-09-30 18:06:48.532269197 +0000 UTC m=+0.061191896 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930)
Sep 30 18:06:48 compute-1 podman[264203]: 2025-09-30 18:06:48.537873039 +0000 UTC m=+0.069735747 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 18:06:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:48.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:48.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:48 compute-1 ceph-mon[75484]: pgmap v836: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:06:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/180649 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 18:06:49 compute-1 openstack_network_exporter[251957]: ERROR   18:06:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:06:49 compute-1 openstack_network_exporter[251957]: ERROR   18:06:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:06:49 compute-1 openstack_network_exporter[251957]: ERROR   18:06:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:06:49 compute-1 openstack_network_exporter[251957]: ERROR   18:06:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:06:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:06:49 compute-1 openstack_network_exporter[251957]: ERROR   18:06:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:06:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:06:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:49 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558003c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:50 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa588002fb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:50.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:50.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:50 compute-1 ceph-mon[75484]: pgmap v837: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:06:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:51 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:52 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:52.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:52.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:53 compute-1 ceph-mon[75484]: pgmap v838: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:06:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:06:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:06:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:53 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558003c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:54 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa588002fb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:06:54.341 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:06:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:06:54.341 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:06:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:06:54.341 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:06:54 compute-1 ceph-mon[75484]: pgmap v839: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:06:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:54.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:54.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:55 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:56 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:06:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:56.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:06:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:56.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:57 compute-1 ceph-mon[75484]: pgmap v840: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:06:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:57 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558003c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3339462164' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:06:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3339462164' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:06:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:58 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 18:06:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:58 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa588003cc0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:06:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:06:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:06:58.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:06:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:06:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:06:58.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:06:59 compute-1 unix_chkpwd[264274]: password check failed for user (root)
Sep 30 18:06:59 compute-1 sshd-session[264270]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:06:59 compute-1 ceph-mon[75484]: pgmap v841: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:06:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:06:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:06:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:06:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:06:59 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:00 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:07:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:00.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:07:00 compute-1 sshd-session[264270]: Failed password for root from 192.210.160.141 port 53494 ssh2
Sep 30 18:07:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:00.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:01 compute-1 ceph-mon[75484]: pgmap v842: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:07:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:01 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 18:07:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:01 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 18:07:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:01 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558003c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:02 compute-1 sshd-session[264270]: Connection closed by authenticating user root 192.210.160.141 port 53494 [preauth]
Sep 30 18:07:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:02 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa588003cc0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:02 compute-1 ceph-mon[75484]: pgmap v843: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:07:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:02.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:02.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:07:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:03 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa588003cc0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:04 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 18:07:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:04 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043a0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:04 compute-1 podman[264282]: 2025-09-30 18:07:04.644109001 +0000 UTC m=+0.179420061 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 18:07:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:04.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:04 compute-1 ceph-mon[75484]: pgmap v844: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:07:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:04.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:05 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:05 compute-1 podman[249638]: time="2025-09-30T18:07:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:07:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:07:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:07:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:07:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8770 "" "Go-http-client/1.1"
Sep 30 18:07:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:06 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:06 compute-1 podman[264311]: 2025-09-30 18:07:06.550975759 +0000 UTC m=+0.088896725 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:07:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:06.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:06 compute-1 ceph-mon[75484]: pgmap v845: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:07:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:06.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:07 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa588003cc0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:07 compute-1 sudo[264337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:07:07 compute-1 sudo[264337]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:07:07 compute-1 sudo[264337]: pam_unix(sudo:session): session closed for user root
Sep 30 18:07:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:07:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:08 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:07:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:08 compute-1 sudo[264363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:07:08 compute-1 sudo[264363]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:07:08 compute-1 sudo[264363]: pam_unix(sudo:session): session closed for user root
Sep 30 18:07:08 compute-1 sudo[264389]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:07:08 compute-1 sudo[264389]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:07:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:08.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:08.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:09 compute-1 ceph-mon[75484]: pgmap v846: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:07:09 compute-1 sudo[264389]: pam_unix(sudo:session): session closed for user root
Sep 30 18:07:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:09 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:07:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:07:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:07:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:07:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:07:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:07:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:07:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:10 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:10.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:10.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:11 compute-1 ceph-mon[75484]: pgmap v847: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 1.7 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 18:07:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/180711 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 18:07:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:11 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa588003cc0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:11 compute-1 sshd-session[264448]: Invalid user nodeuser from 107.172.146.104 port 56916
Sep 30 18:07:11 compute-1 sshd-session[264448]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:07:11 compute-1 sshd-session[264448]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 18:07:12 compute-1 podman[264450]: 2025-09-30 18:07:12.06798121 +0000 UTC m=+0.090674212 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Sep 30 18:07:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:12 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa588003cc0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:12.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:12.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:13 compute-1 ceph-mon[75484]: pgmap v848: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:07:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:07:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:13 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:13 compute-1 sshd-session[264448]: Failed password for invalid user nodeuser from 107.172.146.104 port 56916 ssh2
Sep 30 18:07:14 compute-1 sshd-session[264448]: Received disconnect from 107.172.146.104 port 56916:11: Bye Bye [preauth]
Sep 30 18:07:14 compute-1 sshd-session[264448]: Disconnected from invalid user nodeuser 107.172.146.104 port 56916 [preauth]
Sep 30 18:07:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:14 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:14 compute-1 sudo[264472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:07:14 compute-1 sudo[264472]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:07:14 compute-1 sudo[264472]: pam_unix(sudo:session): session closed for user root
Sep 30 18:07:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:14.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:14.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:15 compute-1 ceph-mon[75484]: pgmap v849: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:07:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:07:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:07:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:15 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:16 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa588003cc0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:16.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:16.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:17 compute-1 nova_compute[238822]: 2025-09-30 18:07:17.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:07:17 compute-1 ceph-mon[75484]: pgmap v850: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:07:17 compute-1 nova_compute[238822]: 2025-09-30 18:07:17.568 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:07:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:17 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:18 compute-1 nova_compute[238822]: 2025-09-30 18:07:18.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:07:18 compute-1 nova_compute[238822]: 2025-09-30 18:07:18.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:07:18 compute-1 nova_compute[238822]: 2025-09-30 18:07:18.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:07:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:18 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:07:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:18.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:18.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:19 compute-1 ceph-mon[75484]: pgmap v851: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:07:19 compute-1 openstack_network_exporter[251957]: ERROR   18:07:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:07:19 compute-1 openstack_network_exporter[251957]: ERROR   18:07:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:07:19 compute-1 openstack_network_exporter[251957]: ERROR   18:07:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:07:19 compute-1 openstack_network_exporter[251957]: ERROR   18:07:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:07:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:07:19 compute-1 openstack_network_exporter[251957]: ERROR   18:07:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:07:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:07:19 compute-1 podman[264503]: 2025-09-30 18:07:19.547963198 +0000 UTC m=+0.081488234 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, name=ubi9-minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Sep 30 18:07:19 compute-1 podman[264502]: 2025-09-30 18:07:19.566041247 +0000 UTC m=+0.102618075 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid)
Sep 30 18:07:19 compute-1 podman[264504]: 2025-09-30 18:07:19.565998266 +0000 UTC m=+0.095299448 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 18:07:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:19 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:20 compute-1 nova_compute[238822]: 2025-09-30 18:07:20.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:07:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1560331497' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:07:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:20 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa588003cc0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:20.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:20.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:21 compute-1 nova_compute[238822]: 2025-09-30 18:07:21.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:07:21 compute-1 ceph-mon[75484]: pgmap v852: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:07:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:21 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:22 compute-1 nova_compute[238822]: 2025-09-30 18:07:22.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:07:22 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3632844503' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:07:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:22 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:22 compute-1 sshd-session[264563]: Invalid user user from 192.210.160.141 port 54614
Sep 30 18:07:22 compute-1 sshd-session[264563]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:07:22 compute-1 sshd-session[264563]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:07:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:22.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:22 compute-1 sshd-session[264565]: Invalid user mamy from 84.51.43.58 port 34796
Sep 30 18:07:22 compute-1 sshd-session[264565]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:07:22 compute-1 sshd-session[264565]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:07:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:23.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:23 compute-1 nova_compute[238822]: 2025-09-30 18:07:23.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:07:23 compute-1 ceph-mon[75484]: pgmap v853: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:07:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:07:23 compute-1 nova_compute[238822]: 2025-09-30 18:07:23.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:07:23 compute-1 nova_compute[238822]: 2025-09-30 18:07:23.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:07:23 compute-1 nova_compute[238822]: 2025-09-30 18:07:23.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:07:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:23 compute-1 nova_compute[238822]: 2025-09-30 18:07:23.573 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:07:23 compute-1 nova_compute[238822]: 2025-09-30 18:07:23.573 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:07:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:23 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:23 compute-1 unix_chkpwd[264591]: password check failed for user (root)
Sep 30 18:07:23 compute-1 sshd-session[264568]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:07:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:07:24 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2960321938' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:07:24 compute-1 nova_compute[238822]: 2025-09-30 18:07:24.082 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:07:24 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2960321938' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:07:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:24 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:24 compute-1 nova_compute[238822]: 2025-09-30 18:07:24.303 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:07:24 compute-1 nova_compute[238822]: 2025-09-30 18:07:24.305 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:07:24 compute-1 nova_compute[238822]: 2025-09-30 18:07:24.337 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:07:24 compute-1 nova_compute[238822]: 2025-09-30 18:07:24.338 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5062MB free_disk=39.9921875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:07:24 compute-1 nova_compute[238822]: 2025-09-30 18:07:24.338 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:07:24 compute-1 nova_compute[238822]: 2025-09-30 18:07:24.339 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:07:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:24.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:24 compute-1 sshd-session[264563]: Failed password for invalid user user from 192.210.160.141 port 54614 ssh2
Sep 30 18:07:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:25.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:25 compute-1 sshd-session[264565]: Failed password for invalid user mamy from 84.51.43.58 port 34796 ssh2
Sep 30 18:07:25 compute-1 ceph-mon[75484]: pgmap v854: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:25 compute-1 sshd-session[264565]: Received disconnect from 84.51.43.58 port 34796:11: Bye Bye [preauth]
Sep 30 18:07:25 compute-1 sshd-session[264565]: Disconnected from invalid user mamy 84.51.43.58 port 34796 [preauth]
Sep 30 18:07:25 compute-1 nova_compute[238822]: 2025-09-30 18:07:25.400 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:07:25 compute-1 nova_compute[238822]: 2025-09-30 18:07:25.400 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:07:24 up  3:44,  0 user,  load average: 0.04, 0.72, 1.13\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:07:25 compute-1 nova_compute[238822]: 2025-09-30 18:07:25.422 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:07:25 compute-1 sshd-session[264568]: Failed password for root from 175.126.165.170 port 39584 ssh2
Sep 30 18:07:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:25 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa588003cc0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:25 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:07:25 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1941274574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:07:25 compute-1 nova_compute[238822]: 2025-09-30 18:07:25.943 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:07:25 compute-1 nova_compute[238822]: 2025-09-30 18:07:25.950 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:07:26 compute-1 ceph-mon[75484]: pgmap v855: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:26 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1941274574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:07:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:26 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:26 compute-1 nova_compute[238822]: 2025-09-30 18:07:26.468 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:07:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:26 compute-1 sshd-session[264568]: Received disconnect from 175.126.165.170 port 39584:11: Bye Bye [preauth]
Sep 30 18:07:26 compute-1 sshd-session[264568]: Disconnected from authenticating user root 175.126.165.170 port 39584 [preauth]
Sep 30 18:07:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:26.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:26 compute-1 nova_compute[238822]: 2025-09-30 18:07:26.978 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:07:26 compute-1 nova_compute[238822]: 2025-09-30 18:07:26.979 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.640s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:07:26 compute-1 sshd-session[264563]: Connection closed by invalid user user 192.210.160.141 port 54614 [preauth]
Sep 30 18:07:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:27.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:27 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:27 compute-1 sudo[264621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:07:27 compute-1 sudo[264621]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:07:27 compute-1 sudo[264621]: pam_unix(sudo:session): session closed for user root
Sep 30 18:07:27 compute-1 nova_compute[238822]: 2025-09-30 18:07:27.980 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:07:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:28 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:07:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:28.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:28 compute-1 ceph-mon[75484]: pgmap v856: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:07:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:29.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:07:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:29 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa58800bc90 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:29 compute-1 sshd-session[264648]: Invalid user student3 from 194.107.115.65 port 14322
Sep 30 18:07:29 compute-1 sshd-session[264648]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:07:29 compute-1 sshd-session[264648]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 18:07:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:30 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:30.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:30 compute-1 ceph-mon[75484]: pgmap v857: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:07:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:31.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:31 compute-1 unix_chkpwd[264657]: password check failed for user (root)
Sep 30 18:07:31 compute-1 sshd-session[264653]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161  user=root
Sep 30 18:07:31 compute-1 sshd-session[264650]: Invalid user developer from 14.103.129.43 port 54044
Sep 30 18:07:31 compute-1 sshd-session[264650]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:07:31 compute-1 sshd-session[264650]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.103.129.43
Sep 30 18:07:31 compute-1 sshd-session[264648]: Failed password for invalid user student3 from 194.107.115.65 port 14322 ssh2
Sep 30 18:07:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:31 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:31 compute-1 sshd-session[264648]: Received disconnect from 194.107.115.65 port 14322:11: Bye Bye [preauth]
Sep 30 18:07:31 compute-1 sshd-session[264648]: Disconnected from invalid user student3 194.107.115.65 port 14322 [preauth]
Sep 30 18:07:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:32 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa57c002c70 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:32.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:33 compute-1 ceph-mon[75484]: pgmap v858: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:33.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:33 compute-1 sshd-session[264653]: Failed password for root from 216.10.242.161 port 37748 ssh2
Sep 30 18:07:33 compute-1 sshd-session[264650]: Failed password for invalid user developer from 14.103.129.43 port 54044 ssh2
Sep 30 18:07:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:07:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:33 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa58800bc90 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:34 compute-1 sshd-session[264653]: Received disconnect from 216.10.242.161 port 37748:11: Bye Bye [preauth]
Sep 30 18:07:34 compute-1 sshd-session[264653]: Disconnected from authenticating user root 216.10.242.161 port 37748 [preauth]
Sep 30 18:07:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:34 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5640043c0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:34.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:35.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:35 compute-1 ceph-mon[75484]: pgmap v859: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:35 compute-1 podman[264663]: 2025-09-30 18:07:35.581667229 +0000 UTC m=+0.127323593 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:07:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:35 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558002af0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:35 compute-1 podman[249638]: time="2025-09-30T18:07:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:07:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:07:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:07:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:07:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8776 "" "Go-http-client/1.1"
Sep 30 18:07:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:36 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:36.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:37.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:37 compute-1 ceph-mon[75484]: pgmap v860: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3205241886' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:07:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3205241886' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:07:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:37.443 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:07:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:37.444 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:07:37 compute-1 podman[264692]: 2025-09-30 18:07:37.55744885 +0000 UTC m=+0.093742196 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:07:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:37 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa58800bc90 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:07:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:38 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c001c60 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:07:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:38.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:38.967 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:82:19 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8540cf85-00d6-4dc4-a235-89cd0b224d26', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8540cf85-00d6-4dc4-a235-89cd0b224d26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e5ecd2ee32c3491198baea5df005e7e5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63eb3a66-bb05-42df-9005-47612d7f10be, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=049e58b6-619a-494f-be8e-cd60a8549d92) old=Port_Binding(mac=['fa:16:3e:5b:82:19'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8540cf85-00d6-4dc4-a235-89cd0b224d26', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8540cf85-00d6-4dc4-a235-89cd0b224d26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e5ecd2ee32c3491198baea5df005e7e5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:07:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:38.968 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 049e58b6-619a-494f-be8e-cd60a8549d92 in datapath 8540cf85-00d6-4dc4-a235-89cd0b224d26 updated
Sep 30 18:07:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:38.969 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8540cf85-00d6-4dc4-a235-89cd0b224d26, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:07:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:38.970 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0b8ccdc1-171a-4c8f-b066-a98e9bdbc9f2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:07:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:39.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:39 compute-1 ceph-mon[75484]: pgmap v861: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:39 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558002af0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:40 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:40.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:41.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:41 compute-1 sshd-session[264656]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:07:41 compute-1 sshd-session[264656]: banner exchange: Connection from 110.42.70.108 port 42512: Connection timed out
Sep 30 18:07:41 compute-1 ceph-mon[75484]: pgmap v862: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:07:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:41 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa58800bc90 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:42 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c001c60 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:42 compute-1 podman[264723]: 2025-09-30 18:07:42.550131739 +0000 UTC m=+0.087954909 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Sep 30 18:07:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:42.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:43.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:43 compute-1 ceph-mon[75484]: pgmap v863: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:07:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:43 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558002af0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:44 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:44 compute-1 ceph-mon[75484]: pgmap v864: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:44.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:45.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:45 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa58800bc90 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:46 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c001c60 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:46.446 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:07:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:46.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:46 compute-1 ceph-mon[75484]: pgmap v865: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:47.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:47 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558002af0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:47 compute-1 sudo[264749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:07:47 compute-1 sudo[264749]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:07:47 compute-1 sudo[264749]: pam_unix(sudo:session): session closed for user root
Sep 30 18:07:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:48 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5900011e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:48 compute-1 sshd-session[264690]: Connection closed by 101.126.25.120 port 47294 [preauth]
Sep 30 18:07:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:48.455 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:ea:a7 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-5f482ad5-36f0-4d8c-b057-4e11342b7a56', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5f482ad5-36f0-4d8c-b057-4e11342b7a56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20644a86c59b4259a037c783fe6fff20', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80b03d95-2212-48b8-9026-d1d054714f2e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7811f356-f3dd-4a3d-aa8e-55d122820071) old=Port_Binding(mac=['fa:16:3e:f9:ea:a7'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-5f482ad5-36f0-4d8c-b057-4e11342b7a56', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5f482ad5-36f0-4d8c-b057-4e11342b7a56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20644a86c59b4259a037c783fe6fff20', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:07:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:48.456 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7811f356-f3dd-4a3d-aa8e-55d122820071 in datapath 5f482ad5-36f0-4d8c-b057-4e11342b7a56 updated
Sep 30 18:07:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:48.458 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5f482ad5-36f0-4d8c-b057-4e11342b7a56, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:07:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:48.459 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f2d28839-93d7-4af0-abd1-d9fdc666da86]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:07:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:07:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:48.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:49 compute-1 ceph-mon[75484]: pgmap v866: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:49.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:49 compute-1 unix_chkpwd[264777]: password check failed for user (root)
Sep 30 18:07:49 compute-1 sshd-session[264747]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:07:49 compute-1 openstack_network_exporter[251957]: ERROR   18:07:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:07:49 compute-1 openstack_network_exporter[251957]: ERROR   18:07:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:07:49 compute-1 openstack_network_exporter[251957]: ERROR   18:07:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:07:49 compute-1 openstack_network_exporter[251957]: ERROR   18:07:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:07:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:07:49 compute-1 openstack_network_exporter[251957]: ERROR   18:07:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:07:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:07:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:49 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa58800bc90 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:50 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c003840 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:50 compute-1 podman[264779]: 2025-09-30 18:07:50.539147407 +0000 UTC m=+0.084810873 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 18:07:50 compute-1 podman[264780]: 2025-09-30 18:07:50.543357441 +0000 UTC m=+0.089767838 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:07:50 compute-1 podman[264781]: 2025-09-30 18:07:50.543842904 +0000 UTC m=+0.085881132 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 18:07:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:50.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:51.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:51 compute-1 ceph-mon[75484]: pgmap v867: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:07:51 compute-1 sshd-session[264747]: Failed password for root from 192.210.160.141 port 36992 ssh2
Sep 30 18:07:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:51 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558002af0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:52 compute-1 sshd-session[264747]: Connection closed by authenticating user root 192.210.160.141 port 36992 [preauth]
Sep 30 18:07:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:52 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa558002af0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:52.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:53.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:53 compute-1 ceph-mon[75484]: pgmap v868: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:07:53 compute-1 sshd-session[264840]: Invalid user test from 14.225.167.110 port 43732
Sep 30 18:07:53 compute-1 sshd-session[264840]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:07:53 compute-1 sshd-session[264840]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:07:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:07:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:53 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa58800bc90 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:54 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c003840 fd 14 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:07:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:54.343 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:07:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:54.344 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:07:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:07:54.344 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:07:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:54.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:55.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:55 compute-1 sshd-session[264840]: Failed password for invalid user test from 14.225.167.110 port 43732 ssh2
Sep 30 18:07:55 compute-1 ceph-mon[75484]: pgmap v869: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:55 compute-1 kernel: ganesha.nfsd[264688]: segfault at 50 ip 00007fa63b94432e sp 00007fa5f97f9210 error 4 in libntirpc.so.5.8[7fa63b929000+2c000] likely on CPU 7 (core 0, socket 7)
Sep 30 18:07:55 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 18:07:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[261227]: 30/09/2025 18:07:55 : epoch 68dc1b0b : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa56c003840 fd 14 proxy ignored for local
Sep 30 18:07:55 compute-1 systemd[1]: Started Process Core Dump (PID 264847/UID 0).
Sep 30 18:07:55 compute-1 sshd-session[264840]: Received disconnect from 14.225.167.110 port 43732:11: Bye Bye [preauth]
Sep 30 18:07:55 compute-1 sshd-session[264840]: Disconnected from invalid user test 14.225.167.110 port 43732 [preauth]
Sep 30 18:07:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:07:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:56.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:07:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:57.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:57 compute-1 ceph-mon[75484]: pgmap v870: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:57 compute-1 systemd-coredump[264848]: Process 261234 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 75:
                                                    #0  0x00007fa63b94432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Sep 30 18:07:57 compute-1 sshd-session[264849]: Invalid user iptv from 103.153.190.105 port 45024
Sep 30 18:07:57 compute-1 sshd-session[264849]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:07:57 compute-1 sshd-session[264849]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:07:57 compute-1 systemd[1]: systemd-coredump@9-264847-0.service: Deactivated successfully.
Sep 30 18:07:57 compute-1 systemd[1]: systemd-coredump@9-264847-0.service: Consumed 1.524s CPU time.
Sep 30 18:07:57 compute-1 podman[264857]: 2025-09-30 18:07:57.326224883 +0000 UTC m=+0.034969097 container died 544d3e4cf8eee5d5b356b8db22e591fa2c81f7ff2fab8bc8af8a5d6d164b8de2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 18:07:57 compute-1 systemd[1]: var-lib-containers-storage-overlay-effca0547c373f0bf2290f0cdca7cf7751ca3a3e6b6e45e5e101d8918052ea5a-merged.mount: Deactivated successfully.
Sep 30 18:07:57 compute-1 podman[264857]: 2025-09-30 18:07:57.37605614 +0000 UTC m=+0.084800354 container remove 544d3e4cf8eee5d5b356b8db22e591fa2c81f7ff2fab8bc8af8a5d6d164b8de2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 18:07:57 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 18:07:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:57 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 18:07:57 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 2.596s CPU time.
Sep 30 18:07:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3752378017' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:07:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3752378017' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:07:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:07:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:07:58.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:07:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:07:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:07:59.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:07:59 compute-1 ceph-mon[75484]: pgmap v871: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:07:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:07:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:07:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:07:59 compute-1 sshd-session[264849]: Failed password for invalid user iptv from 103.153.190.105 port 45024 ssh2
Sep 30 18:07:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:07:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 5213 writes, 26K keys, 5213 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.03 MB/s
                                           Cumulative WAL: 5213 writes, 5213 syncs, 1.00 writes per sync, written: 0.06 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1469 writes, 7117 keys, 1469 commit groups, 1.0 writes per commit group, ingest: 15.86 MB, 0.03 MB/s
                                           Interval WAL: 1469 writes, 1469 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    142.2      0.24              0.13        14    0.017       0      0       0.0       0.0
                                             L6      1/0   10.71 MB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   4.0    164.6    140.3      0.99              0.47        13    0.076     63K   6724       0.0       0.0
                                            Sum      1/0   10.71 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   5.0    132.0    140.7      1.23              0.60        27    0.046     63K   6724       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.2    139.6    141.0      0.46              0.24        10    0.046     27K   2587       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   0.0    164.6    140.3      0.99              0.47        13    0.076     63K   6724       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    143.4      0.24              0.13        13    0.019       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.034, interval 0.010
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.17 GB write, 0.10 MB/s write, 0.16 GB read, 0.09 MB/s read, 1.2 seconds
                                           Interval compaction: 0.06 GB write, 0.11 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.5 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55f2aa20b350#2 capacity: 304.00 MB usage: 13.41 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000243 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(769,12.92 MB,4.24871%) FilterBlock(27,177.80 KB,0.0571151%) IndexBlock(27,328.61 KB,0.105562%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Sep 30 18:08:00 compute-1 ceph-mon[75484]: pgmap v872: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:08:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:00.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:01.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/180801 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 18:08:01 compute-1 sshd-session[264849]: Received disconnect from 103.153.190.105 port 45024:11: Bye Bye [preauth]
Sep 30 18:08:01 compute-1 sshd-session[264849]: Disconnected from invalid user iptv 103.153.190.105 port 45024 [preauth]
Sep 30 18:08:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:02.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:02 compute-1 ceph-mon[75484]: pgmap v873: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:08:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:03.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:08:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:04.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:04 compute-1 ceph-mon[75484]: pgmap v874: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:08:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:08:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:05.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:08:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:05 compute-1 podman[249638]: time="2025-09-30T18:08:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:08:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:08:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:08:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:08:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8291 "" "Go-http-client/1.1"
Sep 30 18:08:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:06 compute-1 podman[264909]: 2025-09-30 18:08:06.626031505 +0000 UTC m=+0.162571526 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:08:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:06.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:07 compute-1 ceph-mon[75484]: pgmap v875: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:08:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:08:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:07.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:08:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Sep 30 18:08:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:07 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 10.
Sep 30 18:08:07 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 18:08:07 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 2.596s CPU time.
Sep 30 18:08:07 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 18:08:07 compute-1 podman[264938]: 2025-09-30 18:08:07.809892608 +0000 UTC m=+0.095507553 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:08:07 compute-1 sshd-session[264956]: Invalid user minecraft from 107.172.146.104 port 49244
Sep 30 18:08:07 compute-1 sshd-session[264956]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:08:07 compute-1 sshd-session[264956]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 18:08:08 compute-1 sudo[265005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:08:08 compute-1 sudo[265005]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:08:08 compute-1 sudo[265005]: pam_unix(sudo:session): session closed for user root
Sep 30 18:08:08 compute-1 podman[265029]: 2025-09-30 18:08:08.039311 +0000 UTC m=+0.058580024 container create 2463e5ad98f8c5c67f06d23c6be85cc70376bc6d66b7e5541f99c15c9d29b10c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 18:08:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:08:08 compute-1 podman[265029]: 2025-09-30 18:08:08.015089115 +0000 UTC m=+0.034358219 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 18:08:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f99c85b9498bd0ec52775d42ec52261284e6e56d3291be8e81bba9b520f0ba4/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 18:08:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f99c85b9498bd0ec52775d42ec52261284e6e56d3291be8e81bba9b520f0ba4/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 18:08:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f99c85b9498bd0ec52775d42ec52261284e6e56d3291be8e81bba9b520f0ba4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 18:08:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f99c85b9498bd0ec52775d42ec52261284e6e56d3291be8e81bba9b520f0ba4/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 18:08:08 compute-1 podman[265029]: 2025-09-30 18:08:08.132804048 +0000 UTC m=+0.152073162 container init 2463e5ad98f8c5c67f06d23c6be85cc70376bc6d66b7e5541f99c15c9d29b10c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 18:08:08 compute-1 podman[265029]: 2025-09-30 18:08:08.145757898 +0000 UTC m=+0.165026962 container start 2463e5ad98f8c5c67f06d23c6be85cc70376bc6d66b7e5541f99c15c9d29b10c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Sep 30 18:08:08 compute-1 bash[265029]: 2463e5ad98f8c5c67f06d23c6be85cc70376bc6d66b7e5541f99c15c9d29b10c
Sep 30 18:08:08 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 18:08:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:08 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 18:08:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:08 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 18:08:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:08 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 18:08:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:08 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 18:08:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:08 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 18:08:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:08 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 18:08:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:08 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 18:08:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:08 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 18:08:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:08:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:08.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:09.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:09 compute-1 ceph-mon[75484]: pgmap v876: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:08:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:09 compute-1 sshd-session[264956]: Failed password for invalid user minecraft from 107.172.146.104 port 49244 ssh2
Sep 30 18:08:09 compute-1 sshd-session[264956]: Received disconnect from 107.172.146.104 port 49244:11: Bye Bye [preauth]
Sep 30 18:08:09 compute-1 sshd-session[264956]: Disconnected from invalid user minecraft 107.172.146.104 port 49244 [preauth]
Sep 30 18:08:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:10.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:11.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:11 compute-1 ceph-mon[75484]: pgmap v877: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:08:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:12.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:13.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:13 compute-1 ceph-mon[75484]: pgmap v878: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:08:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:08:13 compute-1 podman[265103]: 2025-09-30 18:08:13.559168359 +0000 UTC m=+0.093946621 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Sep 30 18:08:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:13 compute-1 unix_chkpwd[265123]: password check failed for user (root)
Sep 30 18:08:13 compute-1 sshd-session[265099]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:08:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:14 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 18:08:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:14 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 18:08:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:14 compute-1 sudo[265126]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:08:14 compute-1 sudo[265126]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:08:14 compute-1 sudo[265126]: pam_unix(sudo:session): session closed for user root
Sep 30 18:08:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:14.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:14 compute-1 sudo[265151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Sep 30 18:08:14 compute-1 sudo[265151]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:08:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:15.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:15 compute-1 ceph-mon[75484]: pgmap v879: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:08:15 compute-1 sudo[265151]: pam_unix(sudo:session): session closed for user root
Sep 30 18:08:15 compute-1 sudo[265196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:08:15 compute-1 sudo[265196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:08:15 compute-1 sudo[265196]: pam_unix(sudo:session): session closed for user root
Sep 30 18:08:15 compute-1 sudo[265221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:08:15 compute-1 sudo[265221]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:08:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:15 compute-1 sshd-session[265099]: Failed password for root from 192.210.160.141 port 44530 ssh2
Sep 30 18:08:16 compute-1 sudo[265221]: pam_unix(sudo:session): session closed for user root
Sep 30 18:08:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:08:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:08:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:08:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:08:16 compute-1 ceph-mon[75484]: pgmap v880: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:08:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:08:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:08:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:08:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:08:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:08:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:08:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:08:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:16 compute-1 sshd-session[265099]: Connection closed by authenticating user root 192.210.160.141 port 44530 [preauth]
Sep 30 18:08:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:16.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:17.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:18 compute-1 nova_compute[238822]: 2025-09-30 18:08:18.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:08:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:08:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:18.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:18 compute-1 ceph-mon[75484]: pgmap v881: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:08:19 compute-1 nova_compute[238822]: 2025-09-30 18:08:19.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:08:19 compute-1 nova_compute[238822]: 2025-09-30 18:08:19.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:08:19 compute-1 nova_compute[238822]: 2025-09-30 18:08:19.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:08:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:19.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:19 compute-1 openstack_network_exporter[251957]: ERROR   18:08:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:08:19 compute-1 openstack_network_exporter[251957]: ERROR   18:08:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:08:19 compute-1 openstack_network_exporter[251957]: ERROR   18:08:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:08:19 compute-1 openstack_network_exporter[251957]: ERROR   18:08:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:08:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:08:19 compute-1 openstack_network_exporter[251957]: ERROR   18:08:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:08:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:08:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:20 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:20.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:20 compute-1 ceph-mon[75484]: pgmap v882: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 1.6 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 18:08:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1884273996' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:08:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:21.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:21 compute-1 sudo[265293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:08:21 compute-1 sudo[265293]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:08:21 compute-1 sudo[265293]: pam_unix(sudo:session): session closed for user root
Sep 30 18:08:21 compute-1 podman[265317]: 2025-09-30 18:08:21.409729555 +0000 UTC m=+0.096993803 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Sep 30 18:08:21 compute-1 podman[265318]: 2025-09-30 18:08:21.430095376 +0000 UTC m=+0.111200987 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41)
Sep 30 18:08:21 compute-1 podman[265319]: 2025-09-30 18:08:21.44393525 +0000 UTC m=+0.120875149 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930)
Sep 30 18:08:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:21 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd524000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:22 compute-1 nova_compute[238822]: 2025-09-30 18:08:22.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:08:22 compute-1 nova_compute[238822]: 2025-09-30 18:08:22.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:08:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:08:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:08:22 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2793732369' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:08:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:22 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd518001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:08:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 8889 writes, 33K keys, 8889 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 8889 writes, 2071 syncs, 4.29 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 745 writes, 1468 keys, 745 commit groups, 1.0 writes per commit group, ingest: 0.59 MB, 0.00 MB/s
                                           Interval WAL: 745 writes, 364 syncs, 2.05 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Sep 30 18:08:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:22.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:23 compute-1 nova_compute[238822]: 2025-09-30 18:08:23.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:08:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:23.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:23 compute-1 ceph-mon[75484]: pgmap v883: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 1 op/s
Sep 30 18:08:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:08:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:08:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/180823 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 18:08:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:23 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd500000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:24 compute-1 nova_compute[238822]: 2025-09-30 18:08:24.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:08:24 compute-1 nova_compute[238822]: 2025-09-30 18:08:24.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:08:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:24 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4f8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:24 compute-1 nova_compute[238822]: 2025-09-30 18:08:24.570 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:08:24 compute-1 nova_compute[238822]: 2025-09-30 18:08:24.571 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:08:24 compute-1 nova_compute[238822]: 2025-09-30 18:08:24.571 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:08:24 compute-1 nova_compute[238822]: 2025-09-30 18:08:24.572 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:08:24 compute-1 nova_compute[238822]: 2025-09-30 18:08:24.572 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:08:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:24.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:25 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:08:25 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/766055163' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:08:25 compute-1 nova_compute[238822]: 2025-09-30 18:08:25.038 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:08:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:25.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:25 compute-1 ceph-mon[75484]: pgmap v884: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:08:25 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4169829947' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:08:25 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/766055163' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:08:25 compute-1 nova_compute[238822]: 2025-09-30 18:08:25.281 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:08:25 compute-1 nova_compute[238822]: 2025-09-30 18:08:25.283 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:08:25 compute-1 nova_compute[238822]: 2025-09-30 18:08:25.309 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:08:25 compute-1 nova_compute[238822]: 2025-09-30 18:08:25.310 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5058MB free_disk=39.9921875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:08:25 compute-1 nova_compute[238822]: 2025-09-30 18:08:25.311 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:08:25 compute-1 nova_compute[238822]: 2025-09-30 18:08:25.312 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:08:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:25 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:26 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd518001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:26 compute-1 nova_compute[238822]: 2025-09-30 18:08:26.353 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:08:26 compute-1 nova_compute[238822]: 2025-09-30 18:08:26.353 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:08:25 up  3:45,  0 user,  load average: 2.56, 1.32, 1.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:08:26 compute-1 nova_compute[238822]: 2025-09-30 18:08:26.369 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:08:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:26 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:08:26 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3746705995' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:08:26 compute-1 nova_compute[238822]: 2025-09-30 18:08:26.811 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:08:26 compute-1 nova_compute[238822]: 2025-09-30 18:08:26.817 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:08:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.003000081s ======
Sep 30 18:08:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:26.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000081s
Sep 30 18:08:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:27.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:27 compute-1 ceph-mon[75484]: pgmap v885: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:08:27 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3746705995' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:08:27 compute-1 nova_compute[238822]: 2025-09-30 18:08:27.327 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:08:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:27 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd5000016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:27 compute-1 nova_compute[238822]: 2025-09-30 18:08:27.847 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:08:27 compute-1 nova_compute[238822]: 2025-09-30 18:08:27.848 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.536s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:08:28 compute-1 sudo[265433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:08:28 compute-1 sudo[265433]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:08:28 compute-1 sudo[265433]: pam_unix(sudo:session): session closed for user root
Sep 30 18:08:28 compute-1 ceph-mon[75484]: pgmap v886: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 853 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:08:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:28 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4f80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:08:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:28.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:29.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e135 e135: 2 total, 2 up, 2 in
Sep 30 18:08:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:29 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:30 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e136 e136: 2 total, 2 up, 2 in
Sep 30 18:08:30 compute-1 ceph-mon[75484]: osdmap e135: 2 total, 2 up, 2 in
Sep 30 18:08:30 compute-1 ceph-mon[75484]: pgmap v888: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 2.0 MiB/s rd, 102 B/s wr, 8 op/s
Sep 30 18:08:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:30 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:30.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:31.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:31 compute-1 ceph-mon[75484]: osdmap e136: 2 total, 2 up, 2 in
Sep 30 18:08:31 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1913416882' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:08:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:31 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd5000016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:32 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4278792369' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:08:32 compute-1 ceph-mon[75484]: pgmap v890: 353 pgs: 353 active+clean; 41 MiB data, 148 MiB used, 40 GiB / 40 GiB avail; 2.6 MiB/s rd, 127 B/s wr, 10 op/s
Sep 30 18:08:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:32 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4f80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:32.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:33.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:08:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:33 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:34 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd518002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:34 compute-1 unix_chkpwd[265466]: password check failed for user (root)
Sep 30 18:08:34 compute-1 sshd-session[265463]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:08:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:34.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:34 compute-1 ceph-mon[75484]: pgmap v891: 353 pgs: 353 active+clean; 88 MiB data, 170 MiB used, 40 GiB / 40 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 51 op/s
Sep 30 18:08:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:35.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:35 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 e137: 2 total, 2 up, 2 in
Sep 30 18:08:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:35 compute-1 podman[249638]: time="2025-09-30T18:08:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:08:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:08:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:08:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:35 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd518002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:08:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8788 "" "Go-http-client/1.1"
Sep 30 18:08:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:36 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4f80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:36 compute-1 ceph-mon[75484]: osdmap e137: 2 total, 2 up, 2 in
Sep 30 18:08:36 compute-1 ceph-mon[75484]: pgmap v893: 353 pgs: 353 active+clean; 88 MiB data, 170 MiB used, 40 GiB / 40 GiB avail; 31 KiB/s rd, 3.2 MiB/s wr, 48 op/s
Sep 30 18:08:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/923608034' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:08:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/923608034' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:08:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:36 compute-1 sshd-session[265463]: Failed password for root from 175.126.165.170 port 46082 ssh2
Sep 30 18:08:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:08:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:36.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:08:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:37.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:37 compute-1 sshd-session[265463]: Received disconnect from 175.126.165.170 port 46082:11: Bye Bye [preauth]
Sep 30 18:08:37 compute-1 sshd-session[265463]: Disconnected from authenticating user root 175.126.165.170 port 46082 [preauth]
Sep 30 18:08:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:08:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:37 compute-1 podman[265472]: 2025-09-30 18:08:37.620954178 +0000 UTC m=+0.153188103 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 18:08:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:37 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:38 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd500002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:38 compute-1 sshd-session[265469]: Invalid user inspur from 192.210.160.141 port 49914
Sep 30 18:08:38 compute-1 sshd-session[265469]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:08:38 compute-1 sshd-session[265469]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:08:38 compute-1 podman[265501]: 2025-09-30 18:08:38.432955528 +0000 UTC m=+0.092619895 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:08:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:08:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:38 compute-1 ceph-mon[75484]: pgmap v894: 353 pgs: 353 active+clean; 88 MiB data, 170 MiB used, 40 GiB / 40 GiB avail; 26 KiB/s rd, 2.7 MiB/s wr, 40 op/s
Sep 30 18:08:38 compute-1 sshd-session[265498]: Invalid user upload from 194.107.115.65 port 38792
Sep 30 18:08:38 compute-1 sshd-session[265498]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:08:38 compute-1 sshd-session[265498]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 18:08:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:38.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:39.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:39 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd518002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:39 compute-1 sshd-session[265526]: Invalid user PlcmSpIp from 84.51.43.58 port 59814
Sep 30 18:08:39 compute-1 sshd-session[265526]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:08:39 compute-1 sshd-session[265526]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:08:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:40 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4f8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:40 compute-1 sshd-session[265469]: Failed password for invalid user inspur from 192.210.160.141 port 49914 ssh2
Sep 30 18:08:40 compute-1 unix_chkpwd[265532]: password check failed for user (root)
Sep 30 18:08:40 compute-1 sshd-session[265528]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161  user=root
Sep 30 18:08:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:40.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:40 compute-1 ceph-mon[75484]: pgmap v895: 353 pgs: 353 active+clean; 88 MiB data, 170 MiB used, 40 GiB / 40 GiB avail; 1.6 MiB/s rd, 2.2 MiB/s wr, 100 op/s
Sep 30 18:08:41 compute-1 sshd-session[265498]: Failed password for invalid user upload from 194.107.115.65 port 38792 ssh2
Sep 30 18:08:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:41.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:41 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c0038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:42 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd500002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:42 compute-1 sshd-session[265498]: Received disconnect from 194.107.115.65 port 38792:11: Bye Bye [preauth]
Sep 30 18:08:42 compute-1 sshd-session[265498]: Disconnected from invalid user upload 194.107.115.65 port 38792 [preauth]
Sep 30 18:08:42 compute-1 sshd-session[265526]: Failed password for invalid user PlcmSpIp from 84.51.43.58 port 59814 ssh2
Sep 30 18:08:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:42 compute-1 sshd-session[265526]: Received disconnect from 84.51.43.58 port 59814:11: Bye Bye [preauth]
Sep 30 18:08:42 compute-1 sshd-session[265526]: Disconnected from invalid user PlcmSpIp 84.51.43.58 port 59814 [preauth]
Sep 30 18:08:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:42.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:43 compute-1 ceph-mon[75484]: pgmap v896: 353 pgs: 353 active+clean; 88 MiB data, 170 MiB used, 40 GiB / 40 GiB avail; 1.6 MiB/s rd, 2.1 MiB/s wr, 97 op/s
Sep 30 18:08:43 compute-1 sshd-session[265528]: Failed password for root from 216.10.242.161 port 50864 ssh2
Sep 30 18:08:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:43.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:43 compute-1 sshd-session[265469]: Connection closed by invalid user inspur 192.210.160.141 port 49914 [preauth]
Sep 30 18:08:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:08:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:43 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd518002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:43 compute-1 sshd-session[265528]: Received disconnect from 216.10.242.161 port 50864:11: Bye Bye [preauth]
Sep 30 18:08:43 compute-1 sshd-session[265528]: Disconnected from authenticating user root 216.10.242.161 port 50864 [preauth]
Sep 30 18:08:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:44 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4f8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:44 compute-1 podman[265536]: 2025-09-30 18:08:44.563412424 +0000 UTC m=+0.098931565 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 18:08:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:44.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:45 compute-1 ceph-mon[75484]: pgmap v897: 353 pgs: 353 active+clean; 88 MiB data, 170 MiB used, 40 GiB / 40 GiB avail; 2.3 MiB/s rd, 16 KiB/s wr, 88 op/s
Sep 30 18:08:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:45.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:45 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c0038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:46 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd500002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:46.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:47 compute-1 ceph-mon[75484]: pgmap v898: 353 pgs: 353 active+clean; 88 MiB data, 170 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 16 KiB/s wr, 85 op/s
Sep 30 18:08:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:47.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:47 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd518002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:48 compute-1 sudo[265559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:08:48 compute-1 sudo[265559]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:08:48 compute-1 sudo[265559]: pam_unix(sudo:session): session closed for user root
Sep 30 18:08:48 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Sep 30 18:08:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:48 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4f8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:08:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:48.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:49 compute-1 ceph-mon[75484]: pgmap v899: 353 pgs: 353 active+clean; 88 MiB data, 170 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:08:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:49.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:49 compute-1 openstack_network_exporter[251957]: ERROR   18:08:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:08:49 compute-1 openstack_network_exporter[251957]: ERROR   18:08:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:08:49 compute-1 openstack_network_exporter[251957]: ERROR   18:08:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:08:49 compute-1 openstack_network_exporter[251957]: ERROR   18:08:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:08:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:08:49 compute-1 openstack_network_exporter[251957]: ERROR   18:08:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:08:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:08:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:49 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c0038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:50 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd500003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:50.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:51 compute-1 ceph-mon[75484]: pgmap v900: 353 pgs: 353 active+clean; 109 MiB data, 173 MiB used, 40 GiB / 40 GiB avail; 2.0 MiB/s rd, 2.0 MiB/s wr, 104 op/s
Sep 30 18:08:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:51.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:51 compute-1 podman[265589]: 2025-09-30 18:08:51.547250779 +0000 UTC m=+0.088013350 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 18:08:51 compute-1 podman[265590]: 2025-09-30 18:08:51.565665287 +0000 UTC m=+0.094334061 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350)
Sep 30 18:08:51 compute-1 podman[265596]: 2025-09-30 18:08:51.566031247 +0000 UTC m=+0.083280443 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Sep 30 18:08:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:51 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd518002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:52 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4f8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:52.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:53 compute-1 ceph-mon[75484]: pgmap v901: 353 pgs: 353 active+clean; 109 MiB data, 173 MiB used, 40 GiB / 40 GiB avail; 750 KiB/s rd, 2.0 MiB/s wr, 50 op/s
Sep 30 18:08:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:08:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:53.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:08:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:53 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c0038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:54 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd500003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:08:54.345 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:08:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:08:54.346 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:08:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:08:54.347 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:08:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:54.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:55 compute-1 ceph-mon[75484]: pgmap v902: 353 pgs: 353 active+clean; 121 MiB data, 230 MiB used, 40 GiB / 40 GiB avail; 955 KiB/s rd, 2.1 MiB/s wr, 83 op/s
Sep 30 18:08:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:55.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:55 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd524002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:56 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4f8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:08:56.718 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:08:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:08:56.720 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:08:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:08:56.721 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:08:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:56.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:57 compute-1 ceph-mon[75484]: pgmap v903: 353 pgs: 353 active+clean; 121 MiB data, 230 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Sep 30 18:08:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:57.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:57 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c0038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/530785160' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:08:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/530785160' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:08:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:58 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd500003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:08:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:08:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:08:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:08:58.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:08:59 compute-1 ceph-mon[75484]: pgmap v904: 353 pgs: 353 active+clean; 121 MiB data, 230 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Sep 30 18:08:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:08:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:08:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:08:59.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:08:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:08:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:08:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:08:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:08:59 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd524002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:00 compute-1 sshd-session[265661]: Invalid user oracle from 80.94.95.115 port 23290
Sep 30 18:09:00 compute-1 sshd-session[265661]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:09:00 compute-1 sshd-session[265661]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.115
Sep 30 18:09:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:00 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4f8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:00.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:01 compute-1 ceph-mon[75484]: pgmap v905: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 345 KiB/s rd, 2.1 MiB/s wr, 92 op/s
Sep 30 18:09:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2321016534' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:09:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:01.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:01 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c0038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:01 compute-1 sshd-session[265661]: Failed password for invalid user oracle from 80.94.95.115 port 23290 ssh2
Sep 30 18:09:01 compute-1 sshd-session[265639]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:09:01 compute-1 sshd-session[265639]: banner exchange: Connection from 113.249.93.94 port 53304: Connection timed out
Sep 30 18:09:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3078215495' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:09:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:02 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd500003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:02.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:03.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:03 compute-1 ceph-mon[75484]: pgmap v906: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 224 KiB/s rd, 146 KiB/s wr, 61 op/s
Sep 30 18:09:03 compute-1 sshd-session[265661]: Connection closed by invalid user oracle 80.94.95.115 port 23290 [preauth]
Sep 30 18:09:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:09:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:03 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd524002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:04 compute-1 ceph-mon[75484]: pgmap v907: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 224 KiB/s rd, 146 KiB/s wr, 61 op/s
Sep 30 18:09:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:04 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4f8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:04 compute-1 unix_chkpwd[265671]: password check failed for user (root)
Sep 30 18:09:04 compute-1 sshd-session[265666]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:09:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:09:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:04.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:09:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:05.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:05 compute-1 podman[249638]: time="2025-09-30T18:09:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:09:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:09:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:09:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:09:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8789 "" "Go-http-client/1.1"
Sep 30 18:09:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:05 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c0045e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:06 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd500003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:06.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:07 compute-1 ceph-mon[75484]: pgmap v908: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 13 KiB/s wr, 28 op/s
Sep 30 18:09:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:07.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:07 compute-1 sshd-session[265666]: Failed password for root from 192.210.160.141 port 35698 ssh2
Sep 30 18:09:07 compute-1 sshd-session[265674]: Invalid user infra from 107.172.146.104 port 57782
Sep 30 18:09:07 compute-1 sshd-session[265674]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:09:07 compute-1 sshd-session[265674]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 18:09:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:07 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4f8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:07 compute-1 sshd-session[265666]: Connection closed by authenticating user root 192.210.160.141 port 35698 [preauth]
Sep 30 18:09:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:09:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:08 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd524002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:08 compute-1 sudo[265677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:09:08 compute-1 sudo[265677]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:09:08 compute-1 sudo[265677]: pam_unix(sudo:session): session closed for user root
Sep 30 18:09:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:09:08 compute-1 podman[265701]: 2025-09-30 18:09:08.579414271 +0000 UTC m=+0.152461682 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 18:09:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:08 compute-1 podman[265728]: 2025-09-30 18:09:08.684575744 +0000 UTC m=+0.074730581 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:09:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:08.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:09 compute-1 ceph-mon[75484]: pgmap v909: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 13 KiB/s wr, 28 op/s
Sep 30 18:09:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:09.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:09 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c0045e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:09 compute-1 sshd-session[265674]: Failed password for invalid user infra from 107.172.146.104 port 57782 ssh2
Sep 30 18:09:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:10 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd500003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:10 compute-1 sshd-session[265674]: Received disconnect from 107.172.146.104 port 57782:11: Bye Bye [preauth]
Sep 30 18:09:10 compute-1 sshd-session[265674]: Disconnected from invalid user infra 107.172.146.104 port 57782 [preauth]
Sep 30 18:09:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:10.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:11 compute-1 ceph-mon[75484]: pgmap v910: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 13 KiB/s wr, 28 op/s
Sep 30 18:09:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:11.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:11 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4f8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:12 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd5240091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:12.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:13 compute-1 ceph-mon[75484]: pgmap v911: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:09:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:13.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:09:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:13 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd5240091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:14 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd500003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:14.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:15 compute-1 ceph-mon[75484]: pgmap v912: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:09:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:15.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:15 compute-1 podman[265759]: 2025-09-30 18:09:15.520221744 +0000 UTC m=+0.072801509 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 18:09:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[265053]: 30/09/2025 18:09:15 : epoch 68dc1c88 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd51c0045e0 fd 38 proxy ignored for local
Sep 30 18:09:15 compute-1 kernel: ganesha.nfsd[265652]: segfault at 50 ip 00007fd5d039a32e sp 00007fd5917f9210 error 4 in libntirpc.so.5.8[7fd5d037f000+2c000] likely on CPU 4 (core 0, socket 4)
Sep 30 18:09:15 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 18:09:15 compute-1 systemd[1]: Started Process Core Dump (PID 265780/UID 0).
Sep 30 18:09:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:16.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:17 compute-1 ceph-mon[75484]: pgmap v913: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:09:17 compute-1 systemd-coredump[265781]: Process 265057 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 57:
                                                    #0  0x00007fd5d039a32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Sep 30 18:09:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:17.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:17 compute-1 systemd[1]: systemd-coredump@10-265780-0.service: Deactivated successfully.
Sep 30 18:09:17 compute-1 systemd[1]: systemd-coredump@10-265780-0.service: Consumed 1.431s CPU time.
Sep 30 18:09:17 compute-1 podman[265788]: 2025-09-30 18:09:17.315710431 +0000 UTC m=+0.048306757 container died 2463e5ad98f8c5c67f06d23c6be85cc70376bc6d66b7e5541f99c15c9d29b10c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Sep 30 18:09:17 compute-1 systemd[1]: var-lib-containers-storage-overlay-4f99c85b9498bd0ec52775d42ec52261284e6e56d3291be8e81bba9b520f0ba4-merged.mount: Deactivated successfully.
Sep 30 18:09:17 compute-1 podman[265788]: 2025-09-30 18:09:17.362183927 +0000 UTC m=+0.094780223 container remove 2463e5ad98f8c5c67f06d23c6be85cc70376bc6d66b7e5541f99c15c9d29b10c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True)
Sep 30 18:09:17 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 18:09:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:17 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 18:09:17 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 2.116s CPU time.
Sep 30 18:09:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:09:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:18.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:19 compute-1 ceph-mon[75484]: pgmap v914: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:09:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:19.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:19 compute-1 openstack_network_exporter[251957]: ERROR   18:09:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:09:19 compute-1 openstack_network_exporter[251957]: ERROR   18:09:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:09:19 compute-1 openstack_network_exporter[251957]: ERROR   18:09:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:09:19 compute-1 openstack_network_exporter[251957]: ERROR   18:09:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:09:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:09:19 compute-1 openstack_network_exporter[251957]: ERROR   18:09:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:09:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:09:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:20 compute-1 nova_compute[238822]: 2025-09-30 18:09:20.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:09:20 compute-1 nova_compute[238822]: 2025-09-30 18:09:20.570 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:09:20 compute-1 nova_compute[238822]: 2025-09-30 18:09:20.570 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:09:20 compute-1 nova_compute[238822]: 2025-09-30 18:09:20.571 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:09:20 compute-1 nova_compute[238822]: 2025-09-30 18:09:20.571 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:09:20 compute-1 nova_compute[238822]: 2025-09-30 18:09:20.571 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:09:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:20.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:21 compute-1 ceph-mon[75484]: pgmap v915: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:09:21 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2128623266' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:09:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:21.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:21 compute-1 sshd-session[265836]: Invalid user testadmin from 14.225.167.110 port 53138
Sep 30 18:09:21 compute-1 sshd-session[265836]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:09:21 compute-1 sshd-session[265836]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:09:21 compute-1 sudo[265839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:09:21 compute-1 sudo[265839]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:09:21 compute-1 sudo[265839]: pam_unix(sudo:session): session closed for user root
Sep 30 18:09:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:21 compute-1 sudo[265864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:09:21 compute-1 sudo[265864]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:09:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/180921 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 18:09:21 compute-1 podman[265889]: 2025-09-30 18:09:21.725585573 +0000 UTC m=+0.099618784 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Sep 30 18:09:21 compute-1 podman[265890]: 2025-09-30 18:09:21.736235701 +0000 UTC m=+0.102681097 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Sep 30 18:09:21 compute-1 podman[265888]: 2025-09-30 18:09:21.746307314 +0000 UTC m=+0.118374702 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid)
Sep 30 18:09:22 compute-1 sudo[265864]: pam_unix(sudo:session): session closed for user root
Sep 30 18:09:22 compute-1 sudo[265979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:09:22 compute-1 sudo[265979]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:09:22 compute-1 sudo[265979]: pam_unix(sudo:session): session closed for user root
Sep 30 18:09:22 compute-1 sudo[266004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b -- inventory --format=json-pretty --filter-for-batch
Sep 30 18:09:22 compute-1 sudo[266004]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:09:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:22.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:22 compute-1 podman[266071]: 2025-09-30 18:09:22.987666252 +0000 UTC m=+0.054411202 container create 125bb81b4b65297fa9d7c857bb21303cddf62ed0c39ed59bdff0a01a795ecc2f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_clarke, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Sep 30 18:09:23 compute-1 systemd[1]: Started libpod-conmon-125bb81b4b65297fa9d7c857bb21303cddf62ed0c39ed59bdff0a01a795ecc2f.scope.
Sep 30 18:09:23 compute-1 podman[266071]: 2025-09-30 18:09:22.962887002 +0000 UTC m=+0.029632052 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 18:09:23 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:09:23 compute-1 podman[266071]: 2025-09-30 18:09:23.097356027 +0000 UTC m=+0.164101007 container init 125bb81b4b65297fa9d7c857bb21303cddf62ed0c39ed59bdff0a01a795ecc2f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_clarke, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 18:09:23 compute-1 podman[266071]: 2025-09-30 18:09:23.108788626 +0000 UTC m=+0.175533566 container start 125bb81b4b65297fa9d7c857bb21303cddf62ed0c39ed59bdff0a01a795ecc2f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_clarke, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Sep 30 18:09:23 compute-1 podman[266071]: 2025-09-30 18:09:23.11188042 +0000 UTC m=+0.178625370 container attach 125bb81b4b65297fa9d7c857bb21303cddf62ed0c39ed59bdff0a01a795ecc2f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_clarke, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 18:09:23 compute-1 heuristic_clarke[266087]: 167 167
Sep 30 18:09:23 compute-1 systemd[1]: libpod-125bb81b4b65297fa9d7c857bb21303cddf62ed0c39ed59bdff0a01a795ecc2f.scope: Deactivated successfully.
Sep 30 18:09:23 compute-1 podman[266071]: 2025-09-30 18:09:23.119964458 +0000 UTC m=+0.186709408 container died 125bb81b4b65297fa9d7c857bb21303cddf62ed0c39ed59bdff0a01a795ecc2f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_clarke, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True)
Sep 30 18:09:23 compute-1 systemd[1]: var-lib-containers-storage-overlay-2994f284b81788e787e4424f064536a6433310ae84b42feac48881b2b09fa28c-merged.mount: Deactivated successfully.
Sep 30 18:09:23 compute-1 ceph-mon[75484]: pgmap v916: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:09:23 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3742597895' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:09:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:09:23 compute-1 podman[266071]: 2025-09-30 18:09:23.172153929 +0000 UTC m=+0.238898889 container remove 125bb81b4b65297fa9d7c857bb21303cddf62ed0c39ed59bdff0a01a795ecc2f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_clarke, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 18:09:23 compute-1 systemd[1]: libpod-conmon-125bb81b4b65297fa9d7c857bb21303cddf62ed0c39ed59bdff0a01a795ecc2f.scope: Deactivated successfully.
Sep 30 18:09:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:23.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:09:23.297 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:39:ed 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b8f21c3-21c3-482f-88c7-197b5bceb2ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5947b7c96cd42be8502dbab4c825083', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3fffd780-66a8-4f09-9e3d-aefd98ad1eb6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=12cfcc60-6c05-4cc2-8665-8a4d689e5c1a) old=Port_Binding(mac=['fa:16:3e:77:39:ed'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b8f21c3-21c3-482f-88c7-197b5bceb2ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5947b7c96cd42be8502dbab4c825083', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:09:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:09:23.299 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 12cfcc60-6c05-4cc2-8665-8a4d689e5c1a in datapath 4b8f21c3-21c3-482f-88c7-197b5bceb2ea updated
Sep 30 18:09:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:09:23.302 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b8f21c3-21c3-482f-88c7-197b5bceb2ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:09:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:09:23.311 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc9f6a7-2467-4ac2-910b-76df5e81283d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:09:23 compute-1 podman[266112]: 2025-09-30 18:09:23.435435476 +0000 UTC m=+0.073136228 container create c190ff3055c4cd9bd66bdbc0d596392a75aa552661b60817abdc7846ce21efff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dazzling_pare, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 18:09:23 compute-1 podman[266112]: 2025-09-30 18:09:23.398894268 +0000 UTC m=+0.036595090 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 18:09:23 compute-1 systemd[1]: Started libpod-conmon-c190ff3055c4cd9bd66bdbc0d596392a75aa552661b60817abdc7846ce21efff.scope.
Sep 30 18:09:23 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:09:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38e84f89b35241442a882bc433966d1ec3c33e7ceb6444a4e677e092527f33a3/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Sep 30 18:09:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38e84f89b35241442a882bc433966d1ec3c33e7ceb6444a4e677e092527f33a3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 18:09:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38e84f89b35241442a882bc433966d1ec3c33e7ceb6444a4e677e092527f33a3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 18:09:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38e84f89b35241442a882bc433966d1ec3c33e7ceb6444a4e677e092527f33a3/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Sep 30 18:09:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:09:23 compute-1 podman[266112]: 2025-09-30 18:09:23.552159182 +0000 UTC m=+0.189859934 container init c190ff3055c4cd9bd66bdbc0d596392a75aa552661b60817abdc7846ce21efff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dazzling_pare, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Sep 30 18:09:23 compute-1 podman[266112]: 2025-09-30 18:09:23.566645783 +0000 UTC m=+0.204346495 container start c190ff3055c4cd9bd66bdbc0d596392a75aa552661b60817abdc7846ce21efff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dazzling_pare, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 18:09:23 compute-1 podman[266112]: 2025-09-30 18:09:23.570712573 +0000 UTC m=+0.208413305 container attach c190ff3055c4cd9bd66bdbc0d596392a75aa552661b60817abdc7846ce21efff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dazzling_pare, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Sep 30 18:09:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:24 compute-1 sshd-session[265836]: Failed password for invalid user testadmin from 14.225.167.110 port 53138 ssh2
Sep 30 18:09:24 compute-1 dazzling_pare[266128]: [
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:     {
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:         "available": false,
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:         "being_replaced": false,
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:         "ceph_device_lvm": false,
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:         "device_id": "QEMU_DVD-ROM_QM00001",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:         "lsm_data": {},
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:         "lvs": [],
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:         "path": "/dev/sr0",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:         "rejected_reasons": [
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "Has a FileSystem",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "Insufficient space (<5GB)"
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:         ],
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:         "sys_api": {
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "actuators": null,
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "device_nodes": [
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:                 "sr0"
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             ],
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "devname": "sr0",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "human_readable_size": "482.00 KB",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "id_bus": "ata",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "model": "QEMU DVD-ROM",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "nr_requests": "2",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "parent": "/dev/sr0",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "partitions": {},
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "path": "/dev/sr0",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "removable": "1",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "rev": "2.5+",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "ro": "0",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "rotational": "0",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "sas_address": "",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "sas_device_handle": "",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "scheduler_mode": "mq-deadline",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "sectors": 0,
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "sectorsize": "2048",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "size": 493568.0,
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "support_discard": "2048",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "type": "disk",
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:             "vendor": "QEMU"
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:         }
Sep 30 18:09:24 compute-1 dazzling_pare[266128]:     }
Sep 30 18:09:24 compute-1 dazzling_pare[266128]: ]
Sep 30 18:09:24 compute-1 systemd[1]: libpod-c190ff3055c4cd9bd66bdbc0d596392a75aa552661b60817abdc7846ce21efff.scope: Deactivated successfully.
Sep 30 18:09:24 compute-1 podman[266112]: 2025-09-30 18:09:24.543100819 +0000 UTC m=+1.180801571 container died c190ff3055c4cd9bd66bdbc0d596392a75aa552661b60817abdc7846ce21efff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dazzling_pare, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default)
Sep 30 18:09:24 compute-1 nova_compute[238822]: 2025-09-30 18:09:24.563 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:09:24 compute-1 nova_compute[238822]: 2025-09-30 18:09:24.564 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:09:24 compute-1 nova_compute[238822]: 2025-09-30 18:09:24.564 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:09:24 compute-1 systemd[1]: var-lib-containers-storage-overlay-38e84f89b35241442a882bc433966d1ec3c33e7ceb6444a4e677e092527f33a3-merged.mount: Deactivated successfully.
Sep 30 18:09:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:24 compute-1 podman[266112]: 2025-09-30 18:09:24.605879916 +0000 UTC m=+1.243580668 container remove c190ff3055c4cd9bd66bdbc0d596392a75aa552661b60817abdc7846ce21efff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dazzling_pare, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid)
Sep 30 18:09:24 compute-1 systemd[1]: libpod-conmon-c190ff3055c4cd9bd66bdbc0d596392a75aa552661b60817abdc7846ce21efff.scope: Deactivated successfully.
Sep 30 18:09:24 compute-1 sudo[266004]: pam_unix(sudo:session): session closed for user root
Sep 30 18:09:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:24.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:25 compute-1 nova_compute[238822]: 2025-09-30 18:09:25.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:09:25 compute-1 ceph-mon[75484]: pgmap v917: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:09:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:09:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:09:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:09:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:09:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:09:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:09:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:09:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:09:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:09:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:09:25 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:09:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:25.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:25 compute-1 nova_compute[238822]: 2025-09-30 18:09:25.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:09:25 compute-1 nova_compute[238822]: 2025-09-30 18:09:25.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:09:25 compute-1 nova_compute[238822]: 2025-09-30 18:09:25.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:09:25 compute-1 nova_compute[238822]: 2025-09-30 18:09:25.574 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:09:25 compute-1 nova_compute[238822]: 2025-09-30 18:09:25.574 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:09:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:26 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:09:26 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4275842867' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:09:26 compute-1 nova_compute[238822]: 2025-09-30 18:09:26.046 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:09:26 compute-1 sshd-session[265836]: Received disconnect from 14.225.167.110 port 53138:11: Bye Bye [preauth]
Sep 30 18:09:26 compute-1 sshd-session[265836]: Disconnected from invalid user testadmin 14.225.167.110 port 53138 [preauth]
Sep 30 18:09:26 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4275842867' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:09:26 compute-1 nova_compute[238822]: 2025-09-30 18:09:26.246 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:09:26 compute-1 nova_compute[238822]: 2025-09-30 18:09:26.247 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:09:26 compute-1 nova_compute[238822]: 2025-09-30 18:09:26.286 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:09:26 compute-1 nova_compute[238822]: 2025-09-30 18:09:26.287 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5025MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:09:26 compute-1 nova_compute[238822]: 2025-09-30 18:09:26.287 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:09:26 compute-1 nova_compute[238822]: 2025-09-30 18:09:26.288 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:09:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:26.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:27 compute-1 ceph-mon[75484]: pgmap v918: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:09:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:27.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:27 compute-1 nova_compute[238822]: 2025-09-30 18:09:27.334 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:09:27 compute-1 nova_compute[238822]: 2025-09-30 18:09:27.335 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:09:26 up  3:46,  0 user,  load average: 4.33, 1.88, 1.48\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:09:27 compute-1 nova_compute[238822]: 2025-09-30 18:09:27.350 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:09:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:27 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 11.
Sep 30 18:09:27 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 18:09:27 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 2.116s CPU time.
Sep 30 18:09:27 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 18:09:27 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:09:27 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2291823085' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:09:27 compute-1 nova_compute[238822]: 2025-09-30 18:09:27.883 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.533s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:09:27 compute-1 nova_compute[238822]: 2025-09-30 18:09:27.894 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:09:27 compute-1 podman[267448]: 2025-09-30 18:09:27.961571051 +0000 UTC m=+0.046512447 container create 80d1f8374266f2928b4c61649ec796d24345aff1c2e4693c6168e3504f66733b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Sep 30 18:09:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c1eb72020226e1400eb74a29d645c88870f63c2b1be10ba3c59ff3cbc2fb979/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 18:09:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c1eb72020226e1400eb74a29d645c88870f63c2b1be10ba3c59ff3cbc2fb979/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 18:09:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c1eb72020226e1400eb74a29d645c88870f63c2b1be10ba3c59ff3cbc2fb979/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 18:09:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c1eb72020226e1400eb74a29d645c88870f63c2b1be10ba3c59ff3cbc2fb979/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 18:09:28 compute-1 podman[267448]: 2025-09-30 18:09:27.940786209 +0000 UTC m=+0.025727625 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 18:09:28 compute-1 podman[267448]: 2025-09-30 18:09:28.041325727 +0000 UTC m=+0.126267163 container init 80d1f8374266f2928b4c61649ec796d24345aff1c2e4693c6168e3504f66733b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Sep 30 18:09:28 compute-1 podman[267448]: 2025-09-30 18:09:28.054799081 +0000 UTC m=+0.139740517 container start 80d1f8374266f2928b4c61649ec796d24345aff1c2e4693c6168e3504f66733b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Sep 30 18:09:28 compute-1 bash[267448]: 80d1f8374266f2928b4c61649ec796d24345aff1c2e4693c6168e3504f66733b
Sep 30 18:09:28 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 18:09:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 18:09:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 18:09:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 18:09:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 18:09:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 18:09:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 18:09:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 18:09:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 18:09:28 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2291823085' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:09:28 compute-1 ceph-mon[75484]: pgmap v919: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:09:28 compute-1 nova_compute[238822]: 2025-09-30 18:09:28.406 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:09:28 compute-1 sudo[267508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:09:28 compute-1 sudo[267508]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:09:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:09:28 compute-1 sudo[267508]: pam_unix(sudo:session): session closed for user root
Sep 30 18:09:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:28 compute-1 nova_compute[238822]: 2025-09-30 18:09:28.920 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:09:28 compute-1 nova_compute[238822]: 2025-09-30 18:09:28.920 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.632s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:09:28 compute-1 nova_compute[238822]: 2025-09-30 18:09:28.920 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:09:28 compute-1 nova_compute[238822]: 2025-09-30 18:09:28.920 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 18:09:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:28.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:29.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:29 compute-1 nova_compute[238822]: 2025-09-30 18:09:29.427 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 18:09:29 compute-1 nova_compute[238822]: 2025-09-30 18:09:29.427 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:09:29 compute-1 nova_compute[238822]: 2025-09-30 18:09:29.427 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 18:09:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:29 compute-1 unix_chkpwd[267534]: password check failed for user (root)
Sep 30 18:09:29 compute-1 sshd-session[267379]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:09:29 compute-1 sudo[267535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:09:29 compute-1 sudo[267535]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:09:29 compute-1 sudo[267535]: pam_unix(sudo:session): session closed for user root
Sep 30 18:09:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:09:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:09:30 compute-1 ceph-mon[75484]: pgmap v920: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:09:30 compute-1 nova_compute[238822]: 2025-09-30 18:09:30.935 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:09:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:30.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:31.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:31 compute-1 sshd[170789]: Timeout before authentication for connection from 14.103.129.43 to 38.102.83.102, pid = 264650
Sep 30 18:09:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:31 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:09:31.620 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:a4:93 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c06e9657-589e-4dca-93a3-44b9a4da38f4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c06e9657-589e-4dca-93a3-44b9a4da38f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0783e60216244dbda21696efa03e2275', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fb4927a-fd33-46be-8d3d-6e2898831ff5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=552f1fba-62a5-48e5-9d1c-51c58cbe4d6e) old=Port_Binding(mac=['fa:16:3e:a1:a4:93'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c06e9657-589e-4dca-93a3-44b9a4da38f4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c06e9657-589e-4dca-93a3-44b9a4da38f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0783e60216244dbda21696efa03e2275', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:09:31 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:09:31.621 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 552f1fba-62a5-48e5-9d1c-51c58cbe4d6e in datapath c06e9657-589e-4dca-93a3-44b9a4da38f4 updated
Sep 30 18:09:31 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:09:31.622 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c06e9657-589e-4dca-93a3-44b9a4da38f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:09:31 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:09:31.624 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e0157142-b027-4d36-8764-0be8f5c46804]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:09:31 compute-1 sshd-session[267379]: Failed password for root from 192.210.160.141 port 57530 ssh2
Sep 30 18:09:32 compute-1 nova_compute[238822]: 2025-09-30 18:09:32.343 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:09:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:32 compute-1 sshd-session[267379]: Connection closed by authenticating user root 192.210.160.141 port 57530 [preauth]
Sep 30 18:09:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:32.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:33 compute-1 ceph-mon[75484]: pgmap v921: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:09:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:33.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:09:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:34 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 18:09:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:34 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 18:09:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:34.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:35 compute-1 ceph-mon[75484]: pgmap v922: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:09:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:35.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:35 compute-1 podman[249638]: time="2025-09-30T18:09:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:09:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:09:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38442 "" "Go-http-client/1.1"
Sep 30 18:09:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:09:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8791 "" "Go-http-client/1.1"
Sep 30 18:09:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:36.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:37 compute-1 ceph-mon[75484]: pgmap v923: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:09:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3888146747' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:09:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3888146747' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:09:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:37.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:09:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:09:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:39.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:39 compute-1 ceph-mon[75484]: pgmap v924: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:09:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:39.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:39 compute-1 podman[267571]: 2025-09-30 18:09:39.52840965 +0000 UTC m=+0.061287538 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:09:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:39 compute-1 podman[267570]: 2025-09-30 18:09:39.62420059 +0000 UTC m=+0.157247902 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:41.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:41 compute-1 ceph-mon[75484]: pgmap v925: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 1.6 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 18:09:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:41.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:41 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:42 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:43.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:43 compute-1 ceph-mon[75484]: pgmap v926: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 1 op/s
Sep 30 18:09:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:43.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:09:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/180943 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 18:09:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:43 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:44 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:45.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:45 compute-1 ceph-mon[75484]: pgmap v927: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:09:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:45.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:45 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:46 compute-1 unix_chkpwd[267645]: password check failed for user (root)
Sep 30 18:09:46 compute-1 sshd-session[267642]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:09:46 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1512167147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:09:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:46 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:46 compute-1 podman[267646]: 2025-09-30 18:09:46.555436924 +0000 UTC m=+0.092213543 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:09:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:47.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:47 compute-1 ceph-mon[75484]: pgmap v928: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:09:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:47.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:47 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:47 compute-1 sshd-session[267666]: Invalid user grid from 216.10.242.161 port 34440
Sep 30 18:09:48 compute-1 sshd-session[267666]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:09:48 compute-1 sshd-session[267666]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:09:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:48 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:48 compute-1 sshd-session[267642]: Failed password for root from 175.126.165.170 port 34652 ssh2
Sep 30 18:09:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:09:48 compute-1 sudo[267669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:09:48 compute-1 sudo[267669]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:09:48 compute-1 sudo[267669]: pam_unix(sudo:session): session closed for user root
Sep 30 18:09:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:49.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:49 compute-1 sshd-session[267642]: Received disconnect from 175.126.165.170 port 34652:11: Bye Bye [preauth]
Sep 30 18:09:49 compute-1 sshd-session[267642]: Disconnected from authenticating user root 175.126.165.170 port 34652 [preauth]
Sep 30 18:09:49 compute-1 ceph-mon[75484]: pgmap v929: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:09:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:49.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:49 compute-1 openstack_network_exporter[251957]: ERROR   18:09:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:09:49 compute-1 openstack_network_exporter[251957]: ERROR   18:09:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:09:49 compute-1 openstack_network_exporter[251957]: ERROR   18:09:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:09:49 compute-1 openstack_network_exporter[251957]: ERROR   18:09:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:09:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:09:49 compute-1 openstack_network_exporter[251957]: ERROR   18:09:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:09:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:09:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:49 compute-1 sshd-session[267666]: Failed password for invalid user grid from 216.10.242.161 port 34440 ssh2
Sep 30 18:09:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:49 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:50 compute-1 ceph-mon[75484]: pgmap v930: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:09:50 compute-1 sshd-session[267666]: Received disconnect from 216.10.242.161 port 34440:11: Bye Bye [preauth]
Sep 30 18:09:50 compute-1 sshd-session[267666]: Disconnected from invalid user grid 216.10.242.161 port 34440 [preauth]
Sep 30 18:09:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:50 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:51.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:51.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:51 compute-1 sshd-session[267696]: Invalid user gpadmin from 194.107.115.65 port 63272
Sep 30 18:09:51 compute-1 sshd-session[267696]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:09:51 compute-1 sshd-session[267696]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 18:09:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:51 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:52 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:52 compute-1 podman[267701]: 2025-09-30 18:09:52.548707752 +0000 UTC m=+0.085404930 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid)
Sep 30 18:09:52 compute-1 podman[267702]: 2025-09-30 18:09:52.56009318 +0000 UTC m=+0.095229445 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Sep 30 18:09:52 compute-1 podman[267703]: 2025-09-30 18:09:52.571337084 +0000 UTC m=+0.092732648 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=multipathd)
Sep 30 18:09:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:53.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:53 compute-1 ceph-mon[75484]: pgmap v931: 353 pgs: 353 active+clean; 41 MiB data, 184 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:09:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:09:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:53.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:09:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:53 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:09:54.348 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:09:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:09:54.348 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:09:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:09:54.348 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:09:54 compute-1 sshd-session[267696]: Failed password for invalid user gpadmin from 194.107.115.65 port 63272 ssh2
Sep 30 18:09:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:54 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:54 compute-1 unix_chkpwd[267764]: password check failed for user (root)
Sep 30 18:09:54 compute-1 sshd-session[267699]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:09:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:55.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:55 compute-1 ceph-mon[75484]: pgmap v932: 353 pgs: 353 active+clean; 88 MiB data, 205 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:09:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:55.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:55 compute-1 sshd-session[267696]: Received disconnect from 194.107.115.65 port 63272:11: Bye Bye [preauth]
Sep 30 18:09:55 compute-1 sshd-session[267696]: Disconnected from invalid user gpadmin 194.107.115.65 port 63272 [preauth]
Sep 30 18:09:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:55 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:56 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4272442282' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.084741) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255796084795, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2361, "num_deletes": 251, "total_data_size": 6071880, "memory_usage": 6150928, "flush_reason": "Manual Compaction"}
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255796120155, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 3904627, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26085, "largest_seqno": 28441, "table_properties": {"data_size": 3895243, "index_size": 5878, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19661, "raw_average_key_size": 20, "raw_value_size": 3876218, "raw_average_value_size": 4000, "num_data_blocks": 261, "num_entries": 969, "num_filter_entries": 969, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759255588, "oldest_key_time": 1759255588, "file_creation_time": 1759255796, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 35488 microseconds, and 17233 cpu microseconds.
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.120228) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 3904627 bytes OK
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.120259) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.122162) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.122184) EVENT_LOG_v1 {"time_micros": 1759255796122176, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.122211) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6061414, prev total WAL file size 6061414, number of live WAL files 2.
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.124457) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3813KB)], [51(10MB)]
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255796124539, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 15136036, "oldest_snapshot_seqno": -1}
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5591 keys, 13047601 bytes, temperature: kUnknown
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255796213746, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 13047601, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13009194, "index_size": 23321, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14021, "raw_key_size": 141569, "raw_average_key_size": 25, "raw_value_size": 12906865, "raw_average_value_size": 2308, "num_data_blocks": 961, "num_entries": 5591, "num_filter_entries": 5591, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759255796, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.214422) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 13047601 bytes
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.216347) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 168.9 rd, 145.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 10.7 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(7.2) write-amplify(3.3) OK, records in: 6113, records dropped: 522 output_compression: NoCompression
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.216385) EVENT_LOG_v1 {"time_micros": 1759255796216367, "job": 30, "event": "compaction_finished", "compaction_time_micros": 89594, "compaction_time_cpu_micros": 46808, "output_level": 6, "num_output_files": 1, "total_output_size": 13047601, "num_input_records": 6113, "num_output_records": 5591, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255796219027, "job": 30, "event": "table_file_deletion", "file_number": 53}
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255796224407, "job": 30, "event": "table_file_deletion", "file_number": 51}
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.124339) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.224762) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.224769) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.224771) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:09:56 compute-1 sshd-session[267699]: Failed password for root from 192.210.160.141 port 54626 ssh2
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.224772) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:09:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:09:56.224774) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:09:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:56 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:57.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:57 compute-1 ceph-mon[75484]: pgmap v933: 353 pgs: 353 active+clean; 88 MiB data, 205 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:09:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3049268746' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:09:57 compute-1 sshd-session[267767]: Invalid user fermin from 84.51.43.58 port 51217
Sep 30 18:09:57 compute-1 sshd-session[267767]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:09:57 compute-1 sshd-session[267767]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:09:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:09:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:57.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:09:57 compute-1 sshd-session[267699]: Connection closed by authenticating user root 192.210.160.141 port 54626 [preauth]
Sep 30 18:09:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:57 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2543639361' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:09:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2543639361' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:09:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:58 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:09:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:09:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:09:59.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:59 compute-1 ceph-mon[75484]: pgmap v934: 353 pgs: 353 active+clean; 88 MiB data, 205 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:09:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:09:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:09:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:09:59.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:09:59 compute-1 sshd-session[267767]: Failed password for invalid user fermin from 84.51.43.58 port 51217 ssh2
Sep 30 18:09:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:09:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:09:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:09:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:09:59 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:00 compute-1 ceph-mon[75484]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Sep 30 18:10:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:00 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:00 compute-1 sshd-session[267767]: Received disconnect from 84.51.43.58 port 51217:11: Bye Bye [preauth]
Sep 30 18:10:00 compute-1 sshd-session[267767]: Disconnected from invalid user fermin 84.51.43.58 port 51217 [preauth]
Sep 30 18:10:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:01.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:01 compute-1 ceph-mon[75484]: pgmap v935: 353 pgs: 353 active+clean; 88 MiB data, 205 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:10:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:01.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:01 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:02 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:10:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:03.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:10:03 compute-1 ceph-mon[75484]: pgmap v936: 353 pgs: 353 active+clean; 88 MiB data, 205 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:10:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:03.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:10:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:03 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:04 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:05.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:05 compute-1 ceph-mon[75484]: pgmap v937: 353 pgs: 353 active+clean; 88 MiB data, 205 MiB used, 40 GiB / 40 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 97 op/s
Sep 30 18:10:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:05.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:05 compute-1 podman[249638]: time="2025-09-30T18:10:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:10:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:10:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38442 "" "Go-http-client/1.1"
Sep 30 18:10:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:10:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8790 "" "Go-http-client/1.1"
Sep 30 18:10:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:05 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:06 compute-1 ceph-mon[75484]: pgmap v938: 353 pgs: 353 active+clean; 88 MiB data, 205 MiB used, 40 GiB / 40 GiB avail; 1.8 MiB/s rd, 13 KiB/s wr, 70 op/s
Sep 30 18:10:06 compute-1 sshd-session[267780]: Invalid user superadmin from 107.172.146.104 port 51338
Sep 30 18:10:06 compute-1 sshd-session[267780]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:10:06 compute-1 sshd-session[267780]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 18:10:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:06 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:07.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:07.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:10:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:07 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:08 compute-1 sshd-session[267780]: Failed password for invalid user superadmin from 107.172.146.104 port 51338 ssh2
Sep 30 18:10:08 compute-1 nova_compute[238822]: 2025-09-30 18:10:08.166 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Acquiring lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:08 compute-1 nova_compute[238822]: 2025-09-30 18:10:08.167 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:08 compute-1 sshd-session[267780]: Received disconnect from 107.172.146.104 port 51338:11: Bye Bye [preauth]
Sep 30 18:10:08 compute-1 sshd-session[267780]: Disconnected from invalid user superadmin 107.172.146.104 port 51338 [preauth]
Sep 30 18:10:08 compute-1 sshd-session[267770]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:10:08 compute-1 sshd-session[267770]: banner exchange: Connection from 113.249.93.94 port 3258: Connection timed out
Sep 30 18:10:08 compute-1 ceph-mon[75484]: pgmap v939: 353 pgs: 353 active+clean; 88 MiB data, 205 MiB used, 40 GiB / 40 GiB avail; 1.8 MiB/s rd, 13 KiB/s wr, 70 op/s
Sep 30 18:10:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:08 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:10:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:08 compute-1 nova_compute[238822]: 2025-09-30 18:10:08.673 2 DEBUG nova.compute.manager [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:10:08 compute-1 sudo[267784]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:10:08 compute-1 sudo[267784]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:10:08 compute-1 sudo[267784]: pam_unix(sudo:session): session closed for user root
Sep 30 18:10:08 compute-1 sshd[170789]: Timeout before authentication for connection from 14.103.129.43 to 38.102.83.102, pid = 264998
Sep 30 18:10:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:09.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:09 compute-1 nova_compute[238822]: 2025-09-30 18:10:09.295 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:09 compute-1 nova_compute[238822]: 2025-09-30 18:10:09.296 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:09 compute-1 nova_compute[238822]: 2025-09-30 18:10:09.306 2 DEBUG nova.virt.hardware [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:10:09 compute-1 nova_compute[238822]: 2025-09-30 18:10:09.306 2 INFO nova.compute.claims [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:10:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:09.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:09 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:10 compute-1 nova_compute[238822]: 2025-09-30 18:10:10.365 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:10:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:10 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:10 compute-1 podman[267813]: 2025-09-30 18:10:10.54373106 +0000 UTC m=+0.076733555 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:10:10 compute-1 podman[267812]: 2025-09-30 18:10:10.580300829 +0000 UTC m=+0.123418148 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 18:10:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:10:10 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2824111221' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:10:10 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Sep 30 18:10:10 compute-1 nova_compute[238822]: 2025-09-30 18:10:10.849 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:10:10 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Sep 30 18:10:10 compute-1 nova_compute[238822]: 2025-09-30 18:10:10.859 2 DEBUG nova.compute.provider_tree [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:10:10 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Sep 30 18:10:10 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Sep 30 18:10:10 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Sep 30 18:10:10 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Sep 30 18:10:10 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Sep 30 18:10:10 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Sep 30 18:10:11 compute-1 ceph-mon[75484]: pgmap v940: 353 pgs: 353 active+clean; 88 MiB data, 205 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:10:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2824111221' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:10:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:11.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:11.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:11 compute-1 nova_compute[238822]: 2025-09-30 18:10:11.369 2 DEBUG nova.scheduler.client.report [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:10:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:11 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:11 compute-1 nova_compute[238822]: 2025-09-30 18:10:11.887 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.591s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:11 compute-1 nova_compute[238822]: 2025-09-30 18:10:11.888 2 DEBUG nova.compute.manager [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:10:12 compute-1 nova_compute[238822]: 2025-09-30 18:10:12.403 2 DEBUG nova.compute.manager [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:10:12 compute-1 nova_compute[238822]: 2025-09-30 18:10:12.404 2 DEBUG nova.network.neutron [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:10:12 compute-1 nova_compute[238822]: 2025-09-30 18:10:12.406 2 WARNING neutronclient.v2_0.client [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:10:12 compute-1 nova_compute[238822]: 2025-09-30 18:10:12.409 2 WARNING neutronclient.v2_0.client [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:10:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:12 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:12 compute-1 nova_compute[238822]: 2025-09-30 18:10:12.921 2 INFO nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:10:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:13.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:13 compute-1 ceph-mon[75484]: pgmap v941: 353 pgs: 353 active+clean; 88 MiB data, 205 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:10:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:13.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:13 compute-1 nova_compute[238822]: 2025-09-30 18:10:13.434 2 DEBUG nova.compute.manager [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:10:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:10:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:13.543 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:10:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:13.544 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:10:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:13 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:14 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:14 compute-1 nova_compute[238822]: 2025-09-30 18:10:14.463 2 DEBUG nova.compute.manager [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:10:14 compute-1 nova_compute[238822]: 2025-09-30 18:10:14.465 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:10:14 compute-1 nova_compute[238822]: 2025-09-30 18:10:14.466 2 INFO nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Creating image(s)
Sep 30 18:10:14 compute-1 nova_compute[238822]: 2025-09-30 18:10:14.499 2 DEBUG nova.storage.rbd_utils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] rbd image 89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:10:14 compute-1 nova_compute[238822]: 2025-09-30 18:10:14.530 2 DEBUG nova.storage.rbd_utils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] rbd image 89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:10:14 compute-1 nova_compute[238822]: 2025-09-30 18:10:14.562 2 DEBUG nova.storage.rbd_utils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] rbd image 89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:10:14 compute-1 nova_compute[238822]: 2025-09-30 18:10:14.566 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:14 compute-1 nova_compute[238822]: 2025-09-30 18:10:14.567 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:14 compute-1 nova_compute[238822]: 2025-09-30 18:10:14.732 2 DEBUG nova.virt.libvirt.imagebackend [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Image locations are: [{'url': 'rbd://63d32c6a-fa18-54ed-8711-9a3915cc367b/images/5b99cbca-b655-4be5-8343-cf504005c42e/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://63d32c6a-fa18-54ed-8711-9a3915cc367b/images/5b99cbca-b655-4be5-8343-cf504005c42e/snap', 'metadata': {}}] clone /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagebackend.py:1110
Sep 30 18:10:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:15.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:15 compute-1 ceph-mon[75484]: pgmap v942: 353 pgs: 353 active+clean; 109 MiB data, 221 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 2.0 MiB/s wr, 233 op/s
Sep 30 18:10:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:15.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.435 2 DEBUG oslo_utils.imageutils.format_inspector [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.441 2 DEBUG oslo_utils.imageutils.format_inspector [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.442 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.534 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457.part --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.535 2 DEBUG nova.virt.images [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] 5b99cbca-b655-4be5-8343-cf504005c42e was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.537 2 DEBUG nova.privsep.utils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.538 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457.part /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:10:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.712 2 DEBUG nova.network.neutron [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Successfully created port: 3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:10:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:15 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.786 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457.part /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457.converted" returned: 0 in 0.248s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.794 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.877 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457.converted --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.878 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.312s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.904 2 DEBUG nova.storage.rbd_utils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] rbd image 89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:10:15 compute-1 nova_compute[238822]: 2025-09-30 18:10:15.909 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:10:16 compute-1 nova_compute[238822]: 2025-09-30 18:10:16.283 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.374s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:10:16 compute-1 nova_compute[238822]: 2025-09-30 18:10:16.366 2 DEBUG nova.storage.rbd_utils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] resizing rbd image 89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:10:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:16 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:16 compute-1 nova_compute[238822]: 2025-09-30 18:10:16.493 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:10:16 compute-1 nova_compute[238822]: 2025-09-30 18:10:16.494 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Ensure instance console log exists: /var/lib/nova/instances/89ea4af3-85f8-42a9-a945-3aac0c8882e9/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:10:16 compute-1 nova_compute[238822]: 2025-09-30 18:10:16.495 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:16 compute-1 nova_compute[238822]: 2025-09-30 18:10:16.496 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:16 compute-1 nova_compute[238822]: 2025-09-30 18:10:16.497 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:17.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:17 compute-1 ceph-mon[75484]: pgmap v943: 353 pgs: 353 active+clean; 109 MiB data, 221 MiB used, 40 GiB / 40 GiB avail; 434 KiB/s rd, 2.0 MiB/s wr, 162 op/s
Sep 30 18:10:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:17.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:17 compute-1 nova_compute[238822]: 2025-09-30 18:10:17.490 2 DEBUG nova.network.neutron [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Successfully updated port: 3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:10:17 compute-1 podman[268074]: 2025-09-30 18:10:17.543014143 +0000 UTC m=+0.084158436 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 18:10:17 compute-1 nova_compute[238822]: 2025-09-30 18:10:17.576 2 DEBUG nova.compute.manager [req-abd84bf9-d5f3-475e-a768-b1939363956d req-3409a416-2ce1-40ee-8839-0206b432415a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Received event network-changed-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:10:17 compute-1 nova_compute[238822]: 2025-09-30 18:10:17.577 2 DEBUG nova.compute.manager [req-abd84bf9-d5f3-475e-a768-b1939363956d req-3409a416-2ce1-40ee-8839-0206b432415a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Refreshing instance network info cache due to event network-changed-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:10:17 compute-1 nova_compute[238822]: 2025-09-30 18:10:17.577 2 DEBUG oslo_concurrency.lockutils [req-abd84bf9-d5f3-475e-a768-b1939363956d req-3409a416-2ce1-40ee-8839-0206b432415a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-89ea4af3-85f8-42a9-a945-3aac0c8882e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:10:17 compute-1 nova_compute[238822]: 2025-09-30 18:10:17.577 2 DEBUG oslo_concurrency.lockutils [req-abd84bf9-d5f3-475e-a768-b1939363956d req-3409a416-2ce1-40ee-8839-0206b432415a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-89ea4af3-85f8-42a9-a945-3aac0c8882e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:10:17 compute-1 nova_compute[238822]: 2025-09-30 18:10:17.578 2 DEBUG nova.network.neutron [req-abd84bf9-d5f3-475e-a768-b1939363956d req-3409a416-2ce1-40ee-8839-0206b432415a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Refreshing network info cache for port 3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:10:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:17 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:18 compute-1 nova_compute[238822]: 2025-09-30 18:10:18.001 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Acquiring lock "refresh_cache-89ea4af3-85f8-42a9-a945-3aac0c8882e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:10:18 compute-1 nova_compute[238822]: 2025-09-30 18:10:18.085 2 WARNING neutronclient.v2_0.client [req-abd84bf9-d5f3-475e-a768-b1939363956d req-3409a416-2ce1-40ee-8839-0206b432415a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:10:18 compute-1 sshd-session[267976]: Invalid user wuzj from 192.210.160.141 port 46634
Sep 30 18:10:18 compute-1 sshd-session[267976]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:10:18 compute-1 sshd-session[267976]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:10:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:18 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:18 compute-1 nova_compute[238822]: 2025-09-30 18:10:18.464 2 DEBUG nova.network.neutron [req-abd84bf9-d5f3-475e-a768-b1939363956d req-3409a416-2ce1-40ee-8839-0206b432415a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:10:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:10:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:18.546 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:10:18 compute-1 nova_compute[238822]: 2025-09-30 18:10:18.574 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:10:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:18 compute-1 nova_compute[238822]: 2025-09-30 18:10:18.635 2 DEBUG nova.network.neutron [req-abd84bf9-d5f3-475e-a768-b1939363956d req-3409a416-2ce1-40ee-8839-0206b432415a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:10:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:19.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:19 compute-1 nova_compute[238822]: 2025-09-30 18:10:19.143 2 DEBUG oslo_concurrency.lockutils [req-abd84bf9-d5f3-475e-a768-b1939363956d req-3409a416-2ce1-40ee-8839-0206b432415a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-89ea4af3-85f8-42a9-a945-3aac0c8882e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:10:19 compute-1 nova_compute[238822]: 2025-09-30 18:10:19.144 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Acquired lock "refresh_cache-89ea4af3-85f8-42a9-a945-3aac0c8882e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:10:19 compute-1 nova_compute[238822]: 2025-09-30 18:10:19.145 2 DEBUG nova.network.neutron [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:10:19 compute-1 ceph-mon[75484]: pgmap v944: 353 pgs: 353 active+clean; 109 MiB data, 221 MiB used, 40 GiB / 40 GiB avail; 434 KiB/s rd, 2.0 MiB/s wr, 162 op/s
Sep 30 18:10:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:19.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:19 compute-1 openstack_network_exporter[251957]: ERROR   18:10:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:10:19 compute-1 openstack_network_exporter[251957]: ERROR   18:10:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:10:19 compute-1 openstack_network_exporter[251957]: ERROR   18:10:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:10:19 compute-1 openstack_network_exporter[251957]: ERROR   18:10:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:10:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:10:19 compute-1 openstack_network_exporter[251957]: ERROR   18:10:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:10:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:10:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:19 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:19 compute-1 sshd-session[267976]: Failed password for invalid user wuzj from 192.210.160.141 port 46634 ssh2
Sep 30 18:10:20 compute-1 nova_compute[238822]: 2025-09-30 18:10:20.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:10:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:20 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:20 compute-1 nova_compute[238822]: 2025-09-30 18:10:20.639 2 DEBUG nova.network.neutron [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:10:20 compute-1 sshd-session[267976]: Connection closed by invalid user wuzj 192.210.160.141 port 46634 [preauth]
Sep 30 18:10:21 compute-1 nova_compute[238822]: 2025-09-30 18:10:21.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:10:21 compute-1 nova_compute[238822]: 2025-09-30 18:10:21.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:10:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:21.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:21 compute-1 ceph-mon[75484]: pgmap v945: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 213 op/s
Sep 30 18:10:21 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3352399850' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:10:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:10:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:21.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:10:21 compute-1 nova_compute[238822]: 2025-09-30 18:10:21.532 2 WARNING neutronclient.v2_0.client [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:10:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:21 compute-1 nova_compute[238822]: 2025-09-30 18:10:21.677 2 DEBUG nova.network.neutron [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Updating instance_info_cache with network_info: [{"id": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "address": "fa:16:3e:ad:c5:d7", "network": {"id": "4b8f21c3-21c3-482f-88c7-197b5bceb2ea", "bridge": "br-int", "label": "tempest-TestDataModel-239005640-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5947b7c96cd42be8502dbab4c825083", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa54fd6-b0", "ovs_interfaceid": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:10:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:21 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.186 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Releasing lock "refresh_cache-89ea4af3-85f8-42a9-a945-3aac0c8882e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.187 2 DEBUG nova.compute.manager [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Instance network_info: |[{"id": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "address": "fa:16:3e:ad:c5:d7", "network": {"id": "4b8f21c3-21c3-482f-88c7-197b5bceb2ea", "bridge": "br-int", "label": "tempest-TestDataModel-239005640-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5947b7c96cd42be8502dbab4c825083", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa54fd6-b0", "ovs_interfaceid": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.189 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Start _get_guest_xml network_info=[{"id": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "address": "fa:16:3e:ad:c5:d7", "network": {"id": "4b8f21c3-21c3-482f-88c7-197b5bceb2ea", "bridge": "br-int", "label": "tempest-TestDataModel-239005640-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5947b7c96cd42be8502dbab4c825083", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa54fd6-b0", "ovs_interfaceid": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.193 2 WARNING nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.195 2 DEBUG nova.virt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestDataModel-server-1226574517', uuid='89ea4af3-85f8-42a9-a945-3aac0c8882e9'), owner=OwnerMeta(userid='d8e62d62fa4d4959828354f71c48cd9d', username='tempest-TestDataModel-213655642-project-admin', projectid='0783e60216244dbda21696efa03e2275', projectname='tempest-TestDataModel-213655642'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "address": "fa:16:3e:ad:c5:d7", "network": {"id": "4b8f21c3-21c3-482f-88c7-197b5bceb2ea", "bridge": "br-int", "label": "tempest-TestDataModel-239005640-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5947b7c96cd42be8502dbab4c825083", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa54fd6-b0", "ovs_interfaceid": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759255822.1953337) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.202 2 DEBUG nova.virt.libvirt.host [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.203 2 DEBUG nova.virt.libvirt.host [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.210 2 DEBUG nova.virt.libvirt.host [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.210 2 DEBUG nova.virt.libvirt.host [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.211 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.211 2 DEBUG nova.virt.hardware [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.212 2 DEBUG nova.virt.hardware [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.212 2 DEBUG nova.virt.hardware [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.212 2 DEBUG nova.virt.hardware [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.213 2 DEBUG nova.virt.hardware [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.213 2 DEBUG nova.virt.hardware [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.213 2 DEBUG nova.virt.hardware [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.213 2 DEBUG nova.virt.hardware [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.214 2 DEBUG nova.virt.hardware [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.214 2 DEBUG nova.virt.hardware [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.214 2 DEBUG nova.virt.hardware [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.219 2 DEBUG nova.privsep.utils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.219 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:10:22 compute-1 sshd-session[268097]: Invalid user laravel from 103.153.190.105 port 53842
Sep 30 18:10:22 compute-1 sshd-session[268097]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:10:22 compute-1 sshd-session[268097]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:10:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:22 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:22 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:10:22 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2957699765' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.651 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.692 2 DEBUG nova.storage.rbd_utils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] rbd image 89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:10:22 compute-1 nova_compute[238822]: 2025-09-30 18:10:22.698 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:10:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:23.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:10:23 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2372102183' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.140 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.142 2 DEBUG nova.virt.libvirt.vif [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:10:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1226574517',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testdatamodel-server-1226574517',id=3,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0783e60216244dbda21696efa03e2275',ramdisk_id='',reservation_id='r-c09awd3u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-213655642',owner_user_name='tempest-TestDataModel-213655642-project-admin'},tags=TagList,task_state='spawni
ng',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:10:13Z,user_data=None,user_id='d8e62d62fa4d4959828354f71c48cd9d',uuid=89ea4af3-85f8-42a9-a945-3aac0c8882e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "address": "fa:16:3e:ad:c5:d7", "network": {"id": "4b8f21c3-21c3-482f-88c7-197b5bceb2ea", "bridge": "br-int", "label": "tempest-TestDataModel-239005640-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5947b7c96cd42be8502dbab4c825083", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa54fd6-b0", "ovs_interfaceid": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.146 2 DEBUG nova.network.os_vif_util [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Converting VIF {"id": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "address": "fa:16:3e:ad:c5:d7", "network": {"id": "4b8f21c3-21c3-482f-88c7-197b5bceb2ea", "bridge": "br-int", "label": "tempest-TestDataModel-239005640-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5947b7c96cd42be8502dbab4c825083", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa54fd6-b0", "ovs_interfaceid": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.148 2 DEBUG nova.network.os_vif_util [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:c5:d7,bridge_name='br-int',has_traffic_filtering=True,id=3fa54fd6-b0d8-4662-8a1b-ada2e5532d55,network=Network(4b8f21c3-21c3-482f-88c7-197b5bceb2ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa54fd6-b0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.151 2 DEBUG nova.objects.instance [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lazy-loading 'pci_devices' on Instance uuid 89ea4af3-85f8-42a9-a945-3aac0c8882e9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:10:23 compute-1 ceph-mon[75484]: pgmap v946: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 3.9 MiB/s wr, 210 op/s
Sep 30 18:10:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:10:23 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2957699765' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:10:23 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2372102183' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:10:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:23.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:10:23 compute-1 podman[268164]: 2025-09-30 18:10:23.578332286 +0000 UTC m=+0.110189540 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Sep 30 18:10:23 compute-1 podman[268165]: 2025-09-30 18:10:23.578780428 +0000 UTC m=+0.102489821 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 18:10:23 compute-1 podman[268163]: 2025-09-30 18:10:23.583951128 +0000 UTC m=+0.125355120 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4)
Sep 30 18:10:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.661 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:10:23 compute-1 nova_compute[238822]:   <uuid>89ea4af3-85f8-42a9-a945-3aac0c8882e9</uuid>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   <name>instance-00000003</name>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <nova:name>tempest-TestDataModel-server-1226574517</nova:name>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:10:22</nova:creationTime>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:10:23 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:10:23 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:user uuid="d8e62d62fa4d4959828354f71c48cd9d">tempest-TestDataModel-213655642-project-admin</nova:user>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:project uuid="0783e60216244dbda21696efa03e2275">tempest-TestDataModel-213655642</nova:project>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <nova:port uuid="3fa54fd6-b0d8-4662-8a1b-ada2e5532d55">
Sep 30 18:10:23 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <system>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <entry name="serial">89ea4af3-85f8-42a9-a945-3aac0c8882e9</entry>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <entry name="uuid">89ea4af3-85f8-42a9-a945-3aac0c8882e9</entry>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     </system>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   <os>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   </os>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   <features>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   </features>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk">
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       </source>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk.config">
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       </source>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:10:23 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:ad:c5:d7"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <target dev="tap3fa54fd6-b0"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/89ea4af3-85f8-42a9-a945-3aac0c8882e9/console.log" append="off"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <video>
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     </video>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:10:23 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:10:23 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:10:23 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:10:23 compute-1 nova_compute[238822]: </domain>
Sep 30 18:10:23 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.662 2 DEBUG nova.compute.manager [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Preparing to wait for external event network-vif-plugged-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.663 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Acquiring lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.663 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.664 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.665 2 DEBUG nova.virt.libvirt.vif [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:10:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1226574517',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testdatamodel-server-1226574517',id=3,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0783e60216244dbda21696efa03e2275',ramdisk_id='',reservation_id='r-c09awd3u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-213655642',owner_user_name='tempest-TestDataModel-213655642-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:10:13Z,user_data=None,user_id='d8e62d62fa4d4959828354f71c48cd9d',uuid=89ea4af3-85f8-42a9-a945-3aac0c8882e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "address": "fa:16:3e:ad:c5:d7", "network": {"id": "4b8f21c3-21c3-482f-88c7-197b5bceb2ea", "bridge": "br-int", "label": "tempest-TestDataModel-239005640-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5947b7c96cd42be8502dbab4c825083", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa54fd6-b0", "ovs_interfaceid": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.665 2 DEBUG nova.network.os_vif_util [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Converting VIF {"id": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "address": "fa:16:3e:ad:c5:d7", "network": {"id": "4b8f21c3-21c3-482f-88c7-197b5bceb2ea", "bridge": "br-int", "label": "tempest-TestDataModel-239005640-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5947b7c96cd42be8502dbab4c825083", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa54fd6-b0", "ovs_interfaceid": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.666 2 DEBUG nova.network.os_vif_util [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:c5:d7,bridge_name='br-int',has_traffic_filtering=True,id=3fa54fd6-b0d8-4662-8a1b-ada2e5532d55,network=Network(4b8f21c3-21c3-482f-88c7-197b5bceb2ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa54fd6-b0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.667 2 DEBUG os_vif [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:c5:d7,bridge_name='br-int',has_traffic_filtering=True,id=3fa54fd6-b0d8-4662-8a1b-ada2e5532d55,network=Network(4b8f21c3-21c3-482f-88c7-197b5bceb2ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa54fd6-b0') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:10:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:23 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.761 2 DEBUG ovsdbapp.backend.ovs_idl [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.762 2 DEBUG ovsdbapp.backend.ovs_idl [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.762 2 DEBUG ovsdbapp.backend.ovs_idl [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [POLLOUT] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.782 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.783 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.784 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ed473b01-398f-5bc0-b31a-d62d6c1bce09', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:23 compute-1 nova_compute[238822]: 2025-09-30 18:10:23.789 2 INFO oslo.privsep.daemon [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpsu9ml3am/privsep.sock']
Sep 30 18:10:24 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4147530626' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:10:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:24 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:24 compute-1 nova_compute[238822]: 2025-09-30 18:10:24.555 2 INFO oslo.privsep.daemon [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Spawned new privsep daemon via rootwrap
Sep 30 18:10:24 compute-1 nova_compute[238822]: 2025-09-30 18:10:24.385 797 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 18:10:24 compute-1 nova_compute[238822]: 2025-09-30 18:10:24.392 797 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 18:10:24 compute-1 nova_compute[238822]: 2025-09-30 18:10:24.394 797 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Sep 30 18:10:24 compute-1 nova_compute[238822]: 2025-09-30 18:10:24.394 797 INFO oslo.privsep.daemon [-] privsep daemon running as pid 797
Sep 30 18:10:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:24 compute-1 sshd-session[268097]: Failed password for invalid user laravel from 103.153.190.105 port 53842 ssh2
Sep 30 18:10:25 compute-1 nova_compute[238822]: 2025-09-30 18:10:25.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:25 compute-1 nova_compute[238822]: 2025-09-30 18:10:25.004 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3fa54fd6-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:10:25 compute-1 nova_compute[238822]: 2025-09-30 18:10:25.005 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3fa54fd6-b0, col_values=(('qos', UUID('dd199c9a-da62-442c-b41d-720ac0b62780')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:10:25 compute-1 nova_compute[238822]: 2025-09-30 18:10:25.006 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3fa54fd6-b0, col_values=(('external_ids', {'iface-id': '3fa54fd6-b0d8-4662-8a1b-ada2e5532d55', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ad:c5:d7', 'vm-uuid': '89ea4af3-85f8-42a9-a945-3aac0c8882e9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:10:25 compute-1 nova_compute[238822]: 2025-09-30 18:10:25.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:25 compute-1 nova_compute[238822]: 2025-09-30 18:10:25.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:10:25 compute-1 NetworkManager[45549]: <info>  [1759255825.0488] manager: (tap3fa54fd6-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Sep 30 18:10:25 compute-1 nova_compute[238822]: 2025-09-30 18:10:25.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:10:25 compute-1 nova_compute[238822]: 2025-09-30 18:10:25.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:25 compute-1 nova_compute[238822]: 2025-09-30 18:10:25.059 2 INFO os_vif [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:c5:d7,bridge_name='br-int',has_traffic_filtering=True,id=3fa54fd6-b0d8-4662-8a1b-ada2e5532d55,network=Network(4b8f21c3-21c3-482f-88c7-197b5bceb2ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa54fd6-b0')
Sep 30 18:10:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:10:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:25.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:10:25 compute-1 ceph-mon[75484]: pgmap v947: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 3.9 MiB/s wr, 211 op/s
Sep 30 18:10:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:25.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:25 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:10:26 compute-1 ceph-mon[75484]: pgmap v948: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 1.8 MiB/s rd, 1.9 MiB/s wr, 51 op/s
Sep 30 18:10:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:26 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.570 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.571 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.571 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.572 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.572 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:10:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.629 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.630 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.630 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] No VIF found with MAC fa:16:3e:ad:c5:d7, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.631 2 INFO nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Using config drive
Sep 30 18:10:26 compute-1 nova_compute[238822]: 2025-09-30 18:10:26.667 2 DEBUG nova.storage.rbd_utils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] rbd image 89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:10:27 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:10:27 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3274908834' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:10:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:27.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:27 compute-1 nova_compute[238822]: 2025-09-30 18:10:27.073 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:10:27 compute-1 sshd-session[268097]: Received disconnect from 103.153.190.105 port 53842:11: Bye Bye [preauth]
Sep 30 18:10:27 compute-1 sshd-session[268097]: Disconnected from invalid user laravel 103.153.190.105 port 53842 [preauth]
Sep 30 18:10:27 compute-1 nova_compute[238822]: 2025-09-30 18:10:27.191 2 WARNING neutronclient.v2_0.client [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:10:27 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3274908834' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:10:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:27.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:27 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:27 compute-1 nova_compute[238822]: 2025-09-30 18:10:27.751 2 INFO nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Creating config drive at /var/lib/nova/instances/89ea4af3-85f8-42a9-a945-3aac0c8882e9/disk.config
Sep 30 18:10:27 compute-1 nova_compute[238822]: 2025-09-30 18:10:27.764 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/89ea4af3-85f8-42a9-a945-3aac0c8882e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp_u05hbul execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:10:27 compute-1 nova_compute[238822]: 2025-09-30 18:10:27.921 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/89ea4af3-85f8-42a9-a945-3aac0c8882e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp_u05hbul" returned: 0 in 0.157s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:10:27 compute-1 nova_compute[238822]: 2025-09-30 18:10:27.962 2 DEBUG nova.storage.rbd_utils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] rbd image 89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:10:27 compute-1 nova_compute[238822]: 2025-09-30 18:10:27.967 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/89ea4af3-85f8-42a9-a945-3aac0c8882e9/disk.config 89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.118 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.120 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.193 2 DEBUG oslo_concurrency.processutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/89ea4af3-85f8-42a9-a945-3aac0c8882e9/disk.config 89ea4af3-85f8-42a9-a945-3aac0c8882e9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.226s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.195 2 INFO nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Deleting local config drive /var/lib/nova/instances/89ea4af3-85f8-42a9-a945-3aac0c8882e9/disk.config because it was imported into RBD.
Sep 30 18:10:28 compute-1 systemd[1]: Starting libvirt secret daemon...
Sep 30 18:10:28 compute-1 ceph-mon[75484]: pgmap v949: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 1.8 MiB/s rd, 1.9 MiB/s wr, 51 op/s
Sep 30 18:10:28 compute-1 systemd[1]: Started libvirt secret daemon.
Sep 30 18:10:28 compute-1 kernel: tun: Universal TUN/TAP device driver, 1.6
Sep 30 18:10:28 compute-1 kernel: tap3fa54fd6-b0: entered promiscuous mode
Sep 30 18:10:28 compute-1 NetworkManager[45549]: <info>  [1759255828.3808] manager: (tap3fa54fd6-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Sep 30 18:10:28 compute-1 ovn_controller[135204]: 2025-09-30T18:10:28Z|00040|binding|INFO|Claiming lport 3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 for this chassis.
Sep 30 18:10:28 compute-1 ovn_controller[135204]: 2025-09-30T18:10:28Z|00041|binding|INFO|3fa54fd6-b0d8-4662-8a1b-ada2e5532d55: Claiming fa:16:3e:ad:c5:d7 10.100.0.4
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:28.414 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:c5:d7 10.100.0.4'], port_security=['fa:16:3e:ad:c5:d7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '89ea4af3-85f8-42a9-a945-3aac0c8882e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b8f21c3-21c3-482f-88c7-197b5bceb2ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0783e60216244dbda21696efa03e2275', 'neutron:revision_number': '4', 'neutron:security_group_ids': '44055cfe-7091-4bf5-849f-a5ec90884056', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3fffd780-66a8-4f09-9e3d-aefd98ad1eb6, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=3fa54fd6-b0d8-4662-8a1b-ada2e5532d55) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:10:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:28.415 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 in datapath 4b8f21c3-21c3-482f-88c7-197b5bceb2ea bound to our chassis
Sep 30 18:10:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:28.418 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b8f21c3-21c3-482f-88c7-197b5bceb2ea
Sep 30 18:10:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:28 compute-1 systemd-udevd[268352]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:10:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:28.463 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d2f7dd19-6f51-442c-9b64-0ad65670f002]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:28.464 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b8f21c3-21 in ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:10:28 compute-1 NetworkManager[45549]: <info>  [1759255828.4684] device (tap3fa54fd6-b0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:10:28 compute-1 NetworkManager[45549]: <info>  [1759255828.4694] device (tap3fa54fd6-b0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:10:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:28.470 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b8f21c3-20 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:10:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:28.471 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6a9d3127-8ccb-401c-aec8-3509c87b54b9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:28.474 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[64588bdf-9ab6-4311-a70c-79dee929e7cc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:28 compute-1 systemd-machined[195911]: New machine qemu-1-instance-00000003.
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.481 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.483 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:10:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:28.499 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[a8c58a9c-18b2-4a25-8715-470403c0cc3e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:28 compute-1 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Sep 30 18:10:28 compute-1 ovn_controller[135204]: 2025-09-30T18:10:28Z|00042|binding|INFO|Setting lport 3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 ovn-installed in OVS
Sep 30 18:10:28 compute-1 ovn_controller[135204]: 2025-09-30T18:10:28Z|00043|binding|INFO|Setting lport 3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 up in Southbound
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.512 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.513 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4901MB free_disk=39.92593002319336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.513 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.514 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:28.519 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf47dac-722f-466e-bdca-e6d4cdbee81a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:28.521 144543 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp4lmyt1vz/privsep.sock']
Sep 30 18:10:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:10:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.787 2 DEBUG nova.compute.manager [req-94d26e15-c940-4e0e-a21a-684a8bd97e39 req-572d07c8-03f3-4a1c-b401-0813bdf30c7a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Received event network-vif-plugged-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.788 2 DEBUG oslo_concurrency.lockutils [req-94d26e15-c940-4e0e-a21a-684a8bd97e39 req-572d07c8-03f3-4a1c-b401-0813bdf30c7a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.789 2 DEBUG oslo_concurrency.lockutils [req-94d26e15-c940-4e0e-a21a-684a8bd97e39 req-572d07c8-03f3-4a1c-b401-0813bdf30c7a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.791 2 DEBUG oslo_concurrency.lockutils [req-94d26e15-c940-4e0e-a21a-684a8bd97e39 req-572d07c8-03f3-4a1c-b401-0813bdf30c7a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:28 compute-1 nova_compute[238822]: 2025-09-30 18:10:28.793 2 DEBUG nova.compute.manager [req-94d26e15-c940-4e0e-a21a-684a8bd97e39 req-572d07c8-03f3-4a1c-b401-0813bdf30c7a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Processing event network-vif-plugged-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:10:28 compute-1 sudo[268377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:10:28 compute-1 sudo[268377]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:10:28 compute-1 sudo[268377]: pam_unix(sudo:session): session closed for user root
Sep 30 18:10:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:10:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:29.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:10:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:29.315 144543 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 18:10:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:29.315 144543 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp4lmyt1vz/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Sep 30 18:10:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:29.129 268403 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 18:10:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:29.136 268403 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 18:10:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:29.140 268403 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Sep 30 18:10:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:29.140 268403 INFO oslo.privsep.daemon [-] privsep daemon running as pid 268403
Sep 30 18:10:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:29.317 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[0d637272-c359-4adb-b517-9a39503c98d5]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:29.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:29 compute-1 nova_compute[238822]: 2025-09-30 18:10:29.633 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 89ea4af3-85f8-42a9-a945-3aac0c8882e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:10:29 compute-1 nova_compute[238822]: 2025-09-30 18:10:29.634 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:10:29 compute-1 nova_compute[238822]: 2025-09-30 18:10:29.634 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:10:28 up  3:47,  0 user,  load average: 1.83, 1.61, 1.41\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_0783e60216244dbda21696efa03e2275': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:10:29 compute-1 nova_compute[238822]: 2025-09-30 18:10:29.726 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing inventories for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 18:10:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:29 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:29 compute-1 nova_compute[238822]: 2025-09-30 18:10:29.763 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating ProviderTree inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 18:10:29 compute-1 nova_compute[238822]: 2025-09-30 18:10:29.763 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 0, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 18:10:29 compute-1 nova_compute[238822]: 2025-09-30 18:10:29.783 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing aggregate associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 18:10:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:29.785 268403 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:29.785 268403 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:29.785 268403 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:29 compute-1 nova_compute[238822]: 2025-09-30 18:10:29.802 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing trait associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE41,HW_CPU_X86_ABM,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_TIS,HW_ARCH_X86_64
,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 18:10:29 compute-1 nova_compute[238822]: 2025-09-30 18:10:29.853 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:10:29 compute-1 nova_compute[238822]: 2025-09-30 18:10:29.949 2 DEBUG nova.compute.manager [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:10:29 compute-1 nova_compute[238822]: 2025-09-30 18:10:29.970 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:10:29 compute-1 nova_compute[238822]: 2025-09-30 18:10:29.978 2 INFO nova.virt.libvirt.driver [-] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Instance spawned successfully.
Sep 30 18:10:29 compute-1 nova_compute[238822]: 2025-09-30 18:10:29.979 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:10:30 compute-1 sudo[268451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:10:30 compute-1 sudo[268451]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:10:30 compute-1 sudo[268451]: pam_unix(sudo:session): session closed for user root
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:30 compute-1 sudo[268487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:10:30 compute-1 sudo[268487]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.325 268403 INFO oslo_service.backend [-] Loading backend: eventlet
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.331 268403 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Sep 30 18:10:30 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:10:30 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3126934234' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.406 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[512070b9-81e0-4d47-bcc8-babb354bacc5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.413 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f482f170-3c2a-4d7e-83bc-2ed83f282871]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:30 compute-1 systemd-udevd[268358]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:10:30 compute-1 NetworkManager[45549]: <info>  [1759255830.4147] manager: (tap4b8f21c3-20): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.430 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.577s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:10:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:30 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.441 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 39, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.459 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[6be1ad7f-7bfb-4147-a612-742b188ab55a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.463 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[065e2e4f-ad0d-4ff9-8f05-a7f0168b5e4a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.499 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.499 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.500 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.500 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.500 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.501 2 DEBUG nova.virt.libvirt.driver [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:10:30 compute-1 NetworkManager[45549]: <info>  [1759255830.5058] device (tap4b8f21c3-20): carrier: link connected
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.512 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[f568b34e-d13b-41e1-9e2e-ce4a6d3f08de]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.532 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2e0b735d-5109-41a5-9ff5-cad851b0ae74]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b8f21c3-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:39:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1367208, 'reachable_time': 30927, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268557, 'error': None, 'target': 'ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.549 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[897c1d9e-286b-4099-b07e-675ce523d01c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe77:39ed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1367208, 'tstamp': 1367208}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 268558, 'error': None, 'target': 'ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.576 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b9a45fa1-905d-467d-91d5-8cd3820bdf17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b8f21c3-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:39:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1367208, 'reachable_time': 30927, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 268559, 'error': None, 'target': 'ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.628 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0d41c58f-ccf2-456b-84eb-74aded7de561]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.718 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[bf611441-5d61-4184-a439-ea0c94cd1691]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.730 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b8f21c3-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.730 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.735 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b8f21c3-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:30 compute-1 NetworkManager[45549]: <info>  [1759255830.7424] manager: (tap4b8f21c3-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Sep 30 18:10:30 compute-1 kernel: tap4b8f21c3-20: entered promiscuous mode
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.748 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b8f21c3-20, col_values=(('external_ids', {'iface-id': '12cfcc60-6c05-4cc2-8665-8a4d689e5c1a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:30 compute-1 ovn_controller[135204]: 2025-09-30T18:10:30Z|00044|binding|INFO|Releasing lport 12cfcc60-6c05-4cc2-8665-8a4d689e5c1a from this chassis (sb_readonly=0)
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.778 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5ac39327-3840-4e0d-b417-b2ec490ab315]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.779 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b8f21c3-21c3-482f-88c7-197b5bceb2ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b8f21c3-21c3-482f-88c7-197b5bceb2ea.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.779 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b8f21c3-21c3-482f-88c7-197b5bceb2ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b8f21c3-21c3-482f-88c7-197b5bceb2ea.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.780 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 4b8f21c3-21c3-482f-88c7-197b5bceb2ea disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.780 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b8f21c3-21c3-482f-88c7-197b5bceb2ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b8f21c3-21c3-482f-88c7-197b5bceb2ea.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.780 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d00b1d6a-1d40-4936-8da8-bee1fafc1127]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.781 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b8f21c3-21c3-482f-88c7-197b5bceb2ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b8f21c3-21c3-482f-88c7-197b5bceb2ea.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.781 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[002422b1-5259-473d-b733-643c13cd3ac7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.782 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-4b8f21c3-21c3-482f-88c7-197b5bceb2ea
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/4b8f21c3-21c3-482f-88c7-197b5bceb2ea.pid.haproxy
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID 4b8f21c3-21c3-482f-88c7-197b5bceb2ea
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:10:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:30.782 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea', 'env', 'PROCESS_TAG=haproxy-4b8f21c3-21c3-482f-88c7-197b5bceb2ea', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b8f21c3-21c3-482f-88c7-197b5bceb2ea.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:10:30 compute-1 sudo[268487]: pam_unix(sudo:session): session closed for user root
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.837 2 DEBUG nova.compute.manager [req-a4d5de55-dd73-4a6d-829d-633b70db697d req-8500b57e-4a38-4fe0-a6a6-add495eb73d1 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Received event network-vif-plugged-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.838 2 DEBUG oslo_concurrency.lockutils [req-a4d5de55-dd73-4a6d-829d-633b70db697d req-8500b57e-4a38-4fe0-a6a6-add495eb73d1 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.838 2 DEBUG oslo_concurrency.lockutils [req-a4d5de55-dd73-4a6d-829d-633b70db697d req-8500b57e-4a38-4fe0-a6a6-add495eb73d1 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.838 2 DEBUG oslo_concurrency.lockutils [req-a4d5de55-dd73-4a6d-829d-633b70db697d req-8500b57e-4a38-4fe0-a6a6-add495eb73d1 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.838 2 DEBUG nova.compute.manager [req-a4d5de55-dd73-4a6d-829d-633b70db697d req-8500b57e-4a38-4fe0-a6a6-add495eb73d1 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] No waiting events found dispatching network-vif-plugged-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.838 2 WARNING nova.compute.manager [req-a4d5de55-dd73-4a6d-829d-633b70db697d req-8500b57e-4a38-4fe0-a6a6-add495eb73d1 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Received unexpected event network-vif-plugged-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 for instance with vm_state building and task_state spawning.
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.980 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updated inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 39, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.981 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 18:10:30 compute-1 nova_compute[238822]: 2025-09-30 18:10:30.981 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 18:10:31 compute-1 nova_compute[238822]: 2025-09-30 18:10:31.014 2 INFO nova.compute.manager [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Took 16.55 seconds to spawn the instance on the hypervisor.
Sep 30 18:10:31 compute-1 nova_compute[238822]: 2025-09-30 18:10:31.015 2 DEBUG nova.compute.manager [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:10:31 compute-1 ceph-mon[75484]: pgmap v950: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 1.8 MiB/s rd, 1.9 MiB/s wr, 56 op/s
Sep 30 18:10:31 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3126934234' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:10:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:10:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:10:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:10:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:10:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:10:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:10:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:10:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:31.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:31 compute-1 podman[268610]: 2025-09-30 18:10:31.291746373 +0000 UTC m=+0.089611724 container create c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:10:31 compute-1 podman[268610]: 2025-09-30 18:10:31.243427906 +0000 UTC m=+0.041293307 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:10:31 compute-1 systemd[1]: Started libpod-conmon-c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e.scope.
Sep 30 18:10:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:31.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:31 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:10:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5ccaba7f05c4e6e57c5a4dd308270f7b89b27073328414a8d1f6d2116486f1ed/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:10:31 compute-1 podman[268610]: 2025-09-30 18:10:31.418137399 +0000 UTC m=+0.216002810 container init c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 18:10:31 compute-1 podman[268610]: 2025-09-30 18:10:31.430425222 +0000 UTC m=+0.228290563 container start c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 18:10:31 compute-1 neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea[268626]: [NOTICE]   (268630) : New worker (268632) forked
Sep 30 18:10:31 compute-1 neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea[268626]: [NOTICE]   (268630) : Loading success.
Sep 30 18:10:31 compute-1 nova_compute[238822]: 2025-09-30 18:10:31.491 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:10:31 compute-1 nova_compute[238822]: 2025-09-30 18:10:31.492 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.978s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:31 compute-1 nova_compute[238822]: 2025-09-30 18:10:31.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:31 compute-1 nova_compute[238822]: 2025-09-30 18:10:31.573 2 INFO nova.compute.manager [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Took 22.39 seconds to build instance.
Sep 30 18:10:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:31 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:32 compute-1 nova_compute[238822]: 2025-09-30 18:10:32.081 2 DEBUG oslo_concurrency.lockutils [None req-396fc3b9-2a47-4202-9ce1-00201361b8e7 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.914s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:32 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:33.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:33 compute-1 ceph-mon[75484]: pgmap v951: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 3.2 KiB/s rd, 26 KiB/s wr, 5 op/s
Sep 30 18:10:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:33.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:33 compute-1 nova_compute[238822]: 2025-09-30 18:10:33.493 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:10:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:10:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:33 compute-1 sshd-session[268642]: Invalid user bpm from 14.225.167.110 port 42192
Sep 30 18:10:33 compute-1 sshd-session[268642]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:10:33 compute-1 sshd-session[268642]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:10:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:33 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:34 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:35.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:35 compute-1 ceph-mon[75484]: pgmap v952: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 27 KiB/s wr, 74 op/s
Sep 30 18:10:35 compute-1 nova_compute[238822]: 2025-09-30 18:10:35.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:35.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:35 compute-1 sshd-session[268642]: Failed password for invalid user bpm from 14.225.167.110 port 42192 ssh2
Sep 30 18:10:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:35 compute-1 podman[249638]: time="2025-09-30T18:10:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:10:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:10:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39666 "" "Go-http-client/1.1"
Sep 30 18:10:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:10:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9271 "" "Go-http-client/1.1"
Sep 30 18:10:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:35 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:36 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:36 compute-1 sshd-session[268642]: Received disconnect from 14.225.167.110 port 42192:11: Bye Bye [preauth]
Sep 30 18:10:36 compute-1 sshd-session[268642]: Disconnected from invalid user bpm 14.225.167.110 port 42192 [preauth]
Sep 30 18:10:36 compute-1 sudo[268648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:10:36 compute-1 sudo[268648]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:10:36 compute-1 sudo[268648]: pam_unix(sudo:session): session closed for user root
Sep 30 18:10:36 compute-1 nova_compute[238822]: 2025-09-30 18:10:36.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:36 compute-1 nova_compute[238822]: 2025-09-30 18:10:36.601 2 DEBUG oslo_concurrency.lockutils [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Acquiring lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:36 compute-1 nova_compute[238822]: 2025-09-30 18:10:36.602 2 DEBUG oslo_concurrency.lockutils [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:36 compute-1 nova_compute[238822]: 2025-09-30 18:10:36.603 2 DEBUG oslo_concurrency.lockutils [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Acquiring lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:36 compute-1 nova_compute[238822]: 2025-09-30 18:10:36.603 2 DEBUG oslo_concurrency.lockutils [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:36 compute-1 nova_compute[238822]: 2025-09-30 18:10:36.603 2 DEBUG oslo_concurrency.lockutils [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:36 compute-1 nova_compute[238822]: 2025-09-30 18:10:36.619 2 INFO nova.compute.manager [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Terminating instance
Sep 30 18:10:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:37.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:37 compute-1 ceph-mon[75484]: pgmap v953: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:10:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:10:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:10:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/434169990' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:10:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/434169990' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.143 2 DEBUG nova.compute.manager [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:10:37 compute-1 kernel: tap3fa54fd6-b0 (unregistering): left promiscuous mode
Sep 30 18:10:37 compute-1 NetworkManager[45549]: <info>  [1759255837.2030] device (tap3fa54fd6-b0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:37 compute-1 ovn_controller[135204]: 2025-09-30T18:10:37Z|00045|binding|INFO|Releasing lport 3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 from this chassis (sb_readonly=0)
Sep 30 18:10:37 compute-1 ovn_controller[135204]: 2025-09-30T18:10:37Z|00046|binding|INFO|Setting lport 3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 down in Southbound
Sep 30 18:10:37 compute-1 ovn_controller[135204]: 2025-09-30T18:10:37Z|00047|binding|INFO|Removing iface tap3fa54fd6-b0 ovn-installed in OVS
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.228 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:c5:d7 10.100.0.4'], port_security=['fa:16:3e:ad:c5:d7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '89ea4af3-85f8-42a9-a945-3aac0c8882e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b8f21c3-21c3-482f-88c7-197b5bceb2ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0783e60216244dbda21696efa03e2275', 'neutron:revision_number': '5', 'neutron:security_group_ids': '44055cfe-7091-4bf5-849f-a5ec90884056', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3fffd780-66a8-4f09-9e3d-aefd98ad1eb6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=3fa54fd6-b0d8-4662-8a1b-ada2e5532d55) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.229 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 in datapath 4b8f21c3-21c3-482f-88c7-197b5bceb2ea unbound from our chassis
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.230 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b8f21c3-21c3-482f-88c7-197b5bceb2ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.231 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[08b17afb-903a-48b1-b143-946125381f82]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.231 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea namespace which is not needed anymore
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:37 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Sep 30 18:10:37 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 8.743s CPU time.
Sep 30 18:10:37 compute-1 systemd-machined[195911]: Machine qemu-1-instance-00000003 terminated.
Sep 30 18:10:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:37.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:37 compute-1 neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea[268626]: [NOTICE]   (268630) : haproxy version is 3.0.5-8e879a5
Sep 30 18:10:37 compute-1 neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea[268626]: [NOTICE]   (268630) : path to executable is /usr/sbin/haproxy
Sep 30 18:10:37 compute-1 neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea[268626]: [WARNING]  (268630) : Exiting Master process...
Sep 30 18:10:37 compute-1 podman[268699]: 2025-09-30 18:10:37.370065359 +0000 UTC m=+0.029322674 container kill c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 18:10:37 compute-1 neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea[268626]: [ALERT]    (268630) : Current worker (268632) exited with code 143 (Terminated)
Sep 30 18:10:37 compute-1 neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea[268626]: [WARNING]  (268630) : All workers exited. Exiting... (0)
Sep 30 18:10:37 compute-1 systemd[1]: libpod-c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e.scope: Deactivated successfully.
Sep 30 18:10:37 compute-1 conmon[268626]: conmon c9e2ed1d3f5c0301aa2a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e.scope/container/memory.events
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.393 2 INFO nova.virt.libvirt.driver [-] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Instance destroyed successfully.
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.393 2 DEBUG nova.objects.instance [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lazy-loading 'resources' on Instance uuid 89ea4af3-85f8-42a9-a945-3aac0c8882e9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:10:37 compute-1 podman[268718]: 2025-09-30 18:10:37.427579154 +0000 UTC m=+0.033790245 container died c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 18:10:37 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e-userdata-shm.mount: Deactivated successfully.
Sep 30 18:10:37 compute-1 systemd[1]: var-lib-containers-storage-overlay-5ccaba7f05c4e6e57c5a4dd308270f7b89b27073328414a8d1f6d2116486f1ed-merged.mount: Deactivated successfully.
Sep 30 18:10:37 compute-1 podman[268718]: 2025-09-30 18:10:37.471586713 +0000 UTC m=+0.077797724 container cleanup c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 18:10:37 compute-1 systemd[1]: libpod-conmon-c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e.scope: Deactivated successfully.
Sep 30 18:10:37 compute-1 podman[268726]: 2025-09-30 18:10:37.496137817 +0000 UTC m=+0.079943702 container remove c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.516 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f0474fcc-5c81-458f-8a72-d95d1a020d15]: (4, ("Tue Sep 30 06:10:37 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea (c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e)\nc9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e\nTue Sep 30 06:10:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea (c9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e)\nc9e2ed1d3f5c0301aa2a1e786746e2f9d274f3663f18a72ee6d84503c8e7773e\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.519 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a9df9d-8437-476a-b97b-0b3ba3983040]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.520 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b8f21c3-21c3-482f-88c7-197b5bceb2ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b8f21c3-21c3-482f-88c7-197b5bceb2ea.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.521 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[10acf5e3-9811-427e-aadb-01b72bbb75c9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.522 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b8f21c3-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:37 compute-1 kernel: tap4b8f21c3-20: left promiscuous mode
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.545 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ab06df12-d2da-4041-bed7-2825abc24fdb]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.578 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7ce73c69-ddab-4ae1-b13d-fe3ba644f752]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.579 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0a6084ea-cc4a-44d1-9284-c48fe8223604]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.603 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a26a642b-e9d5-47f1-bd34-70f073a73f0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1367198, 'reachable_time': 32555, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268759, 'error': None, 'target': 'ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.612 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b8f21c3-21c3-482f-88c7-197b5bceb2ea deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:10:37 compute-1 systemd[1]: run-netns-ovnmeta\x2d4b8f21c3\x2d21c3\x2d482f\x2d88c7\x2d197b5bceb2ea.mount: Deactivated successfully.
Sep 30 18:10:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:37.614 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[2711053a-3e4f-43ae-b7f5-2862cb8a24f3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:10:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:37 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.809 2 DEBUG nova.compute.manager [req-7c19301c-c3cc-4268-8984-9b8aa6a0a4fc req-c74e3806-b9e8-4a0a-b969-493ab1336264 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Received event network-vif-unplugged-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.809 2 DEBUG oslo_concurrency.lockutils [req-7c19301c-c3cc-4268-8984-9b8aa6a0a4fc req-c74e3806-b9e8-4a0a-b969-493ab1336264 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.809 2 DEBUG oslo_concurrency.lockutils [req-7c19301c-c3cc-4268-8984-9b8aa6a0a4fc req-c74e3806-b9e8-4a0a-b969-493ab1336264 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.810 2 DEBUG oslo_concurrency.lockutils [req-7c19301c-c3cc-4268-8984-9b8aa6a0a4fc req-c74e3806-b9e8-4a0a-b969-493ab1336264 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.810 2 DEBUG nova.compute.manager [req-7c19301c-c3cc-4268-8984-9b8aa6a0a4fc req-c74e3806-b9e8-4a0a-b969-493ab1336264 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] No waiting events found dispatching network-vif-unplugged-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.810 2 DEBUG nova.compute.manager [req-7c19301c-c3cc-4268-8984-9b8aa6a0a4fc req-c74e3806-b9e8-4a0a-b969-493ab1336264 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Received event network-vif-unplugged-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.909 2 DEBUG nova.virt.libvirt.vif [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:10:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1226574517',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testdatamodel-server-1226574517',id=3,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:10:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0783e60216244dbda21696efa03e2275',ramdisk_id='',reservation_id='r-c09awd3u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_projec
t_name='tempest-TestDataModel-213655642',owner_user_name='tempest-TestDataModel-213655642-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:10:31Z,user_data=None,user_id='d8e62d62fa4d4959828354f71c48cd9d',uuid=89ea4af3-85f8-42a9-a945-3aac0c8882e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "address": "fa:16:3e:ad:c5:d7", "network": {"id": "4b8f21c3-21c3-482f-88c7-197b5bceb2ea", "bridge": "br-int", "label": "tempest-TestDataModel-239005640-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5947b7c96cd42be8502dbab4c825083", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa54fd6-b0", "ovs_interfaceid": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.910 2 DEBUG nova.network.os_vif_util [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Converting VIF {"id": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "address": "fa:16:3e:ad:c5:d7", "network": {"id": "4b8f21c3-21c3-482f-88c7-197b5bceb2ea", "bridge": "br-int", "label": "tempest-TestDataModel-239005640-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c5947b7c96cd42be8502dbab4c825083", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fa54fd6-b0", "ovs_interfaceid": "3fa54fd6-b0d8-4662-8a1b-ada2e5532d55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.911 2 DEBUG nova.network.os_vif_util [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:c5:d7,bridge_name='br-int',has_traffic_filtering=True,id=3fa54fd6-b0d8-4662-8a1b-ada2e5532d55,network=Network(4b8f21c3-21c3-482f-88c7-197b5bceb2ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa54fd6-b0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.911 2 DEBUG os_vif [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:c5:d7,bridge_name='br-int',has_traffic_filtering=True,id=3fa54fd6-b0d8-4662-8a1b-ada2e5532d55,network=Network(4b8f21c3-21c3-482f-88c7-197b5bceb2ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa54fd6-b0') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.913 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3fa54fd6-b0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.919 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=dd199c9a-da62-442c-b41d-720ac0b62780) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:37 compute-1 nova_compute[238822]: 2025-09-30 18:10:37.923 2 INFO os_vif [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:c5:d7,bridge_name='br-int',has_traffic_filtering=True,id=3fa54fd6-b0d8-4662-8a1b-ada2e5532d55,network=Network(4b8f21c3-21c3-482f-88c7-197b5bceb2ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fa54fd6-b0')
Sep 30 18:10:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:10:38 compute-1 nova_compute[238822]: 2025-09-30 18:10:38.389 2 INFO nova.virt.libvirt.driver [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Deleting instance files /var/lib/nova/instances/89ea4af3-85f8-42a9-a945-3aac0c8882e9_del
Sep 30 18:10:38 compute-1 nova_compute[238822]: 2025-09-30 18:10:38.391 2 INFO nova.virt.libvirt.driver [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Deletion of /var/lib/nova/instances/89ea4af3-85f8-42a9-a945-3aac0c8882e9_del complete
Sep 30 18:10:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:38 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:10:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:38 compute-1 nova_compute[238822]: 2025-09-30 18:10:38.914 2 INFO nova.compute.manager [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Took 1.77 seconds to destroy the instance on the hypervisor.
Sep 30 18:10:38 compute-1 nova_compute[238822]: 2025-09-30 18:10:38.915 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:10:38 compute-1 nova_compute[238822]: 2025-09-30 18:10:38.916 2 DEBUG nova.compute.manager [-] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:10:38 compute-1 nova_compute[238822]: 2025-09-30 18:10:38.917 2 DEBUG nova.network.neutron [-] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:10:38 compute-1 nova_compute[238822]: 2025-09-30 18:10:38.917 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:10:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:39.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:39 compute-1 ceph-mon[75484]: pgmap v954: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:10:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:39.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:39 compute-1 nova_compute[238822]: 2025-09-30 18:10:39.464 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:10:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:39 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:39 compute-1 nova_compute[238822]: 2025-09-30 18:10:39.884 2 DEBUG nova.compute.manager [req-5d7e93e8-e50e-4179-aa41-dcb5231c2401 req-a60fbfcd-0bbb-4be0-81c3-5bbd6fc41b77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Received event network-vif-unplugged-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:10:39 compute-1 nova_compute[238822]: 2025-09-30 18:10:39.885 2 DEBUG oslo_concurrency.lockutils [req-5d7e93e8-e50e-4179-aa41-dcb5231c2401 req-a60fbfcd-0bbb-4be0-81c3-5bbd6fc41b77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:39 compute-1 nova_compute[238822]: 2025-09-30 18:10:39.885 2 DEBUG oslo_concurrency.lockutils [req-5d7e93e8-e50e-4179-aa41-dcb5231c2401 req-a60fbfcd-0bbb-4be0-81c3-5bbd6fc41b77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:39 compute-1 nova_compute[238822]: 2025-09-30 18:10:39.886 2 DEBUG oslo_concurrency.lockutils [req-5d7e93e8-e50e-4179-aa41-dcb5231c2401 req-a60fbfcd-0bbb-4be0-81c3-5bbd6fc41b77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:39 compute-1 nova_compute[238822]: 2025-09-30 18:10:39.886 2 DEBUG nova.compute.manager [req-5d7e93e8-e50e-4179-aa41-dcb5231c2401 req-a60fbfcd-0bbb-4be0-81c3-5bbd6fc41b77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] No waiting events found dispatching network-vif-unplugged-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:10:39 compute-1 nova_compute[238822]: 2025-09-30 18:10:39.886 2 DEBUG nova.compute.manager [req-5d7e93e8-e50e-4179-aa41-dcb5231c2401 req-a60fbfcd-0bbb-4be0-81c3-5bbd6fc41b77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Received event network-vif-unplugged-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:10:39 compute-1 nova_compute[238822]: 2025-09-30 18:10:39.887 2 DEBUG nova.compute.manager [req-5d7e93e8-e50e-4179-aa41-dcb5231c2401 req-a60fbfcd-0bbb-4be0-81c3-5bbd6fc41b77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Received event network-vif-deleted-3fa54fd6-b0d8-4662-8a1b-ada2e5532d55 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:10:39 compute-1 nova_compute[238822]: 2025-09-30 18:10:39.887 2 INFO nova.compute.manager [req-5d7e93e8-e50e-4179-aa41-dcb5231c2401 req-a60fbfcd-0bbb-4be0-81c3-5bbd6fc41b77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Neutron deleted interface 3fa54fd6-b0d8-4662-8a1b-ada2e5532d55; detaching it from the instance and deleting it from the info cache
Sep 30 18:10:39 compute-1 nova_compute[238822]: 2025-09-30 18:10:39.887 2 DEBUG nova.network.neutron [req-5d7e93e8-e50e-4179-aa41-dcb5231c2401 req-a60fbfcd-0bbb-4be0-81c3-5bbd6fc41b77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:10:40 compute-1 nova_compute[238822]: 2025-09-30 18:10:40.276 2 DEBUG nova.network.neutron [-] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:10:40 compute-1 nova_compute[238822]: 2025-09-30 18:10:40.401 2 DEBUG nova.compute.manager [req-5d7e93e8-e50e-4179-aa41-dcb5231c2401 req-a60fbfcd-0bbb-4be0-81c3-5bbd6fc41b77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Detach interface failed, port_id=3fa54fd6-b0d8-4662-8a1b-ada2e5532d55, reason: Instance 89ea4af3-85f8-42a9-a945-3aac0c8882e9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:10:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:40 compute-1 nova_compute[238822]: 2025-09-30 18:10:40.782 2 INFO nova.compute.manager [-] [instance: 89ea4af3-85f8-42a9-a945-3aac0c8882e9] Took 1.87 seconds to deallocate network for instance.
Sep 30 18:10:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:41.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:41 compute-1 ceph-mon[75484]: pgmap v955: 353 pgs: 353 active+clean; 121 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 100 op/s
Sep 30 18:10:41 compute-1 nova_compute[238822]: 2025-09-30 18:10:41.312 2 DEBUG oslo_concurrency.lockutils [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:41 compute-1 nova_compute[238822]: 2025-09-30 18:10:41.312 2 DEBUG oslo_concurrency.lockutils [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:41 compute-1 unix_chkpwd[268786]: password check failed for user (root)
Sep 30 18:10:41 compute-1 sshd-session[268782]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:10:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:41.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:41 compute-1 nova_compute[238822]: 2025-09-30 18:10:41.373 2 DEBUG oslo_concurrency.processutils [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:10:41 compute-1 podman[268789]: 2025-09-30 18:10:41.539490602 +0000 UTC m=+0.076934011 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:10:41 compute-1 nova_compute[238822]: 2025-09-30 18:10:41.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:41 compute-1 podman[268788]: 2025-09-30 18:10:41.582450843 +0000 UTC m=+0.119654255 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4)
Sep 30 18:10:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:41 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:41 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:10:41 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3101236716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:10:41 compute-1 nova_compute[238822]: 2025-09-30 18:10:41.884 2 DEBUG oslo_concurrency.processutils [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:10:41 compute-1 nova_compute[238822]: 2025-09-30 18:10:41.892 2 DEBUG nova.compute.provider_tree [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:10:42 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3101236716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:10:42 compute-1 nova_compute[238822]: 2025-09-30 18:10:42.403 2 DEBUG nova.scheduler.client.report [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:10:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:42 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:42 compute-1 nova_compute[238822]: 2025-09-30 18:10:42.915 2 DEBUG oslo_concurrency.lockutils [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.602s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:42 compute-1 nova_compute[238822]: 2025-09-30 18:10:42.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:42 compute-1 nova_compute[238822]: 2025-09-30 18:10:42.942 2 INFO nova.scheduler.client.report [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Deleted allocations for instance 89ea4af3-85f8-42a9-a945-3aac0c8882e9
Sep 30 18:10:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:43.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:43 compute-1 ceph-mon[75484]: pgmap v956: 353 pgs: 353 active+clean; 121 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 2.2 KiB/s wr, 95 op/s
Sep 30 18:10:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:43.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:10:43 compute-1 sshd-session[268782]: Failed password for root from 192.210.160.141 port 38822 ssh2
Sep 30 18:10:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:43 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:43 compute-1 nova_compute[238822]: 2025-09-30 18:10:43.976 2 DEBUG oslo_concurrency.lockutils [None req-d4ba790a-5888-4fc1-90fc-17b210c15b54 d8e62d62fa4d4959828354f71c48cd9d 0783e60216244dbda21696efa03e2275 - - default default] Lock "89ea4af3-85f8-42a9-a945-3aac0c8882e9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.374s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:44 compute-1 sshd-session[268782]: Connection closed by authenticating user root 192.210.160.141 port 38822 [preauth]
Sep 30 18:10:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:44 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:45.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:45 compute-1 ceph-mon[75484]: pgmap v957: 353 pgs: 353 active+clean; 121 MiB data, 236 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 2.2 KiB/s wr, 95 op/s
Sep 30 18:10:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:45.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.601958) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255845602008, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 767, "num_deletes": 255, "total_data_size": 1359310, "memory_usage": 1393648, "flush_reason": "Manual Compaction"}
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255845611161, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 894046, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28446, "largest_seqno": 29208, "table_properties": {"data_size": 890422, "index_size": 1404, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8185, "raw_average_key_size": 18, "raw_value_size": 882990, "raw_average_value_size": 1997, "num_data_blocks": 63, "num_entries": 442, "num_filter_entries": 442, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759255797, "oldest_key_time": 1759255797, "file_creation_time": 1759255845, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 9264 microseconds, and 5753 cpu microseconds.
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.611225) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 894046 bytes OK
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.611253) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.613040) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.613064) EVENT_LOG_v1 {"time_micros": 1759255845613057, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.613128) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 1355210, prev total WAL file size 1355210, number of live WAL files 2.
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.614310) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353032' seq:72057594037927935, type:22 .. '6C6F676D00373533' seq:0, type:0; will stop at (end)
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(873KB)], [54(12MB)]
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255845614373, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 13941647, "oldest_snapshot_seqno": -1}
Sep 30 18:10:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5509 keys, 13824417 bytes, temperature: kUnknown
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255845692772, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 13824417, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13785286, "index_size": 24237, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13829, "raw_key_size": 141069, "raw_average_key_size": 25, "raw_value_size": 13683126, "raw_average_value_size": 2483, "num_data_blocks": 997, "num_entries": 5509, "num_filter_entries": 5509, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759255845, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.693141) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 13824417 bytes
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.694813) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 177.6 rd, 176.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 12.4 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(31.1) write-amplify(15.5) OK, records in: 6033, records dropped: 524 output_compression: NoCompression
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.694846) EVENT_LOG_v1 {"time_micros": 1759255845694831, "job": 32, "event": "compaction_finished", "compaction_time_micros": 78484, "compaction_time_cpu_micros": 52219, "output_level": 6, "num_output_files": 1, "total_output_size": 13824417, "num_input_records": 6033, "num_output_records": 5509, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255845695345, "job": 32, "event": "table_file_deletion", "file_number": 56}
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255845699808, "job": 32, "event": "table_file_deletion", "file_number": 54}
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.614163) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.700024) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.700034) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.700036) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.700040) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:10:45 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:10:45.700050) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:10:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:45 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:46 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:46 compute-1 nova_compute[238822]: 2025-09-30 18:10:46.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:46 compute-1 ceph-mon[75484]: pgmap v958: 353 pgs: 353 active+clean; 121 MiB data, 236 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Sep 30 18:10:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:47.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:47.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:47 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:47 compute-1 nova_compute[238822]: 2025-09-30 18:10:47.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:48 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:48 compute-1 podman[268862]: 2025-09-30 18:10:48.538484655 +0000 UTC m=+0.074273169 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Sep 30 18:10:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:10:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:48 compute-1 sudo[268883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:10:48 compute-1 sudo[268883]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:10:48 compute-1 sudo[268883]: pam_unix(sudo:session): session closed for user root
Sep 30 18:10:49 compute-1 ceph-mon[75484]: pgmap v959: 353 pgs: 353 active+clean; 121 MiB data, 236 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Sep 30 18:10:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:49.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:49.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:49 compute-1 openstack_network_exporter[251957]: ERROR   18:10:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:10:49 compute-1 openstack_network_exporter[251957]: ERROR   18:10:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:10:49 compute-1 openstack_network_exporter[251957]: ERROR   18:10:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:10:49 compute-1 openstack_network_exporter[251957]: ERROR   18:10:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:10:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:10:49 compute-1 openstack_network_exporter[251957]: ERROR   18:10:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:10:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:10:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:49 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:50 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2722150778' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:10:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:50 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:51 compute-1 ceph-mon[75484]: pgmap v960: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 4.7 KiB/s wr, 54 op/s
Sep 30 18:10:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.003000081s ======
Sep 30 18:10:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:51.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000081s
Sep 30 18:10:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:10:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:51.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:10:51 compute-1 nova_compute[238822]: 2025-09-30 18:10:51.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:51 compute-1 sshd-session[268909]: Invalid user k8s from 216.10.242.161 port 51070
Sep 30 18:10:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:51 compute-1 sshd-session[268909]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:10:51 compute-1 sshd-session[268909]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:10:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:51 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009d80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:52 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:52 compute-1 nova_compute[238822]: 2025-09-30 18:10:52.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:53 compute-1 ceph-mon[75484]: pgmap v961: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 3.5 KiB/s wr, 28 op/s
Sep 30 18:10:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:10:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:53.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:53.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:10:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:53 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:53 compute-1 sshd-session[268909]: Failed password for invalid user k8s from 216.10.242.161 port 51070 ssh2
Sep 30 18:10:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:54.350 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:10:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:54.350 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:10:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:10:54.350 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:10:54 compute-1 sshd-session[268909]: Received disconnect from 216.10.242.161 port 51070:11: Bye Bye [preauth]
Sep 30 18:10:54 compute-1 sshd-session[268909]: Disconnected from invalid user k8s 216.10.242.161 port 51070 [preauth]
Sep 30 18:10:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:54 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:54 compute-1 podman[268916]: 2025-09-30 18:10:54.548705671 +0000 UTC m=+0.089809749 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0)
Sep 30 18:10:54 compute-1 podman[268917]: 2025-09-30 18:10:54.569969416 +0000 UTC m=+0.095857732 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350)
Sep 30 18:10:54 compute-1 podman[268918]: 2025-09-30 18:10:54.579586996 +0000 UTC m=+0.099196832 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, managed_by=edpm_ansible)
Sep 30 18:10:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:55.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:55 compute-1 ceph-mon[75484]: pgmap v962: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 3.5 KiB/s wr, 28 op/s
Sep 30 18:10:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:55.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:55 compute-1 nova_compute[238822]: 2025-09-30 18:10:55.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:55 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009d80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:55 compute-1 unix_chkpwd[268979]: password check failed for user (root)
Sep 30 18:10:55 compute-1 sshd-session[268977]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:10:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:56 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:56 compute-1 nova_compute[238822]: 2025-09-30 18:10:56.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:57.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:57 compute-1 ceph-mon[75484]: pgmap v963: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 3.5 KiB/s wr, 28 op/s
Sep 30 18:10:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:57.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:57 compute-1 sshd-session[268977]: Failed password for root from 175.126.165.170 port 36694 ssh2
Sep 30 18:10:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:57 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:57 compute-1 nova_compute[238822]: 2025-09-30 18:10:57.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:10:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/833487038' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:10:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/833487038' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:10:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:58 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:10:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:10:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:58 compute-1 sshd-session[268977]: Received disconnect from 175.126.165.170 port 36694:11: Bye Bye [preauth]
Sep 30 18:10:58 compute-1 sshd-session[268977]: Disconnected from authenticating user root 175.126.165.170 port 36694 [preauth]
Sep 30 18:10:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:10:59.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:59 compute-1 ceph-mon[75484]: pgmap v964: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 3.5 KiB/s wr, 28 op/s
Sep 30 18:10:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:10:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:10:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:10:59.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:10:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:10:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:10:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:10:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:10:59 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009d80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:00 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:00 compute-1 unix_chkpwd[268988]: password check failed for user (root)
Sep 30 18:11:00 compute-1 sshd-session[268984]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65  user=root
Sep 30 18:11:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:01.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:01 compute-1 ceph-mon[75484]: pgmap v965: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 3.5 KiB/s wr, 28 op/s
Sep 30 18:11:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:11:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:01.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:11:01 compute-1 nova_compute[238822]: 2025-09-30 18:11:01.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:01 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:02 compute-1 sshd-session[268984]: Failed password for root from 194.107.115.65 port 31242 ssh2
Sep 30 18:11:02 compute-1 ceph-mon[75484]: pgmap v966: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:02 compute-1 sshd-session[268984]: Received disconnect from 194.107.115.65 port 31242:11: Bye Bye [preauth]
Sep 30 18:11:02 compute-1 sshd-session[268984]: Disconnected from authenticating user root 194.107.115.65 port 31242 [preauth]
Sep 30 18:11:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:02 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:02 compute-1 sshd-session[268994]: Invalid user aman from 107.172.146.104 port 42086
Sep 30 18:11:02 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:02.791 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:11:02 compute-1 nova_compute[238822]: 2025-09-30 18:11:02.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:02 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:02.792 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:11:02 compute-1 sshd-session[268994]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:11:02 compute-1 sshd-session[268994]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 18:11:02 compute-1 nova_compute[238822]: 2025-09-30 18:11:02.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:03.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:03.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:11:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:03 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:04 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:04 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:04.794 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:11:05 compute-1 ceph-mon[75484]: pgmap v967: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:11:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:05.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:05 compute-1 sshd-session[268994]: Failed password for invalid user aman from 107.172.146.104 port 42086 ssh2
Sep 30 18:11:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:05.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:05 compute-1 podman[249638]: time="2025-09-30T18:11:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:11:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:11:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38442 "" "Go-http-client/1.1"
Sep 30 18:11:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:11:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8795 "" "Go-http-client/1.1"
Sep 30 18:11:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:05 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:06 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:06 compute-1 unix_chkpwd[269004]: password check failed for user (root)
Sep 30 18:11:06 compute-1 sshd-session[268999]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:11:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:06 compute-1 nova_compute[238822]: 2025-09-30 18:11:06.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:07 compute-1 ceph-mon[75484]: pgmap v968: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:07.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:07 compute-1 sshd-session[268994]: Received disconnect from 107.172.146.104 port 42086:11: Bye Bye [preauth]
Sep 30 18:11:07 compute-1 sshd-session[268994]: Disconnected from invalid user aman 107.172.146.104 port 42086 [preauth]
Sep 30 18:11:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:07.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:07 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0940016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:07 compute-1 nova_compute[238822]: 2025-09-30 18:11:07.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:11:08 compute-1 sshd-session[268999]: Failed password for root from 192.210.160.141 port 47730 ssh2
Sep 30 18:11:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:08 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8001220 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:11:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:09 compute-1 sudo[269008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:11:09 compute-1 sudo[269008]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:11:09 compute-1 sudo[269008]: pam_unix(sudo:session): session closed for user root
Sep 30 18:11:09 compute-1 ceph-mon[75484]: pgmap v969: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:09.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:09.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:09.467 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:07:7b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '250d452565a2459c8481b499c0227183', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12c18a77-b252-4a3e-a181-b42644879446, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3a8ea0a0-c179-4516-9404-04b68a17e79e) old=Port_Binding(mac=['fa:16:3e:18:07:7b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '250d452565a2459c8481b499c0227183', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:11:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:09.469 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3a8ea0a0-c179-4516-9404-04b68a17e79e in datapath 5fff1904-159a-4b76-8c46-feabf17f29ab updated
Sep 30 18:11:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:09.470 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5fff1904-159a-4b76-8c46-feabf17f29ab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:11:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:09.472 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa5fdba-ed22-4376-a7be-35325c83a5a5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:11:09 compute-1 sshd-session[268999]: Connection closed by authenticating user root 192.210.160.141 port 47730 [preauth]
Sep 30 18:11:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:09 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:10 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:11 compute-1 ceph-mon[75484]: pgmap v970: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:11:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:11.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:11.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:11 compute-1 nova_compute[238822]: 2025-09-30 18:11:11.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:11 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0940016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:12 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80021b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:12 compute-1 podman[269037]: 2025-09-30 18:11:12.591013902 +0000 UTC m=+0.124042555 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:11:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:12 compute-1 podman[269036]: 2025-09-30 18:11:12.639070241 +0000 UTC m=+0.174330364 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 18:11:12 compute-1 nova_compute[238822]: 2025-09-30 18:11:12.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:13.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:13 compute-1 ceph-mon[75484]: pgmap v971: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:13.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:11:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:13 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:14 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:15.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:15 compute-1 ceph-mon[75484]: pgmap v972: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:11:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:15.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:15 compute-1 sshd-session[269002]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:11:15 compute-1 sshd-session[269002]: banner exchange: Connection from 113.249.93.94 port 17782: Connection timed out
Sep 30 18:11:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:15 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0940016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:16.075 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:66:01 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d56b2601-7f56-4c9d-9a6f-73b6bc9a0f86', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d56b2601-7f56-4c9d-9a6f-73b6bc9a0f86', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddd1f985d8b64b449c79d55b0cbd6422', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f77bdba-b48a-4510-90bd-ec07e6ccf8ba, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fa4f1b47-85c6-4625-86a5-85ea3743d11f) old=Port_Binding(mac=['fa:16:3e:18:66:01'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d56b2601-7f56-4c9d-9a6f-73b6bc9a0f86', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d56b2601-7f56-4c9d-9a6f-73b6bc9a0f86', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddd1f985d8b64b449c79d55b0cbd6422', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:11:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:16.077 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fa4f1b47-85c6-4625-86a5-85ea3743d11f in datapath d56b2601-7f56-4c9d-9a6f-73b6bc9a0f86 updated
Sep 30 18:11:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:16.078 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d56b2601-7f56-4c9d-9a6f-73b6bc9a0f86, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:11:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:16.079 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a03864cf-37e0-4c7d-a425-19ddfab52b0c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:11:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:16 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80021b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:16 compute-1 nova_compute[238822]: 2025-09-30 18:11:16.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:17.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:17 compute-1 ceph-mon[75484]: pgmap v973: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:17.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:17 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80021b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:17 compute-1 nova_compute[238822]: 2025-09-30 18:11:17.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:18 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:11:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:19.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:19 compute-1 ceph-mon[75484]: pgmap v974: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:19 compute-1 openstack_network_exporter[251957]: ERROR   18:11:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:11:19 compute-1 openstack_network_exporter[251957]: ERROR   18:11:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:11:19 compute-1 openstack_network_exporter[251957]: ERROR   18:11:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:11:19 compute-1 openstack_network_exporter[251957]: ERROR   18:11:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:11:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:11:19 compute-1 openstack_network_exporter[251957]: ERROR   18:11:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:11:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:11:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:19.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:19 compute-1 podman[269091]: 2025-09-30 18:11:19.554594078 +0000 UTC m=+0.087232329 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 18:11:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:19 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:20 compute-1 nova_compute[238822]: 2025-09-30 18:11:20.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:11:20 compute-1 nova_compute[238822]: 2025-09-30 18:11:20.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:11:20 compute-1 ceph-mon[75484]: pgmap v975: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:11:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:20 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80021b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:21.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:21 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1035571615' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:11:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:21.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:21 compute-1 nova_compute[238822]: 2025-09-30 18:11:21.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:21 compute-1 sshd-session[269111]: Invalid user solr from 84.51.43.58 port 36630
Sep 30 18:11:21 compute-1 sshd-session[269111]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:11:21 compute-1 sshd-session[269111]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:11:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:21 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:22 compute-1 nova_compute[238822]: 2025-09-30 18:11:22.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:11:22 compute-1 nova_compute[238822]: 2025-09-30 18:11:22.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:11:22 compute-1 ceph-mon[75484]: pgmap v976: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:11:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:22 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:22 compute-1 nova_compute[238822]: 2025-09-30 18:11:22.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:23.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:23 compute-1 sshd[170789]: drop connection #2 from [14.103.129.43]:47920 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 18:11:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:23.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:11:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:23 compute-1 sshd-session[269111]: Failed password for invalid user solr from 84.51.43.58 port 36630 ssh2
Sep 30 18:11:23 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1691222335' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:11:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:23 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:24 compute-1 nova_compute[238822]: 2025-09-30 18:11:24.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:11:24 compute-1 sshd-session[269111]: Received disconnect from 84.51.43.58 port 36630:11: Bye Bye [preauth]
Sep 30 18:11:24 compute-1 sshd-session[269111]: Disconnected from invalid user solr 84.51.43.58 port 36630 [preauth]
Sep 30 18:11:24 compute-1 unix_chkpwd[269119]: password check failed for user (root)
Sep 30 18:11:24 compute-1 sshd-session[269116]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167  user=root
Sep 30 18:11:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:24 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:24 compute-1 ceph-mon[75484]: pgmap v977: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:11:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:25.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:25 compute-1 sshd-session[269116]: Failed password for root from 167.172.43.167 port 33564 ssh2
Sep 30 18:11:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:25.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:25 compute-1 sshd-session[269116]: Received disconnect from 167.172.43.167 port 33564:11: Bye Bye [preauth]
Sep 30 18:11:25 compute-1 sshd-session[269116]: Disconnected from authenticating user root 167.172.43.167 port 33564 [preauth]
Sep 30 18:11:25 compute-1 podman[269122]: 2025-09-30 18:11:25.573173264 +0000 UTC m=+0.102815805 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, io.openshift.expose-services=, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc.)
Sep 30 18:11:25 compute-1 podman[269121]: 2025-09-30 18:11:25.592525937 +0000 UTC m=+0.127315518 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Sep 30 18:11:25 compute-1 podman[269123]: 2025-09-30 18:11:25.593286528 +0000 UTC m=+0.124642996 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 18:11:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:25 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:26 compute-1 nova_compute[238822]: 2025-09-30 18:11:26.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:11:26 compute-1 nova_compute[238822]: 2025-09-30 18:11:26.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:11:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:26 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003f70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:26 compute-1 nova_compute[238822]: 2025-09-30 18:11:26.571 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:11:26 compute-1 nova_compute[238822]: 2025-09-30 18:11:26.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:11:26 compute-1 nova_compute[238822]: 2025-09-30 18:11:26.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:11:26 compute-1 nova_compute[238822]: 2025-09-30 18:11:26.573 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:11:26 compute-1 nova_compute[238822]: 2025-09-30 18:11:26.573 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:11:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:26 compute-1 nova_compute[238822]: 2025-09-30 18:11:26.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:27 compute-1 ceph-mon[75484]: pgmap v978: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:27 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:11:27 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/561503774' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:11:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:27.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:27 compute-1 nova_compute[238822]: 2025-09-30 18:11:27.159 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.586s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:11:27 compute-1 nova_compute[238822]: 2025-09-30 18:11:27.364 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:11:27 compute-1 nova_compute[238822]: 2025-09-30 18:11:27.366 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:11:27 compute-1 nova_compute[238822]: 2025-09-30 18:11:27.394 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:11:27 compute-1 nova_compute[238822]: 2025-09-30 18:11:27.396 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4896MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:11:27 compute-1 nova_compute[238822]: 2025-09-30 18:11:27.396 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:11:27 compute-1 nova_compute[238822]: 2025-09-30 18:11:27.397 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:11:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:27.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:27 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:27 compute-1 nova_compute[238822]: 2025-09-30 18:11:27.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:28 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/561503774' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:11:28 compute-1 nova_compute[238822]: 2025-09-30 18:11:28.444 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:11:28 compute-1 nova_compute[238822]: 2025-09-30 18:11:28.445 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:11:27 up  3:48,  0 user,  load average: 0.87, 1.40, 1.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:11:28 compute-1 nova_compute[238822]: 2025-09-30 18:11:28.457 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:11:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:11:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:11:28 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/519664168' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:11:28 compute-1 nova_compute[238822]: 2025-09-30 18:11:28.955 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:11:28 compute-1 nova_compute[238822]: 2025-09-30 18:11:28.964 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:11:29 compute-1 ceph-mon[75484]: pgmap v979: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:29 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/519664168' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:11:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:29.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:29 compute-1 sudo[269233]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:11:29 compute-1 sudo[269233]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:11:29 compute-1 sudo[269233]: pam_unix(sudo:session): session closed for user root
Sep 30 18:11:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:29.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:29 compute-1 nova_compute[238822]: 2025-09-30 18:11:29.475 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:11:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:29 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:29 compute-1 nova_compute[238822]: 2025-09-30 18:11:29.986 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:11:29 compute-1 nova_compute[238822]: 2025-09-30 18:11:29.987 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.590s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:11:30 compute-1 sshd-session[269208]: Invalid user lab from 192.210.160.141 port 54768
Sep 30 18:11:30 compute-1 sshd-session[269208]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:11:30 compute-1 sshd-session[269208]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:11:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:30 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:30 compute-1 nova_compute[238822]: 2025-09-30 18:11:30.984 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:11:30 compute-1 nova_compute[238822]: 2025-09-30 18:11:30.984 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:11:30 compute-1 nova_compute[238822]: 2025-09-30 18:11:30.985 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:11:31 compute-1 ceph-mon[75484]: pgmap v980: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:11:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:31.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:31.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:31 compute-1 ovn_controller[135204]: 2025-09-30T18:11:31Z|00048|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Sep 30 18:11:31 compute-1 nova_compute[238822]: 2025-09-30 18:11:31.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:31 compute-1 sshd-session[269208]: Failed password for invalid user lab from 192.210.160.141 port 54768 ssh2
Sep 30 18:11:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:31 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:32 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:32 compute-1 sshd-session[269208]: Connection closed by invalid user lab 192.210.160.141 port 54768 [preauth]
Sep 30 18:11:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:32 compute-1 nova_compute[238822]: 2025-09-30 18:11:32.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:33 compute-1 ceph-mon[75484]: pgmap v981: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:33.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:33.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:11:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:33 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:34 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:35 compute-1 ceph-mon[75484]: pgmap v982: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:11:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:35.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:35.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:35 compute-1 podman[249638]: time="2025-09-30T18:11:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:11:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:11:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38442 "" "Go-http-client/1.1"
Sep 30 18:11:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:11:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8808 "" "Go-http-client/1.1"
Sep 30 18:11:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:35 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:36 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:36 compute-1 nova_compute[238822]: 2025-09-30 18:11:36.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:36 compute-1 sudo[269267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:11:36 compute-1 sudo[269267]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:11:36 compute-1 sudo[269267]: pam_unix(sudo:session): session closed for user root
Sep 30 18:11:36 compute-1 sudo[269292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 18:11:36 compute-1 sudo[269292]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:11:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:37.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:37 compute-1 ceph-mon[75484]: pgmap v983: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1090999453' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:11:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1090999453' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:11:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:11:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:37.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:11:37 compute-1 podman[269392]: 2025-09-30 18:11:37.480842957 +0000 UTC m=+0.090747078 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Sep 30 18:11:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:37 compute-1 podman[269392]: 2025-09-30 18:11:37.647093239 +0000 UTC m=+0.256997330 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Sep 30 18:11:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:37 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:37 compute-1 nova_compute[238822]: 2025-09-30 18:11:37.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:11:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:11:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:11:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Sep 30 18:11:38 compute-1 podman[269513]: 2025-09-30 18:11:38.381152934 +0000 UTC m=+0.104826310 container exec 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 18:11:38 compute-1 podman[269513]: 2025-09-30 18:11:38.398606356 +0000 UTC m=+0.122279682 container exec_died 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 18:11:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:38 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:11:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:38 compute-1 podman[269603]: 2025-09-30 18:11:38.930856368 +0000 UTC m=+0.068628069 container exec 80d1f8374266f2928b4c61649ec796d24345aff1c2e4693c6168e3504f66733b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 18:11:38 compute-1 podman[269603]: 2025-09-30 18:11:38.947343884 +0000 UTC m=+0.085115585 container exec_died 80d1f8374266f2928b4c61649ec796d24345aff1c2e4693c6168e3504f66733b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 18:11:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:39.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:39 compute-1 ceph-mon[75484]: pgmap v984: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:39 compute-1 podman[269667]: 2025-09-30 18:11:39.197390285 +0000 UTC m=+0.073383678 container exec 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 18:11:39 compute-1 podman[269667]: 2025-09-30 18:11:39.212098283 +0000 UTC m=+0.088091676 container exec_died 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 18:11:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:39.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:39 compute-1 podman[269735]: 2025-09-30 18:11:39.48195596 +0000 UTC m=+0.071306602 container exec 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, io.openshift.expose-services=, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived)
Sep 30 18:11:39 compute-1 podman[269735]: 2025-09-30 18:11:39.496081312 +0000 UTC m=+0.085431924 container exec_died 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, release=1793, io.openshift.expose-services=, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public)
Sep 30 18:11:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:39 compute-1 sudo[269292]: pam_unix(sudo:session): session closed for user root
Sep 30 18:11:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:39 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:39 compute-1 sudo[269804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:11:39 compute-1 sudo[269804]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:11:39 compute-1 sudo[269804]: pam_unix(sudo:session): session closed for user root
Sep 30 18:11:39 compute-1 sudo[269829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:11:39 compute-1 sudo[269829]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:11:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:40 compute-1 sudo[269829]: pam_unix(sudo:session): session closed for user root
Sep 30 18:11:40 compute-1 sudo[269889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:11:40 compute-1 sudo[269889]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:11:40 compute-1 sudo[269889]: pam_unix(sudo:session): session closed for user root
Sep 30 18:11:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:11:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:11:40 compute-1 ceph-mon[75484]: pgmap v985: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:11:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/25730032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:11:40 compute-1 sudo[269914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Sep 30 18:11:40 compute-1 sudo[269914]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:11:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:41.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:41 compute-1 sudo[269914]: pam_unix(sudo:session): session closed for user root
Sep 30 18:11:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:41.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:41 compute-1 nova_compute[238822]: 2025-09-30 18:11:41.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:41 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:11:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:11:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Sep 30 18:11:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:11:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:11:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:11:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:11:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:11:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:11:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:11:42 compute-1 ceph-mon[75484]: pgmap v986: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:42 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:42 compute-1 nova_compute[238822]: 2025-09-30 18:11:42.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:43.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:43.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:43 compute-1 podman[269962]: 2025-09-30 18:11:43.54768439 +0000 UTC m=+0.078296641 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:11:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:11:43 compute-1 podman[269961]: 2025-09-30 18:11:43.563478028 +0000 UTC m=+0.097693506 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 18:11:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:43 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:43 compute-1 sshd-session[269958]: Invalid user vastbase from 14.225.167.110 port 57394
Sep 30 18:11:43 compute-1 sshd-session[269958]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:11:43 compute-1 sshd-session[269958]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:11:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:44 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:45 compute-1 ceph-mon[75484]: pgmap v987: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:11:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:11:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:45.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:11:45 compute-1 sshd-session[269958]: Failed password for invalid user vastbase from 14.225.167.110 port 57394 ssh2
Sep 30 18:11:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:45.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:45 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0002e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:46 compute-1 sshd-session[269958]: Received disconnect from 14.225.167.110 port 57394:11: Bye Bye [preauth]
Sep 30 18:11:46 compute-1 sshd-session[269958]: Disconnected from invalid user vastbase 14.225.167.110 port 57394 [preauth]
Sep 30 18:11:46 compute-1 sudo[270015]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:11:46 compute-1 sudo[270015]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:11:46 compute-1 sudo[270015]: pam_unix(sudo:session): session closed for user root
Sep 30 18:11:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:46 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:46 compute-1 nova_compute[238822]: 2025-09-30 18:11:46.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:47.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:47 compute-1 ceph-mon[75484]: pgmap v988: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:47 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:11:47 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:11:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:47.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:47 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:47 compute-1 nova_compute[238822]: 2025-09-30 18:11:47.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:48 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:11:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:49.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:49 compute-1 ceph-mon[75484]: pgmap v989: 353 pgs: 353 active+clean; 41 MiB data, 191 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:11:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3006239968' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:11:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/470287802' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:11:49 compute-1 sudo[270043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:11:49 compute-1 sudo[270043]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:11:49 compute-1 sudo[270043]: pam_unix(sudo:session): session closed for user root
Sep 30 18:11:49 compute-1 openstack_network_exporter[251957]: ERROR   18:11:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:11:49 compute-1 openstack_network_exporter[251957]: ERROR   18:11:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:11:49 compute-1 openstack_network_exporter[251957]: ERROR   18:11:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:11:49 compute-1 openstack_network_exporter[251957]: ERROR   18:11:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:11:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:11:49 compute-1 openstack_network_exporter[251957]: ERROR   18:11:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:11:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:11:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:49.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:49 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0002e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:50 compute-1 ceph-mon[75484]: pgmap v990: 353 pgs: 353 active+clean; 88 MiB data, 209 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:11:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:50 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980040a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:50 compute-1 podman[270069]: 2025-09-30 18:11:50.553563186 +0000 UTC m=+0.090489221 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:11:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:51.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:51.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:51 compute-1 nova_compute[238822]: 2025-09-30 18:11:51.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:51 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:52 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:52 compute-1 nova_compute[238822]: 2025-09-30 18:11:52.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:53 compute-1 ceph-mon[75484]: pgmap v991: 353 pgs: 353 active+clean; 88 MiB data, 209 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:11:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:11:53 compute-1 unix_chkpwd[270093]: password check failed for user (root)
Sep 30 18:11:53 compute-1 sshd-session[270089]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:11:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:53.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:53.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:11:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:53 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0002e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:53 compute-1 sshd-session[270094]: Invalid user solana from 45.148.10.240 port 54840
Sep 30 18:11:54 compute-1 sshd-session[270094]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:11:54 compute-1 sshd-session[270094]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.148.10.240
Sep 30 18:11:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:54.351 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:11:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:54.353 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:11:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:11:54.353 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:11:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:54 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980040c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:54 compute-1 sshd-session[270089]: Failed password for root from 192.210.160.141 port 46686 ssh2
Sep 30 18:11:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:54 compute-1 unix_chkpwd[270101]: password check failed for user (root)
Sep 30 18:11:54 compute-1 sshd-session[270096]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161  user=root
Sep 30 18:11:55 compute-1 ceph-mon[75484]: pgmap v992: 353 pgs: 353 active+clean; 88 MiB data, 209 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Sep 30 18:11:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:55.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:55.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:55 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:56 compute-1 sshd-session[270094]: Failed password for invalid user solana from 45.148.10.240 port 54840 ssh2
Sep 30 18:11:56 compute-1 sshd-session[270089]: Connection closed by authenticating user root 192.210.160.141 port 46686 [preauth]
Sep 30 18:11:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:56 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:56 compute-1 podman[270105]: 2025-09-30 18:11:56.541216623 +0000 UTC m=+0.070127099 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Sep 30 18:11:56 compute-1 podman[270104]: 2025-09-30 18:11:56.549678662 +0000 UTC m=+0.088349043 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Sep 30 18:11:56 compute-1 podman[270103]: 2025-09-30 18:11:56.553918007 +0000 UTC m=+0.090251114 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4)
Sep 30 18:11:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:56 compute-1 nova_compute[238822]: 2025-09-30 18:11:56.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:56 compute-1 sshd-session[270096]: Failed password for root from 216.10.242.161 port 43934 ssh2
Sep 30 18:11:57 compute-1 ceph-mon[75484]: pgmap v993: 353 pgs: 353 active+clean; 88 MiB data, 209 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Sep 30 18:11:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:57.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:57.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:57 compute-1 sshd-session[270096]: Received disconnect from 216.10.242.161 port 43934:11: Bye Bye [preauth]
Sep 30 18:11:57 compute-1 sshd-session[270096]: Disconnected from authenticating user root 216.10.242.161 port 43934 [preauth]
Sep 30 18:11:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:57 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0002e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:57 compute-1 nova_compute[238822]: 2025-09-30 18:11:57.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:11:58 compute-1 sshd-session[270094]: Connection closed by invalid user solana 45.148.10.240 port 54840 [preauth]
Sep 30 18:11:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/564030798' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:11:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/564030798' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:11:58 compute-1 sshd-session[270166]: Invalid user ubuntu from 107.172.146.104 port 50732
Sep 30 18:11:58 compute-1 sshd-session[270166]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:11:58 compute-1 sshd-session[270166]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 18:11:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:58 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980040e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:11:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:11:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:11:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:11:59.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:11:59 compute-1 ceph-mon[75484]: pgmap v994: 353 pgs: 353 active+clean; 88 MiB data, 209 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Sep 30 18:11:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:11:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:11:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:11:59.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:11:59 compute-1 nova_compute[238822]: 2025-09-30 18:11:59.589 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "dadc55d4-1578-4dc1-880a-08098fba63ea" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:11:59 compute-1 nova_compute[238822]: 2025-09-30 18:11:59.590 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:11:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:11:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:11:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:11:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:11:59 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:00 compute-1 nova_compute[238822]: 2025-09-30 18:12:00.097 2 DEBUG nova.compute.manager [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:12:00 compute-1 ceph-mon[75484]: pgmap v995: 353 pgs: 353 active+clean; 88 MiB data, 209 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Sep 30 18:12:00 compute-1 sshd-session[270166]: Failed password for invalid user ubuntu from 107.172.146.104 port 50732 ssh2
Sep 30 18:12:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:00 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:00 compute-1 nova_compute[238822]: 2025-09-30 18:12:00.649 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:12:00 compute-1 nova_compute[238822]: 2025-09-30 18:12:00.650 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:12:00 compute-1 nova_compute[238822]: 2025-09-30 18:12:00.659 2 DEBUG nova.virt.hardware [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:12:00 compute-1 nova_compute[238822]: 2025-09-30 18:12:00.660 2 INFO nova.compute.claims [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:12:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:01.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:01.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:01 compute-1 nova_compute[238822]: 2025-09-30 18:12:01.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:01 compute-1 nova_compute[238822]: 2025-09-30 18:12:01.707 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:12:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:01 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0002e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:01 compute-1 sshd-session[270166]: Received disconnect from 107.172.146.104 port 50732:11: Bye Bye [preauth]
Sep 30 18:12:01 compute-1 sshd-session[270166]: Disconnected from invalid user ubuntu 107.172.146.104 port 50732 [preauth]
Sep 30 18:12:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:12:02 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/95796165' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:12:02 compute-1 nova_compute[238822]: 2025-09-30 18:12:02.201 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:12:02 compute-1 nova_compute[238822]: 2025-09-30 18:12:02.209 2 DEBUG nova.compute.provider_tree [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:12:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:02 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004100 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:02 compute-1 nova_compute[238822]: 2025-09-30 18:12:02.719 2 DEBUG nova.scheduler.client.report [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:12:02 compute-1 nova_compute[238822]: 2025-09-30 18:12:02.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:03 compute-1 ceph-mon[75484]: pgmap v996: 353 pgs: 353 active+clean; 88 MiB data, 209 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:12:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/95796165' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:12:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:03.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:03 compute-1 nova_compute[238822]: 2025-09-30 18:12:03.231 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.582s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:12:03 compute-1 nova_compute[238822]: 2025-09-30 18:12:03.233 2 DEBUG nova.compute.manager [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:12:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:03.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:03 compute-1 unix_chkpwd[270197]: password check failed for user (root)
Sep 30 18:12:03 compute-1 sshd-session[270194]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:12:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:12:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:03 compute-1 nova_compute[238822]: 2025-09-30 18:12:03.745 2 DEBUG nova.compute.manager [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:12:03 compute-1 nova_compute[238822]: 2025-09-30 18:12:03.746 2 DEBUG nova.network.neutron [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:12:03 compute-1 nova_compute[238822]: 2025-09-30 18:12:03.746 2 WARNING neutronclient.v2_0.client [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:12:03 compute-1 nova_compute[238822]: 2025-09-30 18:12:03.747 2 WARNING neutronclient.v2_0.client [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:12:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:03 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:04 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:04.141 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:12:04 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:04.142 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:12:04 compute-1 nova_compute[238822]: 2025-09-30 18:12:04.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:04 compute-1 nova_compute[238822]: 2025-09-30 18:12:04.252 2 DEBUG nova.network.neutron [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Successfully created port: 93146cef-46ad-4383-892d-3ec355af507c _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:12:04 compute-1 nova_compute[238822]: 2025-09-30 18:12:04.258 2 INFO nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:12:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:04 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:04 compute-1 nova_compute[238822]: 2025-09-30 18:12:04.768 2 DEBUG nova.compute.manager [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.030 2 DEBUG nova.network.neutron [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Successfully updated port: 93146cef-46ad-4383-892d-3ec355af507c _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:12:05 compute-1 ceph-mon[75484]: pgmap v997: 353 pgs: 353 active+clean; 88 MiB data, 209 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.107 2 DEBUG nova.compute.manager [req-4579abc6-efda-4789-af67-d91ec8403ad5 req-899b18b3-220b-4db3-988d-fab41478d137 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Received event network-changed-93146cef-46ad-4383-892d-3ec355af507c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.108 2 DEBUG nova.compute.manager [req-4579abc6-efda-4789-af67-d91ec8403ad5 req-899b18b3-220b-4db3-988d-fab41478d137 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Refreshing instance network info cache due to event network-changed-93146cef-46ad-4383-892d-3ec355af507c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.108 2 DEBUG oslo_concurrency.lockutils [req-4579abc6-efda-4789-af67-d91ec8403ad5 req-899b18b3-220b-4db3-988d-fab41478d137 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-dadc55d4-1578-4dc1-880a-08098fba63ea" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.109 2 DEBUG oslo_concurrency.lockutils [req-4579abc6-efda-4789-af67-d91ec8403ad5 req-899b18b3-220b-4db3-988d-fab41478d137 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-dadc55d4-1578-4dc1-880a-08098fba63ea" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.109 2 DEBUG nova.network.neutron [req-4579abc6-efda-4789-af67-d91ec8403ad5 req-899b18b3-220b-4db3-988d-fab41478d137 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Refreshing network info cache for port 93146cef-46ad-4383-892d-3ec355af507c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:12:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:05.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:05.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.538 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "refresh_cache-dadc55d4-1578-4dc1-880a-08098fba63ea" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.618 2 WARNING neutronclient.v2_0.client [req-4579abc6-efda-4789-af67-d91ec8403ad5 req-899b18b3-220b-4db3-988d-fab41478d137 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:12:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:05 compute-1 podman[249638]: time="2025-09-30T18:12:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:12:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:12:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38442 "" "Go-http-client/1.1"
Sep 30 18:12:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:12:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8794 "" "Go-http-client/1.1"
Sep 30 18:12:05 compute-1 sshd-session[270194]: Failed password for root from 175.126.165.170 port 39656 ssh2
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.798 2 DEBUG nova.compute.manager [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.800 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.801 2 INFO nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Creating image(s)
Sep 30 18:12:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:05 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.842 2 DEBUG nova.storage.rbd_utils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image dadc55d4-1578-4dc1-880a-08098fba63ea_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.887 2 DEBUG nova.storage.rbd_utils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image dadc55d4-1578-4dc1-880a-08098fba63ea_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.930 2 DEBUG nova.storage.rbd_utils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image dadc55d4-1578-4dc1-880a-08098fba63ea_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:12:05 compute-1 nova_compute[238822]: 2025-09-30 18:12:05.936 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.029 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.031 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.032 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.032 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.076 2 DEBUG nova.storage.rbd_utils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image dadc55d4-1578-4dc1-880a-08098fba63ea_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.085 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 dadc55d4-1578-4dc1-880a-08098fba63ea_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.427 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 dadc55d4-1578-4dc1-880a-08098fba63ea_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.342s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:12:06 compute-1 sshd-session[270194]: Received disconnect from 175.126.165.170 port 39656:11: Bye Bye [preauth]
Sep 30 18:12:06 compute-1 sshd-session[270194]: Disconnected from authenticating user root 175.126.165.170 port 39656 [preauth]
Sep 30 18:12:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:06 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004100 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.543 2 DEBUG nova.storage.rbd_utils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] resizing rbd image dadc55d4-1578-4dc1-880a-08098fba63ea_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:12:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.685 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.685 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Ensure instance console log exists: /var/lib/nova/instances/dadc55d4-1578-4dc1-880a-08098fba63ea/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.686 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.689 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.689 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:12:06 compute-1 nova_compute[238822]: 2025-09-30 18:12:06.700 2 DEBUG nova.network.neutron [req-4579abc6-efda-4789-af67-d91ec8403ad5 req-899b18b3-220b-4db3-988d-fab41478d137 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:12:07 compute-1 ceph-mon[75484]: pgmap v998: 353 pgs: 353 active+clean; 88 MiB data, 209 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 68 op/s
Sep 30 18:12:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:07.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:07 compute-1 nova_compute[238822]: 2025-09-30 18:12:07.341 2 DEBUG nova.network.neutron [req-4579abc6-efda-4789-af67-d91ec8403ad5 req-899b18b3-220b-4db3-988d-fab41478d137 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:12:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:07.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:07 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:07 compute-1 nova_compute[238822]: 2025-09-30 18:12:07.851 2 DEBUG oslo_concurrency.lockutils [req-4579abc6-efda-4789-af67-d91ec8403ad5 req-899b18b3-220b-4db3-988d-fab41478d137 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-dadc55d4-1578-4dc1-880a-08098fba63ea" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:12:07 compute-1 nova_compute[238822]: 2025-09-30 18:12:07.852 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquired lock "refresh_cache-dadc55d4-1578-4dc1-880a-08098fba63ea" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:12:07 compute-1 nova_compute[238822]: 2025-09-30 18:12:07.852 2 DEBUG nova.network.neutron [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:12:07 compute-1 nova_compute[238822]: 2025-09-30 18:12:07.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:12:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:08 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:12:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:08 compute-1 nova_compute[238822]: 2025-09-30 18:12:08.649 2 DEBUG nova.network.neutron [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.004 2 WARNING neutronclient.v2_0.client [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:12:09 compute-1 unix_chkpwd[270375]: password check failed for user (root)
Sep 30 18:12:09 compute-1 sshd-session[270372]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65  user=root
Sep 30 18:12:09 compute-1 ceph-mon[75484]: pgmap v999: 353 pgs: 353 active+clean; 88 MiB data, 209 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 68 op/s
Sep 30 18:12:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:09.144 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.176 2 DEBUG nova.network.neutron [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Updating instance_info_cache with network_info: [{"id": "93146cef-46ad-4383-892d-3ec355af507c", "address": "fa:16:3e:0a:61:22", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93146cef-46", "ovs_interfaceid": "93146cef-46ad-4383-892d-3ec355af507c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:12:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:09.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:09 compute-1 sudo[270376]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:12:09 compute-1 sudo[270376]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:12:09 compute-1 sudo[270376]: pam_unix(sudo:session): session closed for user root
Sep 30 18:12:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:09.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.684 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Releasing lock "refresh_cache-dadc55d4-1578-4dc1-880a-08098fba63ea" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.685 2 DEBUG nova.compute.manager [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Instance network_info: |[{"id": "93146cef-46ad-4383-892d-3ec355af507c", "address": "fa:16:3e:0a:61:22", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93146cef-46", "ovs_interfaceid": "93146cef-46ad-4383-892d-3ec355af507c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.689 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Start _get_guest_xml network_info=[{"id": "93146cef-46ad-4383-892d-3ec355af507c", "address": "fa:16:3e:0a:61:22", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93146cef-46", "ovs_interfaceid": "93146cef-46ad-4383-892d-3ec355af507c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.695 2 WARNING nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.697 2 DEBUG nova.virt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1080774082', uuid='dadc55d4-1578-4dc1-880a-08098fba63ea'), owner=OwnerMeta(userid='dc3bb71c425f484fbc46f90978029403', username='tempest-TestExecuteActionsViaActuator-837729328-project-admin', projectid='ddd1f985d8b64b449c79d55b0cbd6422', projectname='tempest-TestExecuteActionsViaActuator-837729328'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "93146cef-46ad-4383-892d-3ec355af507c", "address": "fa:16:3e:0a:61:22", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93146cef-46", "ovs_interfaceid": "93146cef-46ad-4383-892d-3ec355af507c", 
"qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759255929.6977932) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.705 2 DEBUG nova.virt.libvirt.host [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.707 2 DEBUG nova.virt.libvirt.host [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.711 2 DEBUG nova.virt.libvirt.host [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.712 2 DEBUG nova.virt.libvirt.host [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.713 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.713 2 DEBUG nova.virt.hardware [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.714 2 DEBUG nova.virt.hardware [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.715 2 DEBUG nova.virt.hardware [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.715 2 DEBUG nova.virt.hardware [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.716 2 DEBUG nova.virt.hardware [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.716 2 DEBUG nova.virt.hardware [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.717 2 DEBUG nova.virt.hardware [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.717 2 DEBUG nova.virt.hardware [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.717 2 DEBUG nova.virt.hardware [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.718 2 DEBUG nova.virt.hardware [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.718 2 DEBUG nova.virt.hardware [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:12:09 compute-1 nova_compute[238822]: 2025-09-30 18:12:09.723 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:12:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:09 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:12:10 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3709444185' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:12:10 compute-1 nova_compute[238822]: 2025-09-30 18:12:10.207 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:12:10 compute-1 nova_compute[238822]: 2025-09-30 18:12:10.250 2 DEBUG nova.storage.rbd_utils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image dadc55d4-1578-4dc1-880a-08098fba63ea_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:12:10 compute-1 nova_compute[238822]: 2025-09-30 18:12:10.256 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:12:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:10 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004100 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:12:10 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3388835610' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:12:10 compute-1 sshd-session[270372]: Failed password for root from 194.107.115.65 port 55706 ssh2
Sep 30 18:12:10 compute-1 nova_compute[238822]: 2025-09-30 18:12:10.735 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:12:10 compute-1 nova_compute[238822]: 2025-09-30 18:12:10.737 2 DEBUG nova.virt.libvirt.vif [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:11:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1080774082',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1080774082',id=5,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-3aq7na6m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteActionsViaAc
tuator-837729328-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:12:04Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=dadc55d4-1578-4dc1-880a-08098fba63ea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93146cef-46ad-4383-892d-3ec355af507c", "address": "fa:16:3e:0a:61:22", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93146cef-46", "ovs_interfaceid": "93146cef-46ad-4383-892d-3ec355af507c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:12:10 compute-1 nova_compute[238822]: 2025-09-30 18:12:10.737 2 DEBUG nova.network.os_vif_util [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converting VIF {"id": "93146cef-46ad-4383-892d-3ec355af507c", "address": "fa:16:3e:0a:61:22", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93146cef-46", "ovs_interfaceid": "93146cef-46ad-4383-892d-3ec355af507c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:12:10 compute-1 nova_compute[238822]: 2025-09-30 18:12:10.738 2 DEBUG nova.network.os_vif_util [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:61:22,bridge_name='br-int',has_traffic_filtering=True,id=93146cef-46ad-4383-892d-3ec355af507c,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93146cef-46') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:12:10 compute-1 nova_compute[238822]: 2025-09-30 18:12:10.739 2 DEBUG nova.objects.instance [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lazy-loading 'pci_devices' on Instance uuid dadc55d4-1578-4dc1-880a-08098fba63ea obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:12:11 compute-1 ceph-mon[75484]: pgmap v1000: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 157 op/s
Sep 30 18:12:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3709444185' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:12:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3388835610' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:12:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:11.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.248 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:12:11 compute-1 nova_compute[238822]:   <uuid>dadc55d4-1578-4dc1-880a-08098fba63ea</uuid>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   <name>instance-00000005</name>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1080774082</nova:name>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:12:09</nova:creationTime>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:12:11 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:12:11 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:user uuid="dc3bb71c425f484fbc46f90978029403">tempest-TestExecuteActionsViaActuator-837729328-project-admin</nova:user>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:project uuid="ddd1f985d8b64b449c79d55b0cbd6422">tempest-TestExecuteActionsViaActuator-837729328</nova:project>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <nova:port uuid="93146cef-46ad-4383-892d-3ec355af507c">
Sep 30 18:12:11 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <system>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <entry name="serial">dadc55d4-1578-4dc1-880a-08098fba63ea</entry>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <entry name="uuid">dadc55d4-1578-4dc1-880a-08098fba63ea</entry>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     </system>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   <os>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   </os>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   <features>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   </features>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/dadc55d4-1578-4dc1-880a-08098fba63ea_disk">
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       </source>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/dadc55d4-1578-4dc1-880a-08098fba63ea_disk.config">
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       </source>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:12:11 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:0a:61:22"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <target dev="tap93146cef-46"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/dadc55d4-1578-4dc1-880a-08098fba63ea/console.log" append="off"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <video>
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     </video>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:12:11 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:12:11 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:12:11 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:12:11 compute-1 nova_compute[238822]: </domain>
Sep 30 18:12:11 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.251 2 DEBUG nova.compute.manager [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Preparing to wait for external event network-vif-plugged-93146cef-46ad-4383-892d-3ec355af507c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.251 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.251 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.251 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.252 2 DEBUG nova.virt.libvirt.vif [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:11:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1080774082',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1080774082',id=5,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-3aq7na6m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteAc
tionsViaActuator-837729328-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:12:04Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=dadc55d4-1578-4dc1-880a-08098fba63ea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93146cef-46ad-4383-892d-3ec355af507c", "address": "fa:16:3e:0a:61:22", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93146cef-46", "ovs_interfaceid": "93146cef-46ad-4383-892d-3ec355af507c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.252 2 DEBUG nova.network.os_vif_util [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converting VIF {"id": "93146cef-46ad-4383-892d-3ec355af507c", "address": "fa:16:3e:0a:61:22", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93146cef-46", "ovs_interfaceid": "93146cef-46ad-4383-892d-3ec355af507c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.253 2 DEBUG nova.network.os_vif_util [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:61:22,bridge_name='br-int',has_traffic_filtering=True,id=93146cef-46ad-4383-892d-3ec355af507c,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93146cef-46') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.254 2 DEBUG os_vif [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:61:22,bridge_name='br-int',has_traffic_filtering=True,id=93146cef-46ad-4383-892d-3ec355af507c,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93146cef-46') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.255 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.256 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.257 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e20042dc-3398-520a-be65-71459a4c3337', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.263 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap93146cef-46, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.264 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap93146cef-46, col_values=(('qos', UUID('43ccb678-8a03-4af0-bc81-7f1c99284116')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.264 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap93146cef-46, col_values=(('external_ids', {'iface-id': '93146cef-46ad-4383-892d-3ec355af507c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:61:22', 'vm-uuid': 'dadc55d4-1578-4dc1-880a-08098fba63ea'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:11 compute-1 NetworkManager[45549]: <info>  [1759255931.2672] manager: (tap93146cef-46): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.276 2 INFO os_vif [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:61:22,bridge_name='br-int',has_traffic_filtering=True,id=93146cef-46ad-4383-892d-3ec355af507c,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93146cef-46')
Sep 30 18:12:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:11.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:11 compute-1 nova_compute[238822]: 2025-09-30 18:12:11.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:11 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:11 compute-1 sshd-session[270372]: Received disconnect from 194.107.115.65 port 55706:11: Bye Bye [preauth]
Sep 30 18:12:11 compute-1 sshd-session[270372]: Disconnected from authenticating user root 194.107.115.65 port 55706 [preauth]
Sep 30 18:12:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:12 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:12 compute-1 nova_compute[238822]: 2025-09-30 18:12:12.816 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:12:12 compute-1 nova_compute[238822]: 2025-09-30 18:12:12.817 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:12:12 compute-1 nova_compute[238822]: 2025-09-30 18:12:12.817 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] No VIF found with MAC fa:16:3e:0a:61:22, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:12:12 compute-1 nova_compute[238822]: 2025-09-30 18:12:12.819 2 INFO nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Using config drive
Sep 30 18:12:12 compute-1 nova_compute[238822]: 2025-09-30 18:12:12.871 2 DEBUG nova.storage.rbd_utils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image dadc55d4-1578-4dc1-880a-08098fba63ea_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:12:13 compute-1 ceph-mon[75484]: pgmap v1001: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 328 KiB/s rd, 3.9 MiB/s wr, 88 op/s
Sep 30 18:12:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:13.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:13 compute-1 nova_compute[238822]: 2025-09-30 18:12:13.394 2 WARNING neutronclient.v2_0.client [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:12:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:13.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:12:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:13 compute-1 nova_compute[238822]: 2025-09-30 18:12:13.771 2 INFO nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Creating config drive at /var/lib/nova/instances/dadc55d4-1578-4dc1-880a-08098fba63ea/disk.config
Sep 30 18:12:13 compute-1 nova_compute[238822]: 2025-09-30 18:12:13.782 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dadc55d4-1578-4dc1-880a-08098fba63ea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp5nysms7u execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:12:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:13 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:13 compute-1 nova_compute[238822]: 2025-09-30 18:12:13.944 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dadc55d4-1578-4dc1-880a-08098fba63ea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp5nysms7u" returned: 0 in 0.162s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:12:13 compute-1 nova_compute[238822]: 2025-09-30 18:12:13.992 2 DEBUG nova.storage.rbd_utils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image dadc55d4-1578-4dc1-880a-08098fba63ea_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:12:13 compute-1 nova_compute[238822]: 2025-09-30 18:12:13.998 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/dadc55d4-1578-4dc1-880a-08098fba63ea/disk.config dadc55d4-1578-4dc1-880a-08098fba63ea_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:12:14 compute-1 nova_compute[238822]: 2025-09-30 18:12:14.194 2 DEBUG oslo_concurrency.processutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/dadc55d4-1578-4dc1-880a-08098fba63ea/disk.config dadc55d4-1578-4dc1-880a-08098fba63ea_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.196s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:12:14 compute-1 nova_compute[238822]: 2025-09-30 18:12:14.194 2 INFO nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Deleting local config drive /var/lib/nova/instances/dadc55d4-1578-4dc1-880a-08098fba63ea/disk.config because it was imported into RBD.
Sep 30 18:12:14 compute-1 kernel: tap93146cef-46: entered promiscuous mode
Sep 30 18:12:14 compute-1 NetworkManager[45549]: <info>  [1759255934.2658] manager: (tap93146cef-46): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Sep 30 18:12:14 compute-1 ovn_controller[135204]: 2025-09-30T18:12:14Z|00049|binding|INFO|Claiming lport 93146cef-46ad-4383-892d-3ec355af507c for this chassis.
Sep 30 18:12:14 compute-1 ovn_controller[135204]: 2025-09-30T18:12:14Z|00050|binding|INFO|93146cef-46ad-4383-892d-3ec355af507c: Claiming fa:16:3e:0a:61:22 10.100.0.8
Sep 30 18:12:14 compute-1 nova_compute[238822]: 2025-09-30 18:12:14.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:14 compute-1 nova_compute[238822]: 2025-09-30 18:12:14.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.283 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:61:22 10.100.0.8'], port_security=['fa:16:3e:0a:61:22 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'dadc55d4-1578-4dc1-880a-08098fba63ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddd1f985d8b64b449c79d55b0cbd6422', 'neutron:revision_number': '4', 'neutron:security_group_ids': '34f3cf7b-94cf-408f-b3dc-ae0b57c009fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12c18a77-b252-4a3e-a181-b42644879446, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=93146cef-46ad-4383-892d-3ec355af507c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.284 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 93146cef-46ad-4383-892d-3ec355af507c in datapath 5fff1904-159a-4b76-8c46-feabf17f29ab bound to our chassis
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.286 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5fff1904-159a-4b76-8c46-feabf17f29ab
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.301 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[67f215d9-7f18-48ae-8f84-679215c261b9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.302 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5fff1904-11 in ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.309 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5fff1904-10 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.309 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[304ffa32-ce68-42c3-93ff-715f616fa200]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.310 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0286a896-8bd7-4968-8da8-0e9b2eddb5d2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 systemd-udevd[270584]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:12:14 compute-1 podman[270530]: 2025-09-30 18:12:14.319302869 +0000 UTC m=+0.098400305 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.326 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[64a599ae-ed30-48f0-9c17-49546569b38c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 systemd-machined[195911]: New machine qemu-2-instance-00000005.
Sep 30 18:12:14 compute-1 NetworkManager[45549]: <info>  [1759255934.3333] device (tap93146cef-46): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:12:14 compute-1 NetworkManager[45549]: <info>  [1759255934.3345] device (tap93146cef-46): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.334 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[452fc591-c23c-46b0-8d2d-2c50f3be6f86]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Sep 30 18:12:14 compute-1 nova_compute[238822]: 2025-09-30 18:12:14.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:14 compute-1 ovn_controller[135204]: 2025-09-30T18:12:14Z|00051|binding|INFO|Setting lport 93146cef-46ad-4383-892d-3ec355af507c ovn-installed in OVS
Sep 30 18:12:14 compute-1 ovn_controller[135204]: 2025-09-30T18:12:14Z|00052|binding|INFO|Setting lport 93146cef-46ad-4383-892d-3ec355af507c up in Southbound
Sep 30 18:12:14 compute-1 nova_compute[238822]: 2025-09-30 18:12:14.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:14 compute-1 podman[270529]: 2025-09-30 18:12:14.361394929 +0000 UTC m=+0.150176457 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.372 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[6e598851-e752-4ecc-a825-07803f37b865]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.377 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5e29548f-bbeb-4682-8876-117a860a41b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 NetworkManager[45549]: <info>  [1759255934.3780] manager: (tap5fff1904-10): new Veth device (/org/freedesktop/NetworkManager/Devices/30)
Sep 30 18:12:14 compute-1 systemd-udevd[270592]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.416 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[3b5fc421-2a6e-47e7-829d-4f389e42bbb1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.421 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[683ee289-014c-4f4a-94d6-c346873393b8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 NetworkManager[45549]: <info>  [1759255934.4497] device (tap5fff1904-10): carrier: link connected
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.461 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[a426fc57-b937-4850-bc9c-9c9813c87362]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.485 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d0154fa8-80f0-4c16-ac59-a0d6b1727efa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5fff1904-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:07:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1377603, 'reachable_time': 30482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270623, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.509 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8a394ce2-020e-47f4-aba5-892773c3e185]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe18:77b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377603, 'tstamp': 1377603}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270624, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.536 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[89bd13ba-0cf1-4bee-958f-272527c09bef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5fff1904-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:07:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1377603, 'reachable_time': 30482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 270625, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:14 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004100 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.579 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7772bfeb-787e-4fcb-a717-0ab0540bcf49]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.667 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[9a76524d-e4cb-4dc9-bc43-7302228c122f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.669 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5fff1904-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.670 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.670 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5fff1904-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:12:14 compute-1 NetworkManager[45549]: <info>  [1759255934.6743] manager: (tap5fff1904-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Sep 30 18:12:14 compute-1 kernel: tap5fff1904-10: entered promiscuous mode
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.677 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5fff1904-10, col_values=(('external_ids', {'iface-id': '3a8ea0a0-c179-4516-9404-04b68a17e79e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:12:14 compute-1 ovn_controller[135204]: 2025-09-30T18:12:14Z|00053|binding|INFO|Releasing lport 3a8ea0a0-c179-4516-9404-04b68a17e79e from this chassis (sb_readonly=0)
Sep 30 18:12:14 compute-1 nova_compute[238822]: 2025-09-30 18:12:14.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:14 compute-1 nova_compute[238822]: 2025-09-30 18:12:14.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.710 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[662273b5-b986-4343-9d61-7aee4c9730cc]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.712 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.712 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.712 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 5fff1904-159a-4b76-8c46-feabf17f29ab disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.713 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.713 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[29053dac-efe5-4d29-819d-dca3f6245a6a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.714 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.715 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d3d5c945-c356-475e-8104-c15ce9f5deba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.716 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-5fff1904-159a-4b76-8c46-feabf17f29ab
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID 5fff1904-159a-4b76-8c46-feabf17f29ab
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:12:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:14.717 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'env', 'PROCESS_TAG=haproxy-5fff1904-159a-4b76-8c46-feabf17f29ab', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5fff1904-159a-4b76-8c46-feabf17f29ab.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:12:14 compute-1 nova_compute[238822]: 2025-09-30 18:12:14.831 2 DEBUG nova.compute.manager [req-b3f96070-7ab2-4617-9b61-0ebb8467734f req-7325b6ce-87b6-49e2-b3b5-a88d54763e0f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Received event network-vif-plugged-93146cef-46ad-4383-892d-3ec355af507c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:12:14 compute-1 nova_compute[238822]: 2025-09-30 18:12:14.831 2 DEBUG oslo_concurrency.lockutils [req-b3f96070-7ab2-4617-9b61-0ebb8467734f req-7325b6ce-87b6-49e2-b3b5-a88d54763e0f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:12:14 compute-1 nova_compute[238822]: 2025-09-30 18:12:14.831 2 DEBUG oslo_concurrency.lockutils [req-b3f96070-7ab2-4617-9b61-0ebb8467734f req-7325b6ce-87b6-49e2-b3b5-a88d54763e0f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:12:14 compute-1 nova_compute[238822]: 2025-09-30 18:12:14.831 2 DEBUG oslo_concurrency.lockutils [req-b3f96070-7ab2-4617-9b61-0ebb8467734f req-7325b6ce-87b6-49e2-b3b5-a88d54763e0f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:12:14 compute-1 nova_compute[238822]: 2025-09-30 18:12:14.831 2 DEBUG nova.compute.manager [req-b3f96070-7ab2-4617-9b61-0ebb8467734f req-7325b6ce-87b6-49e2-b3b5-a88d54763e0f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Processing event network-vif-plugged-93146cef-46ad-4383-892d-3ec355af507c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:12:15 compute-1 ceph-mon[75484]: pgmap v1002: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 329 KiB/s rd, 3.9 MiB/s wr, 89 op/s
Sep 30 18:12:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:15.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:15 compute-1 podman[270699]: 2025-09-30 18:12:15.246289398 +0000 UTC m=+0.058419123 container create 73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:12:15 compute-1 systemd[1]: Started libpod-conmon-73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e.scope.
Sep 30 18:12:15 compute-1 podman[270699]: 2025-09-30 18:12:15.21609317 +0000 UTC m=+0.028222975 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:12:15 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:12:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec3484ab28fd4f27b738728c98c6e32496c00bfab60b199ced2df9ecb614c481/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:12:15 compute-1 podman[270699]: 2025-09-30 18:12:15.356589194 +0000 UTC m=+0.168718959 container init 73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 18:12:15 compute-1 podman[270699]: 2025-09-30 18:12:15.362114524 +0000 UTC m=+0.174244259 container start 73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 18:12:15 compute-1 neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab[270716]: [NOTICE]   (270720) : New worker (270722) forked
Sep 30 18:12:15 compute-1 neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab[270716]: [NOTICE]   (270720) : Loading success.
Sep 30 18:12:15 compute-1 nova_compute[238822]: 2025-09-30 18:12:15.465 2 DEBUG nova.compute.manager [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:12:15 compute-1 nova_compute[238822]: 2025-09-30 18:12:15.470 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:12:15 compute-1 nova_compute[238822]: 2025-09-30 18:12:15.475 2 INFO nova.virt.libvirt.driver [-] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Instance spawned successfully.
Sep 30 18:12:15 compute-1 nova_compute[238822]: 2025-09-30 18:12:15.476 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:12:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:15.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:15 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:15 compute-1 nova_compute[238822]: 2025-09-30 18:12:15.991 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:12:15 compute-1 nova_compute[238822]: 2025-09-30 18:12:15.992 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:12:15 compute-1 nova_compute[238822]: 2025-09-30 18:12:15.993 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:12:15 compute-1 nova_compute[238822]: 2025-09-30 18:12:15.994 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:12:15 compute-1 nova_compute[238822]: 2025-09-30 18:12:15.994 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:12:15 compute-1 nova_compute[238822]: 2025-09-30 18:12:15.995 2 DEBUG nova.virt.libvirt.driver [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:12:16 compute-1 nova_compute[238822]: 2025-09-30 18:12:16.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:16 compute-1 nova_compute[238822]: 2025-09-30 18:12:16.508 2 INFO nova.compute.manager [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Took 10.71 seconds to spawn the instance on the hypervisor.
Sep 30 18:12:16 compute-1 nova_compute[238822]: 2025-09-30 18:12:16.509 2 DEBUG nova.compute.manager [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:12:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:16 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:16 compute-1 nova_compute[238822]: 2025-09-30 18:12:16.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:16 compute-1 nova_compute[238822]: 2025-09-30 18:12:16.929 2 DEBUG nova.compute.manager [req-50d4976f-43a3-43c2-ae1d-253810a37654 req-908d3637-daa5-479b-a572-c9611335e9a4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Received event network-vif-plugged-93146cef-46ad-4383-892d-3ec355af507c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:12:16 compute-1 nova_compute[238822]: 2025-09-30 18:12:16.930 2 DEBUG oslo_concurrency.lockutils [req-50d4976f-43a3-43c2-ae1d-253810a37654 req-908d3637-daa5-479b-a572-c9611335e9a4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:12:16 compute-1 nova_compute[238822]: 2025-09-30 18:12:16.930 2 DEBUG oslo_concurrency.lockutils [req-50d4976f-43a3-43c2-ae1d-253810a37654 req-908d3637-daa5-479b-a572-c9611335e9a4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:12:16 compute-1 nova_compute[238822]: 2025-09-30 18:12:16.931 2 DEBUG oslo_concurrency.lockutils [req-50d4976f-43a3-43c2-ae1d-253810a37654 req-908d3637-daa5-479b-a572-c9611335e9a4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:12:16 compute-1 nova_compute[238822]: 2025-09-30 18:12:16.932 2 DEBUG nova.compute.manager [req-50d4976f-43a3-43c2-ae1d-253810a37654 req-908d3637-daa5-479b-a572-c9611335e9a4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] No waiting events found dispatching network-vif-plugged-93146cef-46ad-4383-892d-3ec355af507c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:12:16 compute-1 nova_compute[238822]: 2025-09-30 18:12:16.932 2 WARNING nova.compute.manager [req-50d4976f-43a3-43c2-ae1d-253810a37654 req-908d3637-daa5-479b-a572-c9611335e9a4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Received unexpected event network-vif-plugged-93146cef-46ad-4383-892d-3ec355af507c for instance with vm_state active and task_state None.
Sep 30 18:12:17 compute-1 nova_compute[238822]: 2025-09-30 18:12:17.058 2 INFO nova.compute.manager [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Took 16.45 seconds to build instance.
Sep 30 18:12:17 compute-1 ceph-mon[75484]: pgmap v1003: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 328 KiB/s rd, 3.9 MiB/s wr, 89 op/s
Sep 30 18:12:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:17.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:17 compute-1 unix_chkpwd[270734]: password check failed for user (root)
Sep 30 18:12:17 compute-1 sshd-session[270705]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:12:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:17.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:17 compute-1 nova_compute[238822]: 2025-09-30 18:12:17.564 2 DEBUG oslo_concurrency.lockutils [None req-6f6ef5ae-1233-492a-bd06-c9c54e6af8d2 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.974s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:12:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:17 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:18 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004100 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:12:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:19.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:19 compute-1 sshd-session[270705]: Failed password for root from 192.210.160.141 port 45136 ssh2
Sep 30 18:12:19 compute-1 ceph-mon[75484]: pgmap v1004: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 328 KiB/s rd, 3.9 MiB/s wr, 89 op/s
Sep 30 18:12:19 compute-1 openstack_network_exporter[251957]: ERROR   18:12:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:12:19 compute-1 openstack_network_exporter[251957]: ERROR   18:12:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:12:19 compute-1 openstack_network_exporter[251957]: ERROR   18:12:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:12:19 compute-1 openstack_network_exporter[251957]: ERROR   18:12:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:12:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:12:19 compute-1 openstack_network_exporter[251957]: ERROR   18:12:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:12:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:12:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:19.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:19 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:20 compute-1 nova_compute[238822]: 2025-09-30 18:12:20.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:12:20 compute-1 ceph-mon[75484]: pgmap v1005: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 163 op/s
Sep 30 18:12:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:20 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:20 compute-1 sshd-session[270705]: Connection closed by authenticating user root 192.210.160.141 port 45136 [preauth]
Sep 30 18:12:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:21.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:21 compute-1 nova_compute[238822]: 2025-09-30 18:12:21.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:21.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:21 compute-1 podman[270739]: 2025-09-30 18:12:21.561312681 +0000 UTC m=+0.088758114 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 18:12:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:21 compute-1 nova_compute[238822]: 2025-09-30 18:12:21.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:21 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:22 compute-1 nova_compute[238822]: 2025-09-30 18:12:22.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:12:22 compute-1 sshd-session[270760]: Invalid user oracle from 167.71.248.239 port 53632
Sep 30 18:12:22 compute-1 sshd-session[270760]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:12:22 compute-1 sshd-session[270760]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.71.248.239
Sep 30 18:12:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:22 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004100 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:23 compute-1 nova_compute[238822]: 2025-09-30 18:12:23.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:12:23 compute-1 nova_compute[238822]: 2025-09-30 18:12:23.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:12:23 compute-1 ceph-mon[75484]: pgmap v1006: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 27 KiB/s wr, 74 op/s
Sep 30 18:12:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:12:23 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2867667517' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:12:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:23.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:23.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:12:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:23 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:24 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:24 compute-1 sshd-session[270760]: Failed password for invalid user oracle from 167.71.248.239 port 53632 ssh2
Sep 30 18:12:25 compute-1 ceph-mon[75484]: pgmap v1007: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 27 KiB/s wr, 74 op/s
Sep 30 18:12:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:25.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:25 compute-1 sshd-session[270760]: Connection closed by invalid user oracle 167.71.248.239 port 53632 [preauth]
Sep 30 18:12:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:25.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:25 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:26 compute-1 nova_compute[238822]: 2025-09-30 18:12:26.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:12:26 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3224065394' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:12:26 compute-1 nova_compute[238822]: 2025-09-30 18:12:26.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:26 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004100 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:26 compute-1 nova_compute[238822]: 2025-09-30 18:12:26.577 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:12:26 compute-1 nova_compute[238822]: 2025-09-30 18:12:26.577 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:12:26 compute-1 nova_compute[238822]: 2025-09-30 18:12:26.578 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:12:26 compute-1 nova_compute[238822]: 2025-09-30 18:12:26.578 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:12:26 compute-1 nova_compute[238822]: 2025-09-30 18:12:26.579 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:12:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:26 compute-1 nova_compute[238822]: 2025-09-30 18:12:26.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:27 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:12:27 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3669096758' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:12:27 compute-1 nova_compute[238822]: 2025-09-30 18:12:27.077 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:12:27 compute-1 ceph-mon[75484]: pgmap v1008: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:12:27 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3669096758' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:12:27 compute-1 podman[270789]: 2025-09-30 18:12:27.201741307 +0000 UTC m=+0.071727623 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 18:12:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:27.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:27 compute-1 podman[270791]: 2025-09-30 18:12:27.234477483 +0000 UTC m=+0.094658344 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 18:12:27 compute-1 podman[270792]: 2025-09-30 18:12:27.240883646 +0000 UTC m=+0.095487676 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:12:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:27.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:27 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:28 compute-1 nova_compute[238822]: 2025-09-30 18:12:28.145 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000005 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:12:28 compute-1 nova_compute[238822]: 2025-09-30 18:12:28.146 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000005 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:12:28 compute-1 nova_compute[238822]: 2025-09-30 18:12:28.387 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:12:28 compute-1 nova_compute[238822]: 2025-09-30 18:12:28.388 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:12:28 compute-1 nova_compute[238822]: 2025-09-30 18:12:28.431 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:12:28 compute-1 nova_compute[238822]: 2025-09-30 18:12:28.432 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4628MB free_disk=39.92576217651367GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:12:28 compute-1 nova_compute[238822]: 2025-09-30 18:12:28.432 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:12:28 compute-1 nova_compute[238822]: 2025-09-30 18:12:28.433 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:12:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:12:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:29 compute-1 ceph-mon[75484]: pgmap v1009: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:12:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:29.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:29 compute-1 nova_compute[238822]: 2025-09-30 18:12:29.488 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance dadc55d4-1578-4dc1-880a-08098fba63ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:12:29 compute-1 nova_compute[238822]: 2025-09-30 18:12:29.489 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:12:29 compute-1 nova_compute[238822]: 2025-09-30 18:12:29.489 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:12:28 up  3:49,  0 user,  load average: 0.69, 1.23, 1.30\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_ddd1f985d8b64b449c79d55b0cbd6422': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:12:29 compute-1 sudo[270853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:12:29 compute-1 sudo[270853]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:12:29 compute-1 sudo[270853]: pam_unix(sudo:session): session closed for user root
Sep 30 18:12:29 compute-1 nova_compute[238822]: 2025-09-30 18:12:29.530 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:12:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:29.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:29 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:12:29 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/433824411' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:12:30 compute-1 nova_compute[238822]: 2025-09-30 18:12:30.013 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:12:30 compute-1 nova_compute[238822]: 2025-09-30 18:12:30.023 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:12:30 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/433824411' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:12:30 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1422138412' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:12:30 compute-1 ovn_controller[135204]: 2025-09-30T18:12:30Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0a:61:22 10.100.0.8
Sep 30 18:12:30 compute-1 ovn_controller[135204]: 2025-09-30T18:12:30Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0a:61:22 10.100.0.8
Sep 30 18:12:30 compute-1 nova_compute[238822]: 2025-09-30 18:12:30.533 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:12:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:30 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004100 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:31 compute-1 nova_compute[238822]: 2025-09-30 18:12:31.045 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:12:31 compute-1 nova_compute[238822]: 2025-09-30 18:12:31.045 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.613s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:12:31 compute-1 ceph-mon[75484]: pgmap v1010: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 75 op/s
Sep 30 18:12:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:31.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:31 compute-1 nova_compute[238822]: 2025-09-30 18:12:31.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:31.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:31 compute-1 nova_compute[238822]: 2025-09-30 18:12:31.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:31 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:32 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:33 compute-1 nova_compute[238822]: 2025-09-30 18:12:33.047 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:12:33 compute-1 nova_compute[238822]: 2025-09-30 18:12:33.047 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:12:33 compute-1 nova_compute[238822]: 2025-09-30 18:12:33.048 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:12:33 compute-1 nova_compute[238822]: 2025-09-30 18:12:33.048 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:12:33 compute-1 ceph-mon[75484]: pgmap v1011: 353 pgs: 353 active+clean; 167 MiB data, 256 MiB used, 40 GiB / 40 GiB avail; 3.5 KiB/s rd, 1023 B/s wr, 1 op/s
Sep 30 18:12:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:33.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:12:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:33.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:33 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:34 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004100 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:35 compute-1 ceph-mon[75484]: pgmap v1012: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 271 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Sep 30 18:12:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:35.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:35.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:35 compute-1 podman[249638]: time="2025-09-30T18:12:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:12:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:12:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39665 "" "Go-http-client/1.1"
Sep 30 18:12:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:12:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9273 "" "Go-http-client/1.1"
Sep 30 18:12:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:35 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098004100 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:36 compute-1 nova_compute[238822]: 2025-09-30 18:12:36.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:36 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:36 compute-1 nova_compute[238822]: 2025-09-30 18:12:36.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:37 compute-1 ceph-mon[75484]: pgmap v1013: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 271 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Sep 30 18:12:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3554528154' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:12:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3554528154' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:12:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:37.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:37.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:37 compute-1 sshd-session[270910]: Invalid user user from 84.51.43.58 port 41799
Sep 30 18:12:37 compute-1 sshd-session[270910]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:12:37 compute-1 sshd-session[270910]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:12:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:37 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:12:38 compute-1 ceph-mon[75484]: pgmap v1014: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 271 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Sep 30 18:12:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:38 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:12:38 compute-1 sshd-session[270912]: Invalid user ca from 103.153.190.105 port 60382
Sep 30 18:12:38 compute-1 sshd-session[270912]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:12:38 compute-1 sshd-session[270912]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:12:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:39.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:39 compute-1 sshd-session[270910]: Failed password for invalid user user from 84.51.43.58 port 41799 ssh2
Sep 30 18:12:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:39.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:39 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:39 compute-1 sshd-session[270910]: Received disconnect from 84.51.43.58 port 41799:11: Bye Bye [preauth]
Sep 30 18:12:39 compute-1 sshd-session[270910]: Disconnected from invalid user user 84.51.43.58 port 41799 [preauth]
Sep 30 18:12:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c40089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:40 compute-1 sshd-session[270912]: Failed password for invalid user ca from 103.153.190.105 port 60382 ssh2
Sep 30 18:12:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:41 compute-1 ceph-mon[75484]: pgmap v1015: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 274 KiB/s rd, 2.2 MiB/s wr, 65 op/s
Sep 30 18:12:41 compute-1 sshd-session[270912]: Received disconnect from 103.153.190.105 port 60382:11: Bye Bye [preauth]
Sep 30 18:12:41 compute-1 sshd-session[270912]: Disconnected from invalid user ca 103.153.190.105 port 60382 [preauth]
Sep 30 18:12:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:41.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:41 compute-1 nova_compute[238822]: 2025-09-30 18:12:41.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:41.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:41 compute-1 nova_compute[238822]: 2025-09-30 18:12:41.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:41 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:42 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:43 compute-1 unix_chkpwd[270923]: password check failed for user (root)
Sep 30 18:12:43 compute-1 sshd-session[270919]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:12:43 compute-1 ceph-mon[75484]: pgmap v1016: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 270 KiB/s rd, 2.2 MiB/s wr, 64 op/s
Sep 30 18:12:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:43.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:12:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:43.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:43 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:44 compute-1 podman[270926]: 2025-09-30 18:12:44.561093386 +0000 UTC m=+0.095834856 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:12:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:44 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:44 compute-1 podman[270925]: 2025-09-30 18:12:44.609458566 +0000 UTC m=+0.141218685 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 18:12:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:45 compute-1 sshd-session[270919]: Failed password for root from 192.210.160.141 port 54946 ssh2
Sep 30 18:12:45 compute-1 ceph-mon[75484]: pgmap v1017: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 271 KiB/s rd, 2.2 MiB/s wr, 64 op/s
Sep 30 18:12:45 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e138 e138: 2 total, 2 up, 2 in
Sep 30 18:12:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:45.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:45.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:45 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:46 compute-1 sshd-session[270919]: Connection closed by authenticating user root 192.210.160.141 port 54946 [preauth]
Sep 30 18:12:46 compute-1 ceph-mon[75484]: osdmap e138: 2 total, 2 up, 2 in
Sep 30 18:12:46 compute-1 nova_compute[238822]: 2025-09-30 18:12:46.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:46 compute-1 sudo[270977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:12:46 compute-1 sudo[270977]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:12:46 compute-1 sudo[270977]: pam_unix(sudo:session): session closed for user root
Sep 30 18:12:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:46 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:46 compute-1 sudo[271002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:12:46 compute-1 sudo[271002]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:12:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:46 compute-1 nova_compute[238822]: 2025-09-30 18:12:46.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:47 compute-1 sudo[271002]: pam_unix(sudo:session): session closed for user root
Sep 30 18:12:47 compute-1 ceph-mon[75484]: pgmap v1019: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 3.2 KiB/s rd, 33 KiB/s wr, 4 op/s
Sep 30 18:12:47 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1649060969' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:12:47 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3300006370' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:12:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:47.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:47.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:47 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:48 compute-1 ceph-mon[75484]: pgmap v1020: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 3.2 KiB/s rd, 33 KiB/s wr, 4 op/s
Sep 30 18:12:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:12:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:48 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:49.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:49 compute-1 openstack_network_exporter[251957]: ERROR   18:12:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:12:49 compute-1 openstack_network_exporter[251957]: ERROR   18:12:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:12:49 compute-1 openstack_network_exporter[251957]: ERROR   18:12:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:12:49 compute-1 openstack_network_exporter[251957]: ERROR   18:12:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:12:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:12:49 compute-1 openstack_network_exporter[251957]: ERROR   18:12:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:12:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:12:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:49.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:49 compute-1 sudo[271063]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:12:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:49 compute-1 sudo[271063]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:12:49 compute-1 sudo[271063]: pam_unix(sudo:session): session closed for user root
Sep 30 18:12:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:49 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:50 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:51 compute-1 ceph-mon[75484]: pgmap v1021: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 15 KiB/s rd, 1.6 KiB/s wr, 18 op/s
Sep 30 18:12:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:12:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:12:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:51.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:51 compute-1 nova_compute[238822]: 2025-09-30 18:12:51.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:51.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:51 compute-1 nova_compute[238822]: 2025-09-30 18:12:51.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:51 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:51 compute-1 sshd-session[271092]: Invalid user wikijs from 107.172.146.104 port 42252
Sep 30 18:12:51 compute-1 sshd-session[271092]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:12:51 compute-1 sshd-session[271092]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 18:12:52 compute-1 podman[271094]: 2025-09-30 18:12:52.06997216 +0000 UTC m=+0.099124684 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:12:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:12:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:12:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:12:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:12:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:12:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:12:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:12:52 compute-1 sshd-session[271090]: Invalid user elena from 14.225.167.110 port 50018
Sep 30 18:12:52 compute-1 sshd-session[271090]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:12:52 compute-1 sshd-session[271090]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:12:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:52 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:53 compute-1 ceph-mon[75484]: pgmap v1022: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 15 KiB/s rd, 1.6 KiB/s wr, 18 op/s
Sep 30 18:12:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:12:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:53.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:12:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:53.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:53 compute-1 sshd-session[271092]: Failed password for invalid user wikijs from 107.172.146.104 port 42252 ssh2
Sep 30 18:12:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:53 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:54 compute-1 sshd-session[271092]: Received disconnect from 107.172.146.104 port 42252:11: Bye Bye [preauth]
Sep 30 18:12:54 compute-1 sshd-session[271092]: Disconnected from invalid user wikijs 107.172.146.104 port 42252 [preauth]
Sep 30 18:12:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:54.355 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:12:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:54.355 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:12:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:12:54.356 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:12:54 compute-1 sshd-session[271090]: Failed password for invalid user elena from 14.225.167.110 port 50018 ssh2
Sep 30 18:12:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:54 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:55 compute-1 ceph-mon[75484]: pgmap v1023: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 2.3 MiB/s rd, 1.7 KiB/s wr, 104 op/s
Sep 30 18:12:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:55.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:55.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:55 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:56 compute-1 sshd-session[271090]: Received disconnect from 14.225.167.110 port 50018:11: Bye Bye [preauth]
Sep 30 18:12:56 compute-1 sshd-session[271090]: Disconnected from invalid user elena 14.225.167.110 port 50018 [preauth]
Sep 30 18:12:56 compute-1 nova_compute[238822]: 2025-09-30 18:12:56.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:56 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:56 compute-1 sudo[271121]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:12:56 compute-1 sudo[271121]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:12:56 compute-1 sudo[271121]: pam_unix(sudo:session): session closed for user root
Sep 30 18:12:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:56 compute-1 nova_compute[238822]: 2025-09-30 18:12:56.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:12:56 compute-1 sshd-session[271118]: Invalid user flutter from 216.10.242.161 port 35848
Sep 30 18:12:56 compute-1 sshd-session[271118]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:12:56 compute-1 sshd-session[271118]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:12:57 compute-1 ceph-mon[75484]: pgmap v1024: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 1.6 KiB/s wr, 95 op/s
Sep 30 18:12:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:12:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:12:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:12:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:57.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:12:57 compute-1 podman[271147]: 2025-09-30 18:12:57.566205304 +0000 UTC m=+0.101994372 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 18:12:57 compute-1 podman[271149]: 2025-09-30 18:12:57.566520803 +0000 UTC m=+0.097592104 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, container_name=multipathd)
Sep 30 18:12:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:57.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:57 compute-1 podman[271148]: 2025-09-30 18:12:57.600131973 +0000 UTC m=+0.127071692 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm)
Sep 30 18:12:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:57 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2881353993' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:12:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2881353993' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:12:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e139 e139: 2 total, 2 up, 2 in
Sep 30 18:12:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:12:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:58 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:12:58 compute-1 sshd-session[271118]: Failed password for invalid user flutter from 216.10.242.161 port 35848 ssh2
Sep 30 18:12:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:59 compute-1 ceph-mon[75484]: pgmap v1025: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.4 KiB/s wr, 87 op/s
Sep 30 18:12:59 compute-1 ceph-mon[75484]: osdmap e139: 2 total, 2 up, 2 in
Sep 30 18:12:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:12:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:12:59.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:12:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:12:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:12:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:12:59.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:12:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:12:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:12:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:12:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:12:59 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:00 compute-1 sshd-session[271118]: Received disconnect from 216.10.242.161 port 35848:11: Bye Bye [preauth]
Sep 30 18:13:00 compute-1 sshd-session[271118]: Disconnected from invalid user flutter 216.10.242.161 port 35848 [preauth]
Sep 30 18:13:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2916960512' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:13:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:00 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003f80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:01 compute-1 ceph-mon[75484]: pgmap v1027: 353 pgs: 2 active+clean+snaptrim, 10 active+clean+snaptrim_wait, 341 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 2.3 MiB/s rd, 614 B/s wr, 96 op/s
Sep 30 18:13:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:01.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:01 compute-1 nova_compute[238822]: 2025-09-30 18:13:01.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:01.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:01 compute-1 nova_compute[238822]: 2025-09-30 18:13:01.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:01 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3301438619' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:13:02 compute-1 ceph-mon[75484]: pgmap v1028: 353 pgs: 2 active+clean+snaptrim, 10 active+clean+snaptrim_wait, 341 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 2.3 MiB/s rd, 614 B/s wr, 96 op/s
Sep 30 18:13:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:02 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:03.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:13:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:03.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:03 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:04 compute-1 sshd[170789]: Timeout before authentication for connection from 110.42.70.108 to 38.102.83.102, pid = 268992
Sep 30 18:13:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:04 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:05 compute-1 ceph-mon[75484]: pgmap v1029: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 645 KiB/s rd, 17 KiB/s wr, 65 op/s
Sep 30 18:13:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:05.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:05 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 e140: 2 total, 2 up, 2 in
Sep 30 18:13:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:05.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:05 compute-1 podman[249638]: time="2025-09-30T18:13:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:13:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:13:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39665 "" "Go-http-client/1.1"
Sep 30 18:13:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:13:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9277 "" "Go-http-client/1.1"
Sep 30 18:13:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:05 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:06 compute-1 nova_compute[238822]: 2025-09-30 18:13:06.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:06 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:06 compute-1 ceph-mon[75484]: osdmap e140: 2 total, 2 up, 2 in
Sep 30 18:13:06 compute-1 ceph-mon[75484]: pgmap v1031: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 806 KiB/s rd, 21 KiB/s wr, 82 op/s
Sep 30 18:13:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:06 compute-1 nova_compute[238822]: 2025-09-30 18:13:06.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:07.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:07.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:13:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:07 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:08 compute-1 sshd-session[271211]: Invalid user a from 192.210.160.141 port 39836
Sep 30 18:13:08 compute-1 sshd-session[271211]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:13:08 compute-1 sshd-session[271211]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:13:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:13:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:08 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094003fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3477657276' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:13:08 compute-1 ceph-mon[75484]: pgmap v1032: 353 pgs: 353 active+clean; 200 MiB data, 285 MiB used, 40 GiB / 40 GiB avail; 655 KiB/s rd, 17 KiB/s wr, 65 op/s
Sep 30 18:13:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2651128805' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:13:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:09.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:09.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:09 compute-1 sudo[271219]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:13:09 compute-1 sudo[271219]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:13:09 compute-1 sudo[271219]: pam_unix(sudo:session): session closed for user root
Sep 30 18:13:09 compute-1 sshd-session[271211]: Failed password for invalid user a from 192.210.160.141 port 39836 ssh2
Sep 30 18:13:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:09 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:10 compute-1 sshd-session[271211]: Connection closed by invalid user a 192.210.160.141 port 39836 [preauth]
Sep 30 18:13:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:10 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:11 compute-1 ceph-mon[75484]: pgmap v1033: 353 pgs: 353 active+clean; 248 MiB data, 307 MiB used, 40 GiB / 40 GiB avail; 658 KiB/s rd, 2.2 MiB/s wr, 89 op/s
Sep 30 18:13:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:11.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:11 compute-1 nova_compute[238822]: 2025-09-30 18:13:11.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:11 compute-1 sshd-session[271245]: Invalid user elemental from 175.126.165.170 port 48736
Sep 30 18:13:11 compute-1 sshd-session[271245]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:13:11 compute-1 sshd-session[271245]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 18:13:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:11.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:11 compute-1 nova_compute[238822]: 2025-09-30 18:13:11.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:11 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980010b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:12 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:12 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:12.763 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:13:12 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:12.764 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:13:12 compute-1 nova_compute[238822]: 2025-09-30 18:13:12.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:13 compute-1 ceph-mon[75484]: pgmap v1034: 353 pgs: 353 active+clean; 248 MiB data, 307 MiB used, 40 GiB / 40 GiB avail; 658 KiB/s rd, 2.2 MiB/s wr, 89 op/s
Sep 30 18:13:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:13:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:13.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:13:13 compute-1 sshd-session[271245]: Failed password for invalid user elemental from 175.126.165.170 port 48736 ssh2
Sep 30 18:13:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:13:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:13.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:13 compute-1 sshd-session[271245]: Received disconnect from 175.126.165.170 port 48736:11: Bye Bye [preauth]
Sep 30 18:13:13 compute-1 sshd-session[271245]: Disconnected from invalid user elemental 175.126.165.170 port 48736 [preauth]
Sep 30 18:13:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:13 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:14 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:15 compute-1 sshd-session[271251]: Invalid user reelforge from 113.249.93.94 port 46752
Sep 30 18:13:15 compute-1 sshd-session[271251]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:13:15 compute-1 sshd-session[271251]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=113.249.93.94
Sep 30 18:13:15 compute-1 ceph-mon[75484]: pgmap v1035: 353 pgs: 353 active+clean; 248 MiB data, 307 MiB used, 40 GiB / 40 GiB avail; 24 KiB/s rd, 2.2 MiB/s wr, 39 op/s
Sep 30 18:13:15 compute-1 podman[271256]: 2025-09-30 18:13:15.196494452 +0000 UTC m=+0.077907031 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:13:15 compute-1 podman[271255]: 2025-09-30 18:13:15.246448084 +0000 UTC m=+0.126748193 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Sep 30 18:13:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:15.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:15.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:15 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:16 compute-1 nova_compute[238822]: 2025-09-30 18:13:16.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:16 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:16 compute-1 nova_compute[238822]: 2025-09-30 18:13:16.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:16 compute-1 sshd-session[271251]: Failed password for invalid user reelforge from 113.249.93.94 port 46752 ssh2
Sep 30 18:13:17 compute-1 ceph-mon[75484]: pgmap v1036: 353 pgs: 353 active+clean; 248 MiB data, 307 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 2.1 MiB/s wr, 38 op/s
Sep 30 18:13:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:17.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:17 compute-1 sshd-session[271305]: Invalid user jenkins from 194.107.115.65 port 23678
Sep 30 18:13:17 compute-1 sshd-session[271305]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:13:17 compute-1 sshd-session[271305]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 18:13:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:17.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:17 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:13:18 compute-1 sshd-session[271251]: Received disconnect from 113.249.93.94 port 46752:11: Bye Bye [preauth]
Sep 30 18:13:18 compute-1 sshd-session[271251]: Disconnected from invalid user reelforge 113.249.93.94 port 46752 [preauth]
Sep 30 18:13:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:18 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:19 compute-1 sshd-session[271305]: Failed password for invalid user jenkins from 194.107.115.65 port 23678 ssh2
Sep 30 18:13:19 compute-1 ceph-mon[75484]: pgmap v1037: 353 pgs: 353 active+clean; 248 MiB data, 307 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 33 op/s
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.204455) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255999204533, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2108, "num_deletes": 502, "total_data_size": 4647405, "memory_usage": 4714856, "flush_reason": "Manual Compaction"}
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255999216919, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 2725194, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29213, "largest_seqno": 31316, "table_properties": {"data_size": 2717285, "index_size": 4211, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 20928, "raw_average_key_size": 20, "raw_value_size": 2699017, "raw_average_value_size": 2590, "num_data_blocks": 185, "num_entries": 1042, "num_filter_entries": 1042, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759255846, "oldest_key_time": 1759255846, "file_creation_time": 1759255999, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 12499 microseconds, and 6292 cpu microseconds.
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.216966) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 2725194 bytes OK
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.216985) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.218822) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.218834) EVENT_LOG_v1 {"time_micros": 1759255999218830, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.218855) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 4636923, prev total WAL file size 4636923, number of live WAL files 2.
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.219786) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(2661KB)], [57(13MB)]
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255999219819, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 16549611, "oldest_snapshot_seqno": -1}
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5543 keys, 11017023 bytes, temperature: kUnknown
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255999273377, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 11017023, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10981571, "index_size": 20494, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13893, "raw_key_size": 142964, "raw_average_key_size": 25, "raw_value_size": 10882737, "raw_average_value_size": 1963, "num_data_blocks": 827, "num_entries": 5543, "num_filter_entries": 5543, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759255999, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.273847) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 11017023 bytes
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.276241) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 308.3 rd, 205.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 13.2 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(10.1) write-amplify(4.0) OK, records in: 6551, records dropped: 1008 output_compression: NoCompression
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.276276) EVENT_LOG_v1 {"time_micros": 1759255999276260, "job": 34, "event": "compaction_finished", "compaction_time_micros": 53674, "compaction_time_cpu_micros": 23294, "output_level": 6, "num_output_files": 1, "total_output_size": 11017023, "num_input_records": 6551, "num_output_records": 5543, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255999277476, "job": 34, "event": "table_file_deletion", "file_number": 59}
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759255999286152, "job": 34, "event": "table_file_deletion", "file_number": 57}
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.219722) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.286270) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.286276) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.286278) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.286279) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:13:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:13:19.286280) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:13:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:13:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:19.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:13:19 compute-1 openstack_network_exporter[251957]: ERROR   18:13:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:13:19 compute-1 openstack_network_exporter[251957]: ERROR   18:13:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:13:19 compute-1 openstack_network_exporter[251957]: ERROR   18:13:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:13:19 compute-1 openstack_network_exporter[251957]: ERROR   18:13:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:13:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:13:19 compute-1 openstack_network_exporter[251957]: ERROR   18:13:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:13:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:13:19 compute-1 nova_compute[238822]: 2025-09-30 18:13:19.578 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "aa43d689-5cfc-489b-9635-36978f36b08c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:13:19 compute-1 nova_compute[238822]: 2025-09-30 18:13:19.579 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:13:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:19.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:19 compute-1 sshd-session[271305]: Received disconnect from 194.107.115.65 port 23678:11: Bye Bye [preauth]
Sep 30 18:13:19 compute-1 sshd-session[271305]: Disconnected from invalid user jenkins 194.107.115.65 port 23678 [preauth]
Sep 30 18:13:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:19 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:20 compute-1 nova_compute[238822]: 2025-09-30 18:13:20.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:13:20 compute-1 nova_compute[238822]: 2025-09-30 18:13:20.087 2 DEBUG nova.compute.manager [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:13:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:20 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:20 compute-1 nova_compute[238822]: 2025-09-30 18:13:20.644 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:13:20 compute-1 nova_compute[238822]: 2025-09-30 18:13:20.645 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:13:20 compute-1 nova_compute[238822]: 2025-09-30 18:13:20.657 2 DEBUG nova.virt.hardware [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:13:20 compute-1 nova_compute[238822]: 2025-09-30 18:13:20.657 2 INFO nova.compute.claims [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:13:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:20.765 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:13:21 compute-1 ceph-mon[75484]: pgmap v1038: 353 pgs: 353 active+clean; 248 MiB data, 307 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 103 op/s
Sep 30 18:13:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:21.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:21 compute-1 nova_compute[238822]: 2025-09-30 18:13:21.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:21.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:21 compute-1 nova_compute[238822]: 2025-09-30 18:13:21.731 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:13:21 compute-1 nova_compute[238822]: 2025-09-30 18:13:21.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:21 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:22 compute-1 nova_compute[238822]: 2025-09-30 18:13:22.254 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.523s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:13:22 compute-1 nova_compute[238822]: 2025-09-30 18:13:22.261 2 DEBUG nova.compute.provider_tree [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:13:22 compute-1 ceph-mon[75484]: pgmap v1039: 353 pgs: 353 active+clean; 248 MiB data, 307 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 24 KiB/s wr, 75 op/s
Sep 30 18:13:22 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/152161458' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:13:22 compute-1 podman[271335]: 2025-09-30 18:13:22.520289786 +0000 UTC m=+0.056244654 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 18:13:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:22 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:22 compute-1 nova_compute[238822]: 2025-09-30 18:13:22.778 2 DEBUG nova.scheduler.client.report [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:13:23 compute-1 nova_compute[238822]: 2025-09-30 18:13:23.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:13:23 compute-1 nova_compute[238822]: 2025-09-30 18:13:23.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:13:23 compute-1 nova_compute[238822]: 2025-09-30 18:13:23.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:13:23 compute-1 nova_compute[238822]: 2025-09-30 18:13:23.290 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.645s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:13:23 compute-1 nova_compute[238822]: 2025-09-30 18:13:23.291 2 DEBUG nova.compute.manager [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:13:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:13:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:23.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:13:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:23.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:23 compute-1 nova_compute[238822]: 2025-09-30 18:13:23.802 2 DEBUG nova.compute.manager [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:13:23 compute-1 nova_compute[238822]: 2025-09-30 18:13:23.803 2 DEBUG nova.network.neutron [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:13:23 compute-1 nova_compute[238822]: 2025-09-30 18:13:23.804 2 WARNING neutronclient.v2_0.client [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:13:23 compute-1 nova_compute[238822]: 2025-09-30 18:13:23.805 2 WARNING neutronclient.v2_0.client [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:13:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:23 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098003390 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:24 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2807150272' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:13:24 compute-1 ceph-mon[75484]: pgmap v1040: 353 pgs: 353 active+clean; 248 MiB data, 307 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 25 KiB/s wr, 75 op/s
Sep 30 18:13:24 compute-1 nova_compute[238822]: 2025-09-30 18:13:24.315 2 INFO nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:13:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:24 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:24 compute-1 nova_compute[238822]: 2025-09-30 18:13:24.823 2 DEBUG nova.compute.manager [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:13:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:25.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:25.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:25 compute-1 nova_compute[238822]: 2025-09-30 18:13:25.846 2 DEBUG nova.compute.manager [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:13:25 compute-1 nova_compute[238822]: 2025-09-30 18:13:25.848 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:13:25 compute-1 nova_compute[238822]: 2025-09-30 18:13:25.849 2 INFO nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Creating image(s)
Sep 30 18:13:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:25 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:25 compute-1 nova_compute[238822]: 2025-09-30 18:13:25.883 2 DEBUG nova.storage.rbd_utils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image aa43d689-5cfc-489b-9635-36978f36b08c_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:13:25 compute-1 nova_compute[238822]: 2025-09-30 18:13:25.921 2 DEBUG nova.storage.rbd_utils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image aa43d689-5cfc-489b-9635-36978f36b08c_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:13:25 compute-1 nova_compute[238822]: 2025-09-30 18:13:25.958 2 DEBUG nova.storage.rbd_utils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image aa43d689-5cfc-489b-9635-36978f36b08c_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:13:25 compute-1 nova_compute[238822]: 2025-09-30 18:13:25.964 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:13:25 compute-1 nova_compute[238822]: 2025-09-30 18:13:25.980 2 DEBUG nova.network.neutron [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Successfully created port: 9e86d507-897e-4992-a0d4-ef24306047ab _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.051 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.052 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.052 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.053 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.082 2 DEBUG nova.storage.rbd_utils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image aa43d689-5cfc-489b-9635-36978f36b08c_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.087 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 aa43d689-5cfc-489b-9635-36978f36b08c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.100 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.346 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 aa43d689-5cfc-489b-9635-36978f36b08c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.260s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.417 2 DEBUG nova.storage.rbd_utils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] resizing rbd image aa43d689-5cfc-489b-9635-36978f36b08c_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.533 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.534 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Ensure instance console log exists: /var/lib/nova/instances/aa43d689-5cfc-489b-9635-36978f36b08c/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.534 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.534 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.534 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:13:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:26 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:26 compute-1 nova_compute[238822]: 2025-09-30 18:13:26.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:27 compute-1 nova_compute[238822]: 2025-09-30 18:13:27.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:13:27 compute-1 ceph-mon[75484]: pgmap v1041: 353 pgs: 353 active+clean; 248 MiB data, 307 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 8.7 KiB/s wr, 70 op/s
Sep 30 18:13:27 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/636359647' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:13:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:13:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:27.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:13:27 compute-1 nova_compute[238822]: 2025-09-30 18:13:27.440 2 DEBUG nova.network.neutron [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Successfully updated port: 9e86d507-897e-4992-a0d4-ef24306047ab _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:13:27 compute-1 nova_compute[238822]: 2025-09-30 18:13:27.516 2 DEBUG nova.compute.manager [req-6e2aded0-8be5-418d-bcc5-15f15e7cb9f1 req-a95d567c-c893-411a-8695-d3c708df931d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Received event network-changed-9e86d507-897e-4992-a0d4-ef24306047ab external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:13:27 compute-1 nova_compute[238822]: 2025-09-30 18:13:27.517 2 DEBUG nova.compute.manager [req-6e2aded0-8be5-418d-bcc5-15f15e7cb9f1 req-a95d567c-c893-411a-8695-d3c708df931d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Refreshing instance network info cache due to event network-changed-9e86d507-897e-4992-a0d4-ef24306047ab. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:13:27 compute-1 nova_compute[238822]: 2025-09-30 18:13:27.517 2 DEBUG oslo_concurrency.lockutils [req-6e2aded0-8be5-418d-bcc5-15f15e7cb9f1 req-a95d567c-c893-411a-8695-d3c708df931d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-aa43d689-5cfc-489b-9635-36978f36b08c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:13:27 compute-1 nova_compute[238822]: 2025-09-30 18:13:27.518 2 DEBUG oslo_concurrency.lockutils [req-6e2aded0-8be5-418d-bcc5-15f15e7cb9f1 req-a95d567c-c893-411a-8695-d3c708df931d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-aa43d689-5cfc-489b-9635-36978f36b08c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:13:27 compute-1 nova_compute[238822]: 2025-09-30 18:13:27.518 2 DEBUG nova.network.neutron [req-6e2aded0-8be5-418d-bcc5-15f15e7cb9f1 req-a95d567c-c893-411a-8695-d3c708df931d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Refreshing network info cache for port 9e86d507-897e-4992-a0d4-ef24306047ab _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:13:27 compute-1 nova_compute[238822]: 2025-09-30 18:13:27.567 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:13:27 compute-1 nova_compute[238822]: 2025-09-30 18:13:27.568 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:13:27 compute-1 nova_compute[238822]: 2025-09-30 18:13:27.569 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:13:27 compute-1 nova_compute[238822]: 2025-09-30 18:13:27.569 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:13:27 compute-1 nova_compute[238822]: 2025-09-30 18:13:27.569 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:13:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:27.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:27 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:27 compute-1 nova_compute[238822]: 2025-09-30 18:13:27.949 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "refresh_cache-aa43d689-5cfc-489b-9635-36978f36b08c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:13:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:13:28 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/117727758' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:13:28 compute-1 nova_compute[238822]: 2025-09-30 18:13:28.025 2 WARNING neutronclient.v2_0.client [req-6e2aded0-8be5-418d-bcc5-15f15e7cb9f1 req-a95d567c-c893-411a-8695-d3c708df931d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:13:28 compute-1 nova_compute[238822]: 2025-09-30 18:13:28.047 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:13:28 compute-1 nova_compute[238822]: 2025-09-30 18:13:28.104 2 DEBUG nova.network.neutron [req-6e2aded0-8be5-418d-bcc5-15f15e7cb9f1 req-a95d567c-c893-411a-8695-d3c708df931d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:13:28 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/117727758' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:13:28 compute-1 podman[271548]: 2025-09-30 18:13:28.541949864 +0000 UTC m=+0.080071089 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 18:13:28 compute-1 podman[271549]: 2025-09-30 18:13:28.555313306 +0000 UTC m=+0.093727119 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Sep 30 18:13:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:13:28 compute-1 podman[271550]: 2025-09-30 18:13:28.575195604 +0000 UTC m=+0.099118545 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Sep 30 18:13:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:28 compute-1 nova_compute[238822]: 2025-09-30 18:13:28.676 2 DEBUG nova.network.neutron [req-6e2aded0-8be5-418d-bcc5-15f15e7cb9f1 req-a95d567c-c893-411a-8695-d3c708df931d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:13:29 compute-1 nova_compute[238822]: 2025-09-30 18:13:29.089 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000005 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:13:29 compute-1 nova_compute[238822]: 2025-09-30 18:13:29.090 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000005 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:13:29 compute-1 ceph-mon[75484]: pgmap v1042: 353 pgs: 353 active+clean; 248 MiB data, 307 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 8.7 KiB/s wr, 70 op/s
Sep 30 18:13:29 compute-1 nova_compute[238822]: 2025-09-30 18:13:29.182 2 DEBUG oslo_concurrency.lockutils [req-6e2aded0-8be5-418d-bcc5-15f15e7cb9f1 req-a95d567c-c893-411a-8695-d3c708df931d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-aa43d689-5cfc-489b-9635-36978f36b08c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:13:29 compute-1 nova_compute[238822]: 2025-09-30 18:13:29.184 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquired lock "refresh_cache-aa43d689-5cfc-489b-9635-36978f36b08c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:13:29 compute-1 nova_compute[238822]: 2025-09-30 18:13:29.184 2 DEBUG nova.network.neutron [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:13:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000055s ======
Sep 30 18:13:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:29.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Sep 30 18:13:29 compute-1 nova_compute[238822]: 2025-09-30 18:13:29.332 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:13:29 compute-1 nova_compute[238822]: 2025-09-30 18:13:29.333 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:13:29 compute-1 nova_compute[238822]: 2025-09-30 18:13:29.376 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:13:29 compute-1 nova_compute[238822]: 2025-09-30 18:13:29.379 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4507MB free_disk=39.88016128540039GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:13:29 compute-1 nova_compute[238822]: 2025-09-30 18:13:29.379 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:13:29 compute-1 nova_compute[238822]: 2025-09-30 18:13:29.380 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:13:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:29.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:29 compute-1 nova_compute[238822]: 2025-09-30 18:13:29.784 2 DEBUG nova.network.neutron [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:13:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:29 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:29 compute-1 sudo[271612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:13:29 compute-1 sudo[271612]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:13:29 compute-1 sudo[271612]: pam_unix(sudo:session): session closed for user root
Sep 30 18:13:29 compute-1 nova_compute[238822]: 2025-09-30 18:13:29.974 2 WARNING neutronclient.v2_0.client [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.116 2 DEBUG nova.network.neutron [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Updating instance_info_cache with network_info: [{"id": "9e86d507-897e-4992-a0d4-ef24306047ab", "address": "fa:16:3e:f4:eb:b4", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e86d507-89", "ovs_interfaceid": "9e86d507-897e-4992-a0d4-ef24306047ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.425 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance dadc55d4-1578-4dc1-880a-08098fba63ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.425 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance aa43d689-5cfc-489b-9635-36978f36b08c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.425 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.426 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=39GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:13:29 up  3:50,  0 user,  load average: 0.36, 1.05, 1.23\n', 'num_instances': '2', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_ddd1f985d8b64b449c79d55b0cbd6422': '2', 'io_workload': '1', 'num_vm_building': '1', 'num_task_spawning': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.463 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:13:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:30 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.623 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Releasing lock "refresh_cache-aa43d689-5cfc-489b-9635-36978f36b08c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.625 2 DEBUG nova.compute.manager [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Instance network_info: |[{"id": "9e86d507-897e-4992-a0d4-ef24306047ab", "address": "fa:16:3e:f4:eb:b4", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e86d507-89", "ovs_interfaceid": "9e86d507-897e-4992-a0d4-ef24306047ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.629 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Start _get_guest_xml network_info=[{"id": "9e86d507-897e-4992-a0d4-ef24306047ab", "address": "fa:16:3e:f4:eb:b4", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e86d507-89", "ovs_interfaceid": "9e86d507-897e-4992-a0d4-ef24306047ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.635 2 WARNING nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.638 2 DEBUG nova.virt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1527130227', uuid='aa43d689-5cfc-489b-9635-36978f36b08c'), owner=OwnerMeta(userid='dc3bb71c425f484fbc46f90978029403', username='tempest-TestExecuteActionsViaActuator-837729328-project-admin', projectid='ddd1f985d8b64b449c79d55b0cbd6422', projectname='tempest-TestExecuteActionsViaActuator-837729328'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "9e86d507-897e-4992-a0d4-ef24306047ab", "address": "fa:16:3e:f4:eb:b4", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e86d507-89", "ovs_interfaceid": "9e86d507-897e-4992-a0d4-ef24306047ab", 
"qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759256010.637926) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.643 2 DEBUG nova.virt.libvirt.host [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.644 2 DEBUG nova.virt.libvirt.host [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.648 2 DEBUG nova.virt.libvirt.host [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.648 2 DEBUG nova.virt.libvirt.host [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.649 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.650 2 DEBUG nova.virt.hardware [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.650 2 DEBUG nova.virt.hardware [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.651 2 DEBUG nova.virt.hardware [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.651 2 DEBUG nova.virt.hardware [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.651 2 DEBUG nova.virt.hardware [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.652 2 DEBUG nova.virt.hardware [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.652 2 DEBUG nova.virt.hardware [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.652 2 DEBUG nova.virt.hardware [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.653 2 DEBUG nova.virt.hardware [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.653 2 DEBUG nova.virt.hardware [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.654 2 DEBUG nova.virt.hardware [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:13:30 compute-1 nova_compute[238822]: 2025-09-30 18:13:30.659 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:13:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:30 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:13:30 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3423975977' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:13:31 compute-1 nova_compute[238822]: 2025-09-30 18:13:31.011 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.548s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:13:31 compute-1 nova_compute[238822]: 2025-09-30 18:13:31.016 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:13:31 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:13:31 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/814964195' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:13:31 compute-1 nova_compute[238822]: 2025-09-30 18:13:31.154 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:13:31 compute-1 ceph-mon[75484]: pgmap v1043: 353 pgs: 353 active+clean; 327 MiB data, 353 MiB used, 40 GiB / 40 GiB avail; 2.3 MiB/s rd, 3.9 MiB/s wr, 162 op/s
Sep 30 18:13:31 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3423975977' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:13:31 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/814964195' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:13:31 compute-1 nova_compute[238822]: 2025-09-30 18:13:31.195 2 DEBUG nova.storage.rbd_utils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image aa43d689-5cfc-489b-9635-36978f36b08c_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:13:31 compute-1 nova_compute[238822]: 2025-09-30 18:13:31.200 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:13:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:31.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:31 compute-1 nova_compute[238822]: 2025-09-30 18:13:31.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:31 compute-1 unix_chkpwd[271722]: password check failed for user (root)
Sep 30 18:13:31 compute-1 sshd-session[271611]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:13:31 compute-1 nova_compute[238822]: 2025-09-30 18:13:31.526 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:13:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:31.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:31 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:13:31 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2691203577' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:13:31 compute-1 nova_compute[238822]: 2025-09-30 18:13:31.744 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:13:31 compute-1 nova_compute[238822]: 2025-09-30 18:13:31.746 2 DEBUG nova.virt.libvirt.vif [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:13:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1527130227',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1527130227',id=7,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-yipxsciv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteActionsViaActuator-837729328-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:13:24Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=aa43d689-5cfc-489b-9635-36978f36b08c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e86d507-897e-4992-a0d4-ef24306047ab", "address": "fa:16:3e:f4:eb:b4", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e86d507-89", "ovs_interfaceid": "9e86d507-897e-4992-a0d4-ef24306047ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:13:31 compute-1 nova_compute[238822]: 2025-09-30 18:13:31.746 2 DEBUG nova.network.os_vif_util [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converting VIF {"id": "9e86d507-897e-4992-a0d4-ef24306047ab", "address": "fa:16:3e:f4:eb:b4", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e86d507-89", "ovs_interfaceid": "9e86d507-897e-4992-a0d4-ef24306047ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:13:31 compute-1 nova_compute[238822]: 2025-09-30 18:13:31.747 2 DEBUG nova.network.os_vif_util [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:eb:b4,bridge_name='br-int',has_traffic_filtering=True,id=9e86d507-897e-4992-a0d4-ef24306047ab,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e86d507-89') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:13:31 compute-1 nova_compute[238822]: 2025-09-30 18:13:31.748 2 DEBUG nova.objects.instance [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lazy-loading 'pci_devices' on Instance uuid aa43d689-5cfc-489b-9635-36978f36b08c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:13:31 compute-1 nova_compute[238822]: 2025-09-30 18:13:31.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:31 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.038 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.039 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.659s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:13:32 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2691203577' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.258 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:13:32 compute-1 nova_compute[238822]:   <uuid>aa43d689-5cfc-489b-9635-36978f36b08c</uuid>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   <name>instance-00000007</name>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1527130227</nova:name>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:13:30</nova:creationTime>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:13:32 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:13:32 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:user uuid="dc3bb71c425f484fbc46f90978029403">tempest-TestExecuteActionsViaActuator-837729328-project-admin</nova:user>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:project uuid="ddd1f985d8b64b449c79d55b0cbd6422">tempest-TestExecuteActionsViaActuator-837729328</nova:project>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <nova:port uuid="9e86d507-897e-4992-a0d4-ef24306047ab">
Sep 30 18:13:32 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <system>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <entry name="serial">aa43d689-5cfc-489b-9635-36978f36b08c</entry>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <entry name="uuid">aa43d689-5cfc-489b-9635-36978f36b08c</entry>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     </system>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   <os>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   </os>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   <features>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   </features>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/aa43d689-5cfc-489b-9635-36978f36b08c_disk">
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       </source>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/aa43d689-5cfc-489b-9635-36978f36b08c_disk.config">
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       </source>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:13:32 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:f4:eb:b4"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <target dev="tap9e86d507-89"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/aa43d689-5cfc-489b-9635-36978f36b08c/console.log" append="off"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <video>
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     </video>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:13:32 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:13:32 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:13:32 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:13:32 compute-1 nova_compute[238822]: </domain>
Sep 30 18:13:32 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.258 2 DEBUG nova.compute.manager [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Preparing to wait for external event network-vif-plugged-9e86d507-897e-4992-a0d4-ef24306047ab prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.259 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.259 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.259 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.260 2 DEBUG nova.virt.libvirt.vif [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:13:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1527130227',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1527130227',id=7,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-yipxsciv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteActionsViaActuator-837729328-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:13:24Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=aa43d689-5cfc-489b-9635-36978f36b08c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e86d507-897e-4992-a0d4-ef24306047ab", "address": "fa:16:3e:f4:eb:b4", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e86d507-89", "ovs_interfaceid": "9e86d507-897e-4992-a0d4-ef24306047ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.260 2 DEBUG nova.network.os_vif_util [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converting VIF {"id": "9e86d507-897e-4992-a0d4-ef24306047ab", "address": "fa:16:3e:f4:eb:b4", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e86d507-89", "ovs_interfaceid": "9e86d507-897e-4992-a0d4-ef24306047ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.261 2 DEBUG nova.network.os_vif_util [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:eb:b4,bridge_name='br-int',has_traffic_filtering=True,id=9e86d507-897e-4992-a0d4-ef24306047ab,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e86d507-89') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.261 2 DEBUG os_vif [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:eb:b4,bridge_name='br-int',has_traffic_filtering=True,id=9e86d507-897e-4992-a0d4-ef24306047ab,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e86d507-89') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.262 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.262 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.263 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '200948a8-af65-5676-9a10-4f74f6183552', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.276 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e86d507-89, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.276 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap9e86d507-89, col_values=(('qos', UUID('7cc5d503-2af9-4952-b7a1-828049921236')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.277 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap9e86d507-89, col_values=(('external_ids', {'iface-id': '9e86d507-897e-4992-a0d4-ef24306047ab', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:eb:b4', 'vm-uuid': 'aa43d689-5cfc-489b-9635-36978f36b08c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:32 compute-1 NetworkManager[45549]: <info>  [1759256012.2811] manager: (tap9e86d507-89): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:32 compute-1 nova_compute[238822]: 2025-09-30 18:13:32.288 2 INFO os_vif [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:eb:b4,bridge_name='br-int',has_traffic_filtering=True,id=9e86d507-897e-4992-a0d4-ef24306047ab,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e86d507-89')
Sep 30 18:13:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:32 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:33 compute-1 ceph-mon[75484]: pgmap v1044: 353 pgs: 353 active+clean; 327 MiB data, 353 MiB used, 40 GiB / 40 GiB avail; 382 KiB/s rd, 3.9 MiB/s wr, 92 op/s
Sep 30 18:13:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:33.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:13:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:33.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:33 compute-1 sshd-session[271611]: Failed password for root from 192.210.160.141 port 37760 ssh2
Sep 30 18:13:33 compute-1 nova_compute[238822]: 2025-09-30 18:13:33.837 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:13:33 compute-1 nova_compute[238822]: 2025-09-30 18:13:33.837 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:13:33 compute-1 nova_compute[238822]: 2025-09-30 18:13:33.837 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] No VIF found with MAC fa:16:3e:f4:eb:b4, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:13:33 compute-1 nova_compute[238822]: 2025-09-30 18:13:33.838 2 INFO nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Using config drive
Sep 30 18:13:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:33 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:33 compute-1 nova_compute[238822]: 2025-09-30 18:13:33.870 2 DEBUG nova.storage.rbd_utils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image aa43d689-5cfc-489b-9635-36978f36b08c_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:13:34 compute-1 nova_compute[238822]: 2025-09-30 18:13:34.039 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:13:34 compute-1 nova_compute[238822]: 2025-09-30 18:13:34.040 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:13:34 compute-1 nova_compute[238822]: 2025-09-30 18:13:34.041 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:13:34 compute-1 nova_compute[238822]: 2025-09-30 18:13:34.041 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:13:34 compute-1 nova_compute[238822]: 2025-09-30 18:13:34.387 2 WARNING neutronclient.v2_0.client [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:13:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:34 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:34 compute-1 sshd-session[271611]: Connection closed by authenticating user root 192.210.160.141 port 37760 [preauth]
Sep 30 18:13:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:34 compute-1 nova_compute[238822]: 2025-09-30 18:13:34.845 2 INFO nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Creating config drive at /var/lib/nova/instances/aa43d689-5cfc-489b-9635-36978f36b08c/disk.config
Sep 30 18:13:34 compute-1 nova_compute[238822]: 2025-09-30 18:13:34.856 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aa43d689-5cfc-489b-9635-36978f36b08c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpdbjuuikw execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.003 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aa43d689-5cfc-489b-9635-36978f36b08c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpdbjuuikw" returned: 0 in 0.147s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.032 2 DEBUG nova.storage.rbd_utils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image aa43d689-5cfc-489b-9635-36978f36b08c_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.037 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/aa43d689-5cfc-489b-9635-36978f36b08c/disk.config aa43d689-5cfc-489b-9635-36978f36b08c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.198 2 DEBUG oslo_concurrency.processutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/aa43d689-5cfc-489b-9635-36978f36b08c/disk.config aa43d689-5cfc-489b-9635-36978f36b08c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.161s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.199 2 INFO nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Deleting local config drive /var/lib/nova/instances/aa43d689-5cfc-489b-9635-36978f36b08c/disk.config because it was imported into RBD.
Sep 30 18:13:35 compute-1 ceph-mon[75484]: pgmap v1045: 353 pgs: 353 active+clean; 327 MiB data, 353 MiB used, 40 GiB / 40 GiB avail; 382 KiB/s rd, 3.9 MiB/s wr, 93 op/s
Sep 30 18:13:35 compute-1 NetworkManager[45549]: <info>  [1759256015.2454] manager: (tap9e86d507-89): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Sep 30 18:13:35 compute-1 kernel: tap9e86d507-89: entered promiscuous mode
Sep 30 18:13:35 compute-1 ovn_controller[135204]: 2025-09-30T18:13:35Z|00054|binding|INFO|Claiming lport 9e86d507-897e-4992-a0d4-ef24306047ab for this chassis.
Sep 30 18:13:35 compute-1 ovn_controller[135204]: 2025-09-30T18:13:35Z|00055|binding|INFO|9e86d507-897e-4992-a0d4-ef24306047ab: Claiming fa:16:3e:f4:eb:b4 10.100.0.10
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.254 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:eb:b4 10.100.0.10'], port_security=['fa:16:3e:f4:eb:b4 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'aa43d689-5cfc-489b-9635-36978f36b08c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddd1f985d8b64b449c79d55b0cbd6422', 'neutron:revision_number': '4', 'neutron:security_group_ids': '34f3cf7b-94cf-408f-b3dc-ae0b57c009fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12c18a77-b252-4a3e-a181-b42644879446, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=9e86d507-897e-4992-a0d4-ef24306047ab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.255 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 9e86d507-897e-4992-a0d4-ef24306047ab in datapath 5fff1904-159a-4b76-8c46-feabf17f29ab bound to our chassis
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.257 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5fff1904-159a-4b76-8c46-feabf17f29ab
Sep 30 18:13:35 compute-1 ovn_controller[135204]: 2025-09-30T18:13:35Z|00056|binding|INFO|Setting lport 9e86d507-897e-4992-a0d4-ef24306047ab ovn-installed in OVS
Sep 30 18:13:35 compute-1 ovn_controller[135204]: 2025-09-30T18:13:35Z|00057|binding|INFO|Setting lport 9e86d507-897e-4992-a0d4-ef24306047ab up in Southbound
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:35 compute-1 systemd-udevd[271802]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.278 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[286d579a-3b49-436b-8c96-f9cfe807a941]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:13:35 compute-1 systemd-machined[195911]: New machine qemu-3-instance-00000007.
Sep 30 18:13:35 compute-1 NetworkManager[45549]: <info>  [1759256015.2902] device (tap9e86d507-89): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:13:35 compute-1 NetworkManager[45549]: <info>  [1759256015.2912] device (tap9e86d507-89): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:13:35 compute-1 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Sep 30 18:13:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:35.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.314 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[edb58af9-2be0-4b65-a857-ca7c38d9cc7d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.317 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[f67af163-3c2a-40de-a3ef-fe5e5375bcf6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.343 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[8d20c619-33ad-4592-971f-c9d614bda160]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.358 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[61a0e7a2-6cd7-4e84-8782-e79e84b6755e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5fff1904-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:07:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1377603, 'reachable_time': 37514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271814, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.373 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4af72633-c35b-44aa-a3de-902c5f51775d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377619, 'tstamp': 1377619}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271816, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377623, 'tstamp': 1377623}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271816, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.374 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5fff1904-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.376 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5fff1904-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.377 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.377 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5fff1904-10, col_values=(('external_ids', {'iface-id': '3a8ea0a0-c179-4516-9404-04b68a17e79e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.377 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:13:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:35.378 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5d92e0df-e7c8-4459-8ae1-21dccf2d7f9e]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-5fff1904-159a-4b76-8c46-feabf17f29ab\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 5fff1904-159a-4b76-8c46-feabf17f29ab\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.450 2 DEBUG nova.compute.manager [req-78e72ea1-b252-44a3-ab7e-e8ad2ea096a5 req-ac63a7ca-560e-4ec2-9cd5-8ad2a774ad88 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Received event network-vif-plugged-9e86d507-897e-4992-a0d4-ef24306047ab external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.450 2 DEBUG oslo_concurrency.lockutils [req-78e72ea1-b252-44a3-ab7e-e8ad2ea096a5 req-ac63a7ca-560e-4ec2-9cd5-8ad2a774ad88 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.450 2 DEBUG oslo_concurrency.lockutils [req-78e72ea1-b252-44a3-ab7e-e8ad2ea096a5 req-ac63a7ca-560e-4ec2-9cd5-8ad2a774ad88 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.451 2 DEBUG oslo_concurrency.lockutils [req-78e72ea1-b252-44a3-ab7e-e8ad2ea096a5 req-ac63a7ca-560e-4ec2-9cd5-8ad2a774ad88 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:13:35 compute-1 nova_compute[238822]: 2025-09-30 18:13:35.451 2 DEBUG nova.compute.manager [req-78e72ea1-b252-44a3-ab7e-e8ad2ea096a5 req-ac63a7ca-560e-4ec2-9cd5-8ad2a774ad88 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Processing event network-vif-plugged-9e86d507-897e-4992-a0d4-ef24306047ab _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:13:35 compute-1 podman[249638]: time="2025-09-30T18:13:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:13:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:13:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39665 "" "Go-http-client/1.1"
Sep 30 18:13:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:13:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9276 "" "Go-http-client/1.1"
Sep 30 18:13:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:35.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:35 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:36 compute-1 nova_compute[238822]: 2025-09-30 18:13:36.108 2 DEBUG nova.compute.manager [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:13:36 compute-1 nova_compute[238822]: 2025-09-30 18:13:36.111 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:13:36 compute-1 nova_compute[238822]: 2025-09-30 18:13:36.115 2 INFO nova.virt.libvirt.driver [-] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Instance spawned successfully.
Sep 30 18:13:36 compute-1 nova_compute[238822]: 2025-09-30 18:13:36.116 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:13:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:36 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:36 compute-1 nova_compute[238822]: 2025-09-30 18:13:36.629 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:13:36 compute-1 nova_compute[238822]: 2025-09-30 18:13:36.629 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:13:36 compute-1 nova_compute[238822]: 2025-09-30 18:13:36.630 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:13:36 compute-1 nova_compute[238822]: 2025-09-30 18:13:36.630 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:13:36 compute-1 nova_compute[238822]: 2025-09-30 18:13:36.631 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:13:36 compute-1 nova_compute[238822]: 2025-09-30 18:13:36.631 2 DEBUG nova.virt.libvirt.driver [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:13:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:36 compute-1 nova_compute[238822]: 2025-09-30 18:13:36.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:37 compute-1 nova_compute[238822]: 2025-09-30 18:13:37.141 2 INFO nova.compute.manager [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Took 11.29 seconds to spawn the instance on the hypervisor.
Sep 30 18:13:37 compute-1 nova_compute[238822]: 2025-09-30 18:13:37.141 2 DEBUG nova.compute.manager [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:13:37 compute-1 ceph-mon[75484]: pgmap v1046: 353 pgs: 353 active+clean; 327 MiB data, 353 MiB used, 40 GiB / 40 GiB avail; 382 KiB/s rd, 3.9 MiB/s wr, 92 op/s
Sep 30 18:13:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/26377187' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:13:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/26377187' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:13:37 compute-1 nova_compute[238822]: 2025-09-30 18:13:37.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:37.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:37 compute-1 nova_compute[238822]: 2025-09-30 18:13:37.514 2 DEBUG nova.compute.manager [req-b99ab05a-3d58-463a-87f0-3d8c3ba1d73a req-6d988423-5895-480e-b0db-5d441a69f4ff 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Received event network-vif-plugged-9e86d507-897e-4992-a0d4-ef24306047ab external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:13:37 compute-1 nova_compute[238822]: 2025-09-30 18:13:37.514 2 DEBUG oslo_concurrency.lockutils [req-b99ab05a-3d58-463a-87f0-3d8c3ba1d73a req-6d988423-5895-480e-b0db-5d441a69f4ff 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:13:37 compute-1 nova_compute[238822]: 2025-09-30 18:13:37.515 2 DEBUG oslo_concurrency.lockutils [req-b99ab05a-3d58-463a-87f0-3d8c3ba1d73a req-6d988423-5895-480e-b0db-5d441a69f4ff 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:13:37 compute-1 nova_compute[238822]: 2025-09-30 18:13:37.515 2 DEBUG oslo_concurrency.lockutils [req-b99ab05a-3d58-463a-87f0-3d8c3ba1d73a req-6d988423-5895-480e-b0db-5d441a69f4ff 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:13:37 compute-1 nova_compute[238822]: 2025-09-30 18:13:37.516 2 DEBUG nova.compute.manager [req-b99ab05a-3d58-463a-87f0-3d8c3ba1d73a req-6d988423-5895-480e-b0db-5d441a69f4ff 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] No waiting events found dispatching network-vif-plugged-9e86d507-897e-4992-a0d4-ef24306047ab pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:13:37 compute-1 nova_compute[238822]: 2025-09-30 18:13:37.516 2 WARNING nova.compute.manager [req-b99ab05a-3d58-463a-87f0-3d8c3ba1d73a req-6d988423-5895-480e-b0db-5d441a69f4ff 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Received unexpected event network-vif-plugged-9e86d507-897e-4992-a0d4-ef24306047ab for instance with vm_state active and task_state None.
Sep 30 18:13:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:37 compute-1 nova_compute[238822]: 2025-09-30 18:13:37.684 2 INFO nova.compute.manager [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Took 17.09 seconds to build instance.
Sep 30 18:13:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:37.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:37 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:38 compute-1 nova_compute[238822]: 2025-09-30 18:13:38.191 2 DEBUG oslo_concurrency.lockutils [None req-4359b516-987d-4fac-adab-4053b0225ce4 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.612s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:13:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:13:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:13:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:38 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:39 compute-1 ceph-mon[75484]: pgmap v1047: 353 pgs: 353 active+clean; 327 MiB data, 353 MiB used, 40 GiB / 40 GiB avail; 382 KiB/s rd, 3.9 MiB/s wr, 92 op/s
Sep 30 18:13:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:39.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:39.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:39 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c40089f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:40 compute-1 ceph-mon[75484]: pgmap v1048: 353 pgs: 353 active+clean; 328 MiB data, 353 MiB used, 40 GiB / 40 GiB avail; 2.3 MiB/s rd, 3.9 MiB/s wr, 166 op/s
Sep 30 18:13:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:41.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:41.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:41 compute-1 nova_compute[238822]: 2025-09-30 18:13:41.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:41 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:42 compute-1 nova_compute[238822]: 2025-09-30 18:13:42.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:42 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:42 compute-1 sshd-session[270467]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:13:42 compute-1 sshd-session[270467]: banner exchange: Connection from 113.249.93.94 port 32286: Connection timed out
Sep 30 18:13:43 compute-1 ceph-mon[75484]: pgmap v1049: 353 pgs: 353 active+clean; 328 MiB data, 353 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 28 KiB/s wr, 74 op/s
Sep 30 18:13:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:43.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:13:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:43.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:43 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c40089f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:44 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:45 compute-1 ceph-mon[75484]: pgmap v1050: 353 pgs: 353 active+clean; 328 MiB data, 353 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 29 KiB/s wr, 74 op/s
Sep 30 18:13:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:13:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:45.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:13:45 compute-1 podman[271874]: 2025-09-30 18:13:45.553356653 +0000 UTC m=+0.084787367 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:13:45 compute-1 podman[271873]: 2025-09-30 18:13:45.620680686 +0000 UTC m=+0.149738606 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:13:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:45.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:45 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:46 compute-1 sshd-session[271923]: Invalid user vas from 107.172.146.104 port 60260
Sep 30 18:13:46 compute-1 sshd-session[271923]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:13:46 compute-1 sshd-session[271923]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 18:13:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:46 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:46 compute-1 nova_compute[238822]: 2025-09-30 18:13:46.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:47 compute-1 ceph-mon[75484]: pgmap v1051: 353 pgs: 353 active+clean; 328 MiB data, 353 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:13:47 compute-1 nova_compute[238822]: 2025-09-30 18:13:47.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:47.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:47.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:47 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2897963439' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:13:48 compute-1 sshd-session[271923]: Failed password for invalid user vas from 107.172.146.104 port 60260 ssh2
Sep 30 18:13:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:13:48 compute-1 sshd-session[271923]: Received disconnect from 107.172.146.104 port 60260:11: Bye Bye [preauth]
Sep 30 18:13:48 compute-1 sshd-session[271923]: Disconnected from invalid user vas 107.172.146.104 port 60260 [preauth]
Sep 30 18:13:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:48 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:49 compute-1 ceph-mon[75484]: pgmap v1052: 353 pgs: 353 active+clean; 328 MiB data, 353 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:13:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:49.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:49 compute-1 openstack_network_exporter[251957]: ERROR   18:13:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:13:49 compute-1 openstack_network_exporter[251957]: ERROR   18:13:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:13:49 compute-1 openstack_network_exporter[251957]: ERROR   18:13:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:13:49 compute-1 openstack_network_exporter[251957]: ERROR   18:13:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:13:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:13:49 compute-1 openstack_network_exporter[251957]: ERROR   18:13:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:13:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:13:49 compute-1 ovn_controller[135204]: 2025-09-30T18:13:49Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f4:eb:b4 10.100.0.10
Sep 30 18:13:49 compute-1 ovn_controller[135204]: 2025-09-30T18:13:49Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f4:eb:b4 10.100.0.10
Sep 30 18:13:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:49.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:49 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:49 compute-1 sudo[271928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:13:49 compute-1 sudo[271928]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:13:49 compute-1 sudo[271928]: pam_unix(sudo:session): session closed for user root
Sep 30 18:13:50 compute-1 ceph-mon[75484]: pgmap v1053: 353 pgs: 353 active+clean; 333 MiB data, 353 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 537 KiB/s wr, 86 op/s
Sep 30 18:13:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:50 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:51.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:51.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:51 compute-1 nova_compute[238822]: 2025-09-30 18:13:51.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:51 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:52 compute-1 nova_compute[238822]: 2025-09-30 18:13:52.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:52 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:53 compute-1 ceph-mon[75484]: pgmap v1054: 353 pgs: 353 active+clean; 333 MiB data, 353 MiB used, 40 GiB / 40 GiB avail; 29 KiB/s rd, 524 KiB/s wr, 12 op/s
Sep 30 18:13:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:13:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:53.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:53 compute-1 podman[271957]: 2025-09-30 18:13:53.531962546 +0000 UTC m=+0.077863859 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 18:13:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:13:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:53.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:53 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:54.357 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:13:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:54.358 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:13:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:13:54.360 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:13:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:54 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:55 compute-1 ceph-mon[75484]: pgmap v1055: 353 pgs: 353 active+clean; 407 MiB data, 400 MiB used, 40 GiB / 40 GiB avail; 344 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Sep 30 18:13:55 compute-1 unix_chkpwd[271983]: password check failed for user (root)
Sep 30 18:13:55 compute-1 sshd-session[271977]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58  user=root
Sep 30 18:13:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:55.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:55.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:55 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:56 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:56 compute-1 sudo[271985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:13:56 compute-1 sudo[271985]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:13:56 compute-1 sudo[271985]: pam_unix(sudo:session): session closed for user root
Sep 30 18:13:56 compute-1 nova_compute[238822]: 2025-09-30 18:13:56.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:56 compute-1 unix_chkpwd[272013]: password check failed for user (root)
Sep 30 18:13:56 compute-1 sshd-session[271980]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:13:56 compute-1 sudo[272011]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:13:56 compute-1 sudo[272011]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:13:57 compute-1 sshd-session[271977]: Failed password for root from 84.51.43.58 port 55195 ssh2
Sep 30 18:13:57 compute-1 ceph-mon[75484]: pgmap v1056: 353 pgs: 353 active+clean; 407 MiB data, 400 MiB used, 40 GiB / 40 GiB avail; 343 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Sep 30 18:13:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3792961879' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:13:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2584726180' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:13:57 compute-1 nova_compute[238822]: 2025-09-30 18:13:57.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:13:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:13:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:57.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:13:57 compute-1 sudo[272011]: pam_unix(sudo:session): session closed for user root
Sep 30 18:13:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:57.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:57 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/131746376' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:13:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/131746376' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:13:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:13:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:13:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:13:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:13:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:13:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:13:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:13:58 compute-1 sshd-session[271977]: Received disconnect from 84.51.43.58 port 55195:11: Bye Bye [preauth]
Sep 30 18:13:58 compute-1 sshd-session[271977]: Disconnected from authenticating user root 84.51.43.58 port 55195 [preauth]
Sep 30 18:13:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:13:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:58 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:58 compute-1 sshd-session[271980]: Failed password for root from 192.210.160.141 port 52516 ssh2
Sep 30 18:13:59 compute-1 ceph-mon[75484]: pgmap v1057: 353 pgs: 353 active+clean; 407 MiB data, 400 MiB used, 40 GiB / 40 GiB avail; 343 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Sep 30 18:13:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:13:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:13:59.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:13:59 compute-1 podman[272073]: 2025-09-30 18:13:59.561697524 +0000 UTC m=+0.089648549 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 18:13:59 compute-1 podman[272071]: 2025-09-30 18:13:59.572400334 +0000 UTC m=+0.105709474 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 18:13:59 compute-1 podman[272072]: 2025-09-30 18:13:59.653786257 +0000 UTC m=+0.178948896 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41)
Sep 30 18:13:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:13:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:13:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:13:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:13:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:13:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:13:59.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:13:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:13:59 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:13:59 compute-1 sshd-session[271980]: Connection closed by authenticating user root 192.210.160.141 port 52516 [preauth]
Sep 30 18:14:00 compute-1 ceph-mon[75484]: pgmap v1058: 353 pgs: 353 active+clean; 407 MiB data, 400 MiB used, 40 GiB / 40 GiB avail; 344 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:14:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:00 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:01 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:01.017 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:14:01 compute-1 nova_compute[238822]: 2025-09-30 18:14:01.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:01 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:01.019 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:14:01 compute-1 sshd-session[272132]: Invalid user seekcy from 216.10.242.161 port 33114
Sep 30 18:14:01 compute-1 sshd-session[272132]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:14:01 compute-1 sshd-session[272132]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:14:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:01.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:01.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:01 compute-1 nova_compute[238822]: 2025-09-30 18:14:01.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:01 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:02 compute-1 nova_compute[238822]: 2025-09-30 18:14:02.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:02 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:03 compute-1 sudo[272141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:14:03 compute-1 sudo[272141]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:14:03 compute-1 sudo[272141]: pam_unix(sudo:session): session closed for user root
Sep 30 18:14:03 compute-1 ceph-mon[75484]: pgmap v1059: 353 pgs: 353 active+clean; 407 MiB data, 400 MiB used, 40 GiB / 40 GiB avail; 314 KiB/s rd, 3.4 MiB/s wr, 78 op/s
Sep 30 18:14:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:14:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:14:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:03.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:03 compute-1 sshd-session[272132]: Failed password for invalid user seekcy from 216.10.242.161 port 33114 ssh2
Sep 30 18:14:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:14:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:03.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:03 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:04 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:04.021 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:04 compute-1 sshd-session[272138]: Invalid user ftpclient from 14.225.167.110 port 47308
Sep 30 18:14:04 compute-1 sshd-session[272138]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:14:04 compute-1 sshd-session[272138]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:14:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:04 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:05 compute-1 sshd-session[272132]: Received disconnect from 216.10.242.161 port 33114:11: Bye Bye [preauth]
Sep 30 18:14:05 compute-1 sshd-session[272132]: Disconnected from invalid user seekcy 216.10.242.161 port 33114 [preauth]
Sep 30 18:14:05 compute-1 ceph-mon[75484]: pgmap v1060: 353 pgs: 353 active+clean; 407 MiB data, 400 MiB used, 40 GiB / 40 GiB avail; 1.8 MiB/s rd, 3.4 MiB/s wr, 140 op/s
Sep 30 18:14:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:05.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:05 compute-1 podman[249638]: time="2025-09-30T18:14:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:14:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:14:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39665 "" "Go-http-client/1.1"
Sep 30 18:14:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:14:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9282 "" "Go-http-client/1.1"
Sep 30 18:14:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:05.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:05 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:06 compute-1 sshd-session[272138]: Failed password for invalid user ftpclient from 14.225.167.110 port 47308 ssh2
Sep 30 18:14:06 compute-1 ceph-mon[75484]: pgmap v1061: 353 pgs: 353 active+clean; 407 MiB data, 400 MiB used, 40 GiB / 40 GiB avail; 1.5 MiB/s rd, 33 KiB/s wr, 62 op/s
Sep 30 18:14:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:06 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:06 compute-1 nova_compute[238822]: 2025-09-30 18:14:06.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:06 compute-1 nova_compute[238822]: 2025-09-30 18:14:06.851 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:06 compute-1 nova_compute[238822]: 2025-09-30 18:14:06.852 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:07 compute-1 sshd-session[272138]: Received disconnect from 14.225.167.110 port 47308:11: Bye Bye [preauth]
Sep 30 18:14:07 compute-1 sshd-session[272138]: Disconnected from invalid user ftpclient 14.225.167.110 port 47308 [preauth]
Sep 30 18:14:07 compute-1 nova_compute[238822]: 2025-09-30 18:14:07.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:07.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:07 compute-1 nova_compute[238822]: 2025-09-30 18:14:07.359 2 DEBUG nova.compute.manager [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:14:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:14:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:07.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:07 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:07 compute-1 nova_compute[238822]: 2025-09-30 18:14:07.945 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:07 compute-1 nova_compute[238822]: 2025-09-30 18:14:07.945 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:07 compute-1 nova_compute[238822]: 2025-09-30 18:14:07.957 2 DEBUG nova.virt.hardware [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:14:07 compute-1 nova_compute[238822]: 2025-09-30 18:14:07.957 2 INFO nova.compute.claims [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:14:08 compute-1 ceph-mon[75484]: pgmap v1062: 353 pgs: 353 active+clean; 407 MiB data, 400 MiB used, 40 GiB / 40 GiB avail; 1.5 MiB/s rd, 33 KiB/s wr, 62 op/s
Sep 30 18:14:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:14:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:08 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:09 compute-1 nova_compute[238822]: 2025-09-30 18:14:09.051 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:14:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:09.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:14:09 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3791510506' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:14:09 compute-1 nova_compute[238822]: 2025-09-30 18:14:09.513 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:14:09 compute-1 nova_compute[238822]: 2025-09-30 18:14:09.521 2 DEBUG nova.compute.provider_tree [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:14:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3791510506' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:14:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:09.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:09 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:10 compute-1 nova_compute[238822]: 2025-09-30 18:14:10.032 2 DEBUG nova.scheduler.client.report [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:14:10 compute-1 sudo[272195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:14:10 compute-1 sudo[272195]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:14:10 compute-1 sudo[272195]: pam_unix(sudo:session): session closed for user root
Sep 30 18:14:10 compute-1 nova_compute[238822]: 2025-09-30 18:14:10.544 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.599s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:10 compute-1 nova_compute[238822]: 2025-09-30 18:14:10.545 2 DEBUG nova.compute.manager [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:14:10 compute-1 ceph-mon[75484]: pgmap v1063: 353 pgs: 353 active+clean; 407 MiB data, 400 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 34 KiB/s wr, 76 op/s
Sep 30 18:14:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:10 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:11 compute-1 nova_compute[238822]: 2025-09-30 18:14:11.058 2 DEBUG nova.compute.manager [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:14:11 compute-1 nova_compute[238822]: 2025-09-30 18:14:11.059 2 DEBUG nova.network.neutron [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:14:11 compute-1 nova_compute[238822]: 2025-09-30 18:14:11.060 2 WARNING neutronclient.v2_0.client [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:11 compute-1 nova_compute[238822]: 2025-09-30 18:14:11.061 2 WARNING neutronclient.v2_0.client [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:11.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:11 compute-1 nova_compute[238822]: 2025-09-30 18:14:11.572 2 INFO nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:14:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:11.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:11 compute-1 nova_compute[238822]: 2025-09-30 18:14:11.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:11 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:12 compute-1 nova_compute[238822]: 2025-09-30 18:14:12.083 2 DEBUG nova.compute.manager [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:14:12 compute-1 nova_compute[238822]: 2025-09-30 18:14:12.181 2 DEBUG nova.network.neutron [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Successfully created port: 2b9945fb-1c9f-4952-9d4f-176df1016c31 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:14:12 compute-1 nova_compute[238822]: 2025-09-30 18:14:12.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:12 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:12 compute-1 nova_compute[238822]: 2025-09-30 18:14:12.781 2 DEBUG nova.network.neutron [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Successfully updated port: 2b9945fb-1c9f-4952-9d4f-176df1016c31 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:14:12 compute-1 nova_compute[238822]: 2025-09-30 18:14:12.833 2 DEBUG nova.compute.manager [req-c864334f-9156-4b64-8dcd-b2f4c24b7156 req-4db05ccd-c8db-4ee7-ad99-22f2f3b2d443 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Received event network-changed-2b9945fb-1c9f-4952-9d4f-176df1016c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:14:12 compute-1 nova_compute[238822]: 2025-09-30 18:14:12.833 2 DEBUG nova.compute.manager [req-c864334f-9156-4b64-8dcd-b2f4c24b7156 req-4db05ccd-c8db-4ee7-ad99-22f2f3b2d443 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Refreshing instance network info cache due to event network-changed-2b9945fb-1c9f-4952-9d4f-176df1016c31. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:14:12 compute-1 nova_compute[238822]: 2025-09-30 18:14:12.834 2 DEBUG oslo_concurrency.lockutils [req-c864334f-9156-4b64-8dcd-b2f4c24b7156 req-4db05ccd-c8db-4ee7-ad99-22f2f3b2d443 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-a17c77e1-0404-4b3e-b04a-7d5a03566e47" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:14:12 compute-1 nova_compute[238822]: 2025-09-30 18:14:12.834 2 DEBUG oslo_concurrency.lockutils [req-c864334f-9156-4b64-8dcd-b2f4c24b7156 req-4db05ccd-c8db-4ee7-ad99-22f2f3b2d443 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-a17c77e1-0404-4b3e-b04a-7d5a03566e47" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:14:12 compute-1 nova_compute[238822]: 2025-09-30 18:14:12.835 2 DEBUG nova.network.neutron [req-c864334f-9156-4b64-8dcd-b2f4c24b7156 req-4db05ccd-c8db-4ee7-ad99-22f2f3b2d443 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Refreshing network info cache for port 2b9945fb-1c9f-4952-9d4f-176df1016c31 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.106 2 DEBUG nova.compute.manager [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.108 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.108 2 INFO nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Creating image(s)
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.143 2 DEBUG nova.storage.rbd_utils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.182 2 DEBUG nova.storage.rbd_utils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:14:13 compute-1 ceph-mon[75484]: pgmap v1064: 353 pgs: 353 active+clean; 407 MiB data, 400 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 20 KiB/s wr, 75 op/s
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.321 2 DEBUG nova.storage.rbd_utils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.327 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.345 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "refresh_cache-a17c77e1-0404-4b3e-b04a-7d5a03566e47" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.348 2 WARNING neutronclient.v2_0.client [req-c864334f-9156-4b64-8dcd-b2f4c24b7156 req-4db05ccd-c8db-4ee7-ad99-22f2f3b2d443 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:14:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:13.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.421 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.422 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.422 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.423 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.458 2 DEBUG nova.storage.rbd_utils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.464 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:14:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:14:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:13.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.795 2 DEBUG nova.network.neutron [req-c864334f-9156-4b64-8dcd-b2f4c24b7156 req-4db05ccd-c8db-4ee7-ad99-22f2f3b2d443 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:14:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:13 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:13 compute-1 nova_compute[238822]: 2025-09-30 18:14:13.954 2 DEBUG nova.network.neutron [req-c864334f-9156-4b64-8dcd-b2f4c24b7156 req-4db05ccd-c8db-4ee7-ad99-22f2f3b2d443 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:14:14 compute-1 nova_compute[238822]: 2025-09-30 18:14:14.300 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.836s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:14:14 compute-1 ceph-mon[75484]: pgmap v1065: 353 pgs: 353 active+clean; 418 MiB data, 411 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.3 MiB/s wr, 96 op/s
Sep 30 18:14:14 compute-1 nova_compute[238822]: 2025-09-30 18:14:14.413 2 DEBUG nova.storage.rbd_utils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] resizing rbd image a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:14:14 compute-1 nova_compute[238822]: 2025-09-30 18:14:14.486 2 DEBUG oslo_concurrency.lockutils [req-c864334f-9156-4b64-8dcd-b2f4c24b7156 req-4db05ccd-c8db-4ee7-ad99-22f2f3b2d443 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-a17c77e1-0404-4b3e-b04a-7d5a03566e47" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:14:14 compute-1 nova_compute[238822]: 2025-09-30 18:14:14.488 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquired lock "refresh_cache-a17c77e1-0404-4b3e-b04a-7d5a03566e47" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:14:14 compute-1 nova_compute[238822]: 2025-09-30 18:14:14.488 2 DEBUG nova.network.neutron [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:14:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:14 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:14 compute-1 nova_compute[238822]: 2025-09-30 18:14:14.714 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:14:14 compute-1 nova_compute[238822]: 2025-09-30 18:14:14.714 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Ensure instance console log exists: /var/lib/nova/instances/a17c77e1-0404-4b3e-b04a-7d5a03566e47/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:14:14 compute-1 nova_compute[238822]: 2025-09-30 18:14:14.715 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:14 compute-1 nova_compute[238822]: 2025-09-30 18:14:14.715 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:14 compute-1 nova_compute[238822]: 2025-09-30 18:14:14.715 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:15.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:15.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:15 compute-1 nova_compute[238822]: 2025-09-30 18:14:15.807 2 DEBUG nova.network.neutron [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:14:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:15 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:16 compute-1 podman[272395]: 2025-09-30 18:14:16.566099792 +0000 UTC m=+0.092643679 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:14:16 compute-1 podman[272394]: 2025-09-30 18:14:16.609289922 +0000 UTC m=+0.141596565 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4)
Sep 30 18:14:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:16 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:16 compute-1 nova_compute[238822]: 2025-09-30 18:14:16.786 2 WARNING neutronclient.v2_0.client [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:16 compute-1 nova_compute[238822]: 2025-09-30 18:14:16.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:16 compute-1 nova_compute[238822]: 2025-09-30 18:14:16.954 2 DEBUG nova.network.neutron [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Updating instance_info_cache with network_info: [{"id": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "address": "fa:16:3e:33:09:fa", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b9945fb-1c", "ovs_interfaceid": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:14:17 compute-1 ceph-mon[75484]: pgmap v1066: 353 pgs: 353 active+clean; 418 MiB data, 411 MiB used, 40 GiB / 40 GiB avail; 442 KiB/s rd, 1.3 MiB/s wr, 34 op/s
Sep 30 18:14:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:17.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.461 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Releasing lock "refresh_cache-a17c77e1-0404-4b3e-b04a-7d5a03566e47" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.462 2 DEBUG nova.compute.manager [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Instance network_info: |[{"id": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "address": "fa:16:3e:33:09:fa", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b9945fb-1c", "ovs_interfaceid": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.466 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Start _get_guest_xml network_info=[{"id": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "address": "fa:16:3e:33:09:fa", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b9945fb-1c", "ovs_interfaceid": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.471 2 WARNING nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.473 2 DEBUG nova.virt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-673327438', uuid='a17c77e1-0404-4b3e-b04a-7d5a03566e47'), owner=OwnerMeta(userid='dc3bb71c425f484fbc46f90978029403', username='tempest-TestExecuteActionsViaActuator-837729328-project-admin', projectid='ddd1f985d8b64b449c79d55b0cbd6422', projectname='tempest-TestExecuteActionsViaActuator-837729328'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "address": "fa:16:3e:33:09:fa", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b9945fb-1c", "ovs_interfaceid": "2b9945fb-1c9f-4952-9d4f-176df1016c31", 
"qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759256057.473743) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.479 2 DEBUG nova.virt.libvirt.host [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.481 2 DEBUG nova.virt.libvirt.host [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.486 2 DEBUG nova.virt.libvirt.host [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.487 2 DEBUG nova.virt.libvirt.host [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.487 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.488 2 DEBUG nova.virt.hardware [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.489 2 DEBUG nova.virt.hardware [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.489 2 DEBUG nova.virt.hardware [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.489 2 DEBUG nova.virt.hardware [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.489 2 DEBUG nova.virt.hardware [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.490 2 DEBUG nova.virt.hardware [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.490 2 DEBUG nova.virt.hardware [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.490 2 DEBUG nova.virt.hardware [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.491 2 DEBUG nova.virt.hardware [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.491 2 DEBUG nova.virt.hardware [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.491 2 DEBUG nova.virt.hardware [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:14:17 compute-1 nova_compute[238822]: 2025-09-30 18:14:17.495 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:14:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:17.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:17 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:17 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:14:17 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3792421474' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:14:18 compute-1 nova_compute[238822]: 2025-09-30 18:14:18.006 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:14:18 compute-1 nova_compute[238822]: 2025-09-30 18:14:18.031 2 DEBUG nova.storage.rbd_utils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:14:18 compute-1 nova_compute[238822]: 2025-09-30 18:14:18.037 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:14:18 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3792421474' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:14:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:14:18 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1209878573' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:14:18 compute-1 nova_compute[238822]: 2025-09-30 18:14:18.534 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:14:18 compute-1 nova_compute[238822]: 2025-09-30 18:14:18.537 2 DEBUG nova.virt.libvirt.vif [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:14:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-673327438',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-673327438',id=9,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-pjsyk5hs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteActionsViaActu
ator-837729328-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:14:12Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=a17c77e1-0404-4b3e-b04a-7d5a03566e47,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "address": "fa:16:3e:33:09:fa", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b9945fb-1c", "ovs_interfaceid": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:14:18 compute-1 nova_compute[238822]: 2025-09-30 18:14:18.537 2 DEBUG nova.network.os_vif_util [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converting VIF {"id": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "address": "fa:16:3e:33:09:fa", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b9945fb-1c", "ovs_interfaceid": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:14:18 compute-1 nova_compute[238822]: 2025-09-30 18:14:18.539 2 DEBUG nova.network.os_vif_util [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:09:fa,bridge_name='br-int',has_traffic_filtering=True,id=2b9945fb-1c9f-4952-9d4f-176df1016c31,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b9945fb-1c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:14:18 compute-1 nova_compute[238822]: 2025-09-30 18:14:18.541 2 DEBUG nova.objects.instance [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lazy-loading 'pci_devices' on Instance uuid a17c77e1-0404-4b3e-b04a-7d5a03566e47 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:14:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:14:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:18 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.052 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:14:19 compute-1 nova_compute[238822]:   <uuid>a17c77e1-0404-4b3e-b04a-7d5a03566e47</uuid>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   <name>instance-00000009</name>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-673327438</nova:name>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:14:17</nova:creationTime>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:14:19 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:14:19 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:user uuid="dc3bb71c425f484fbc46f90978029403">tempest-TestExecuteActionsViaActuator-837729328-project-admin</nova:user>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:project uuid="ddd1f985d8b64b449c79d55b0cbd6422">tempest-TestExecuteActionsViaActuator-837729328</nova:project>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <nova:port uuid="2b9945fb-1c9f-4952-9d4f-176df1016c31">
Sep 30 18:14:19 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <system>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <entry name="serial">a17c77e1-0404-4b3e-b04a-7d5a03566e47</entry>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <entry name="uuid">a17c77e1-0404-4b3e-b04a-7d5a03566e47</entry>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     </system>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   <os>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   </os>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   <features>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   </features>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk">
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       </source>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk.config">
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       </source>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:14:19 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:33:09:fa"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <target dev="tap2b9945fb-1c"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/a17c77e1-0404-4b3e-b04a-7d5a03566e47/console.log" append="off"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <video>
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     </video>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:14:19 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:14:19 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:14:19 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:14:19 compute-1 nova_compute[238822]: </domain>
Sep 30 18:14:19 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.053 2 DEBUG nova.compute.manager [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Preparing to wait for external event network-vif-plugged-2b9945fb-1c9f-4952-9d4f-176df1016c31 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.054 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.054 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.055 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.056 2 DEBUG nova.virt.libvirt.vif [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:14:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-673327438',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-673327438',id=9,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-pjsyk5hs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteActi
onsViaActuator-837729328-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:14:12Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=a17c77e1-0404-4b3e-b04a-7d5a03566e47,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "address": "fa:16:3e:33:09:fa", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b9945fb-1c", "ovs_interfaceid": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.057 2 DEBUG nova.network.os_vif_util [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converting VIF {"id": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "address": "fa:16:3e:33:09:fa", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b9945fb-1c", "ovs_interfaceid": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.058 2 DEBUG nova.network.os_vif_util [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:09:fa,bridge_name='br-int',has_traffic_filtering=True,id=2b9945fb-1c9f-4952-9d4f-176df1016c31,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b9945fb-1c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.058 2 DEBUG os_vif [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:09:fa,bridge_name='br-int',has_traffic_filtering=True,id=2b9945fb-1c9f-4952-9d4f-176df1016c31,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b9945fb-1c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.060 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.061 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.063 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '90d9751d-1367-5909-8a7a-718bdd7d0011', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.071 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2b9945fb-1c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.072 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2b9945fb-1c, col_values=(('qos', UUID('04a1b748-1af6-4086-b282-a5287206f5be')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.073 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2b9945fb-1c, col_values=(('external_ids', {'iface-id': '2b9945fb-1c9f-4952-9d4f-176df1016c31', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:33:09:fa', 'vm-uuid': 'a17c77e1-0404-4b3e-b04a-7d5a03566e47'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:19 compute-1 NetworkManager[45549]: <info>  [1759256059.0761] manager: (tap2b9945fb-1c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:19 compute-1 nova_compute[238822]: 2025-09-30 18:14:19.088 2 INFO os_vif [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:09:fa,bridge_name='br-int',has_traffic_filtering=True,id=2b9945fb-1c9f-4952-9d4f-176df1016c31,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b9945fb-1c')
Sep 30 18:14:19 compute-1 ceph-mon[75484]: pgmap v1067: 353 pgs: 353 active+clean; 418 MiB data, 411 MiB used, 40 GiB / 40 GiB avail; 442 KiB/s rd, 1.3 MiB/s wr, 34 op/s
Sep 30 18:14:19 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1209878573' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:14:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:19.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:19 compute-1 openstack_network_exporter[251957]: ERROR   18:14:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:14:19 compute-1 openstack_network_exporter[251957]: ERROR   18:14:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:14:19 compute-1 openstack_network_exporter[251957]: ERROR   18:14:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:14:19 compute-1 openstack_network_exporter[251957]: ERROR   18:14:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:14:19 compute-1 openstack_network_exporter[251957]: ERROR   18:14:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:14:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:19.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:19 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:19 compute-1 sshd-session[272444]: Invalid user elk from 113.249.93.94 port 61158
Sep 30 18:14:19 compute-1 sshd-session[272444]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:14:19 compute-1 sshd-session[272444]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=113.249.93.94
Sep 30 18:14:20 compute-1 ceph-mon[75484]: pgmap v1068: 353 pgs: 353 active+clean; 486 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 665 KiB/s rd, 3.9 MiB/s wr, 101 op/s
Sep 30 18:14:20 compute-1 nova_compute[238822]: 2025-09-30 18:14:20.653 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:14:20 compute-1 nova_compute[238822]: 2025-09-30 18:14:20.654 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:14:20 compute-1 nova_compute[238822]: 2025-09-30 18:14:20.654 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] No VIF found with MAC fa:16:3e:33:09:fa, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:14:20 compute-1 nova_compute[238822]: 2025-09-30 18:14:20.655 2 INFO nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Using config drive
Sep 30 18:14:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:20 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:20 compute-1 nova_compute[238822]: 2025-09-30 18:14:20.695 2 DEBUG nova.storage.rbd_utils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:14:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:21 compute-1 nova_compute[238822]: 2025-09-30 18:14:21.213 2 WARNING neutronclient.v2_0.client [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:21.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:21 compute-1 nova_compute[238822]: 2025-09-30 18:14:21.606 2 INFO nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Creating config drive at /var/lib/nova/instances/a17c77e1-0404-4b3e-b04a-7d5a03566e47/disk.config
Sep 30 18:14:21 compute-1 nova_compute[238822]: 2025-09-30 18:14:21.618 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a17c77e1-0404-4b3e-b04a-7d5a03566e47/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp1efv3wuf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:14:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:21 compute-1 nova_compute[238822]: 2025-09-30 18:14:21.765 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a17c77e1-0404-4b3e-b04a-7d5a03566e47/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp1efv3wuf" returned: 0 in 0.147s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:14:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:21.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:21 compute-1 nova_compute[238822]: 2025-09-30 18:14:21.811 2 DEBUG nova.storage.rbd_utils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] rbd image a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:14:21 compute-1 nova_compute[238822]: 2025-09-30 18:14:21.817 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a17c77e1-0404-4b3e-b04a-7d5a03566e47/disk.config a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:14:21 compute-1 nova_compute[238822]: 2025-09-30 18:14:21.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:21 compute-1 sshd-session[272444]: Failed password for invalid user elk from 113.249.93.94 port 61158 ssh2
Sep 30 18:14:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:21 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:22 compute-1 nova_compute[238822]: 2025-09-30 18:14:22.000 2 DEBUG oslo_concurrency.processutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a17c77e1-0404-4b3e-b04a-7d5a03566e47/disk.config a17c77e1-0404-4b3e-b04a-7d5a03566e47_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.183s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:14:22 compute-1 nova_compute[238822]: 2025-09-30 18:14:22.001 2 INFO nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Deleting local config drive /var/lib/nova/instances/a17c77e1-0404-4b3e-b04a-7d5a03566e47/disk.config because it was imported into RBD.
Sep 30 18:14:22 compute-1 nova_compute[238822]: 2025-09-30 18:14:22.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:14:22 compute-1 kernel: tap2b9945fb-1c: entered promiscuous mode
Sep 30 18:14:22 compute-1 NetworkManager[45549]: <info>  [1759256062.0768] manager: (tap2b9945fb-1c): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Sep 30 18:14:22 compute-1 ovn_controller[135204]: 2025-09-30T18:14:22Z|00058|binding|INFO|Claiming lport 2b9945fb-1c9f-4952-9d4f-176df1016c31 for this chassis.
Sep 30 18:14:22 compute-1 ovn_controller[135204]: 2025-09-30T18:14:22Z|00059|binding|INFO|2b9945fb-1c9f-4952-9d4f-176df1016c31: Claiming fa:16:3e:33:09:fa 10.100.0.4
Sep 30 18:14:22 compute-1 nova_compute[238822]: 2025-09-30 18:14:22.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.093 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:09:fa 10.100.0.4'], port_security=['fa:16:3e:33:09:fa 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a17c77e1-0404-4b3e-b04a-7d5a03566e47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddd1f985d8b64b449c79d55b0cbd6422', 'neutron:revision_number': '4', 'neutron:security_group_ids': '34f3cf7b-94cf-408f-b3dc-ae0b57c009fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12c18a77-b252-4a3e-a181-b42644879446, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=2b9945fb-1c9f-4952-9d4f-176df1016c31) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.094 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 2b9945fb-1c9f-4952-9d4f-176df1016c31 in datapath 5fff1904-159a-4b76-8c46-feabf17f29ab bound to our chassis
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.096 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5fff1904-159a-4b76-8c46-feabf17f29ab
Sep 30 18:14:22 compute-1 nova_compute[238822]: 2025-09-30 18:14:22.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:22 compute-1 ovn_controller[135204]: 2025-09-30T18:14:22Z|00060|binding|INFO|Setting lport 2b9945fb-1c9f-4952-9d4f-176df1016c31 ovn-installed in OVS
Sep 30 18:14:22 compute-1 ovn_controller[135204]: 2025-09-30T18:14:22Z|00061|binding|INFO|Setting lport 2b9945fb-1c9f-4952-9d4f-176df1016c31 up in Southbound
Sep 30 18:14:22 compute-1 nova_compute[238822]: 2025-09-30 18:14:22.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:22 compute-1 systemd-udevd[272591]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.128 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e7805028-0b2f-4bcf-b0ef-c66eb9208abc]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:22 compute-1 systemd-machined[195911]: New machine qemu-4-instance-00000009.
Sep 30 18:14:22 compute-1 NetworkManager[45549]: <info>  [1759256062.1510] device (tap2b9945fb-1c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:14:22 compute-1 NetworkManager[45549]: <info>  [1759256062.1516] device (tap2b9945fb-1c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:14:22 compute-1 systemd[1]: Started Virtual Machine qemu-4-instance-00000009.
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.176 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[a7039130-aff8-44c5-a2d2-3fa30e23f899]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.181 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[3c8c0926-ba11-49f2-a4d9-c2cc6e4a132b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.221 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[b87d4dac-5774-4644-aeba-1e3fea30efcd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.238 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ea1e2738-d63b-4910-aa85-ec9de2859aaf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5fff1904-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:07:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1377603, 'reachable_time': 37514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272605, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:22 compute-1 sshd-session[272514]: Invalid user mysql from 192.210.160.141 port 42776
Sep 30 18:14:22 compute-1 sshd-session[272514]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:14:22 compute-1 sshd-session[272514]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.256 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[54fe9715-c0b1-49b5-8b1b-27319b8a11fa]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377619, 'tstamp': 1377619}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272606, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377623, 'tstamp': 1377623}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272606, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.259 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5fff1904-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:22 compute-1 nova_compute[238822]: 2025-09-30 18:14:22.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:22 compute-1 nova_compute[238822]: 2025-09-30 18:14:22.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.264 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5fff1904-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.264 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.264 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5fff1904-10, col_values=(('external_ids', {'iface-id': '3a8ea0a0-c179-4516-9404-04b68a17e79e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.265 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:14:22 compute-1 sshd-session[272444]: Received disconnect from 113.249.93.94 port 61158:11: Bye Bye [preauth]
Sep 30 18:14:22 compute-1 sshd-session[272444]: Disconnected from invalid user elk 113.249.93.94 port 61158 [preauth]
Sep 30 18:14:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:22.266 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[555b6456-51fa-4d98-84da-1e1c363f8bdc]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-5fff1904-159a-4b76-8c46-feabf17f29ab\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 5fff1904-159a-4b76-8c46-feabf17f29ab\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:22 compute-1 unix_chkpwd[272607]: password check failed for user (root)
Sep 30 18:14:22 compute-1 sshd-session[272535]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:14:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:22 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:22 compute-1 nova_compute[238822]: 2025-09-30 18:14:22.984 2 DEBUG nova.compute.manager [req-beb59adc-c502-4116-97a8-371ccb6a0623 req-98742943-cb23-46c4-b8f9-b753d6f6eb41 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Received event network-vif-plugged-2b9945fb-1c9f-4952-9d4f-176df1016c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:14:22 compute-1 nova_compute[238822]: 2025-09-30 18:14:22.984 2 DEBUG oslo_concurrency.lockutils [req-beb59adc-c502-4116-97a8-371ccb6a0623 req-98742943-cb23-46c4-b8f9-b753d6f6eb41 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:22 compute-1 nova_compute[238822]: 2025-09-30 18:14:22.985 2 DEBUG oslo_concurrency.lockutils [req-beb59adc-c502-4116-97a8-371ccb6a0623 req-98742943-cb23-46c4-b8f9-b753d6f6eb41 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:22 compute-1 nova_compute[238822]: 2025-09-30 18:14:22.985 2 DEBUG oslo_concurrency.lockutils [req-beb59adc-c502-4116-97a8-371ccb6a0623 req-98742943-cb23-46c4-b8f9-b753d6f6eb41 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:22 compute-1 nova_compute[238822]: 2025-09-30 18:14:22.986 2 DEBUG nova.compute.manager [req-beb59adc-c502-4116-97a8-371ccb6a0623 req-98742943-cb23-46c4-b8f9-b753d6f6eb41 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Processing event network-vif-plugged-2b9945fb-1c9f-4952-9d4f-176df1016c31 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:14:23 compute-1 nova_compute[238822]: 2025-09-30 18:14:23.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:14:23 compute-1 nova_compute[238822]: 2025-09-30 18:14:23.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:14:23 compute-1 nova_compute[238822]: 2025-09-30 18:14:23.102 2 DEBUG nova.compute.manager [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:14:23 compute-1 nova_compute[238822]: 2025-09-30 18:14:23.110 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:14:23 compute-1 nova_compute[238822]: 2025-09-30 18:14:23.115 2 INFO nova.virt.libvirt.driver [-] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Instance spawned successfully.
Sep 30 18:14:23 compute-1 nova_compute[238822]: 2025-09-30 18:14:23.116 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:14:23 compute-1 ceph-mon[75484]: pgmap v1069: 353 pgs: 353 active+clean; 486 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 253 KiB/s rd, 3.9 MiB/s wr, 88 op/s
Sep 30 18:14:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:14:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:23.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:14:23 compute-1 nova_compute[238822]: 2025-09-30 18:14:23.631 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:14:23 compute-1 nova_compute[238822]: 2025-09-30 18:14:23.632 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:14:23 compute-1 nova_compute[238822]: 2025-09-30 18:14:23.633 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:14:23 compute-1 nova_compute[238822]: 2025-09-30 18:14:23.634 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:14:23 compute-1 nova_compute[238822]: 2025-09-30 18:14:23.634 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:14:23 compute-1 nova_compute[238822]: 2025-09-30 18:14:23.635 2 DEBUG nova.virt.libvirt.driver [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:14:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:23.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:23 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:24 compute-1 nova_compute[238822]: 2025-09-30 18:14:24.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:24 compute-1 nova_compute[238822]: 2025-09-30 18:14:24.144 2 INFO nova.compute.manager [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Took 11.04 seconds to spawn the instance on the hypervisor.
Sep 30 18:14:24 compute-1 nova_compute[238822]: 2025-09-30 18:14:24.144 2 DEBUG nova.compute.manager [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:14:24 compute-1 sshd-session[272514]: Failed password for invalid user mysql from 192.210.160.141 port 42776 ssh2
Sep 30 18:14:24 compute-1 nova_compute[238822]: 2025-09-30 18:14:24.563 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:14:24 compute-1 nova_compute[238822]: 2025-09-30 18:14:24.563 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:14:24 compute-1 podman[272652]: 2025-09-30 18:14:24.597115865 +0000 UTC m=+0.119840966 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 18:14:24 compute-1 sshd-session[272535]: Failed password for root from 175.126.165.170 port 33014 ssh2
Sep 30 18:14:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:24 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:24 compute-1 nova_compute[238822]: 2025-09-30 18:14:24.692 2 INFO nova.compute.manager [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Took 16.82 seconds to build instance.
Sep 30 18:14:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:24 compute-1 sshd-session[272514]: Connection closed by invalid user mysql 192.210.160.141 port 42776 [preauth]
Sep 30 18:14:25 compute-1 nova_compute[238822]: 2025-09-30 18:14:25.071 2 DEBUG nova.compute.manager [req-9a606fde-0baa-4a3a-ac55-4effff4c64d9 req-929212d2-e81f-48a6-97cd-4a1116597f9c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Received event network-vif-plugged-2b9945fb-1c9f-4952-9d4f-176df1016c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:14:25 compute-1 nova_compute[238822]: 2025-09-30 18:14:25.072 2 DEBUG oslo_concurrency.lockutils [req-9a606fde-0baa-4a3a-ac55-4effff4c64d9 req-929212d2-e81f-48a6-97cd-4a1116597f9c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:25 compute-1 nova_compute[238822]: 2025-09-30 18:14:25.073 2 DEBUG oslo_concurrency.lockutils [req-9a606fde-0baa-4a3a-ac55-4effff4c64d9 req-929212d2-e81f-48a6-97cd-4a1116597f9c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:25 compute-1 nova_compute[238822]: 2025-09-30 18:14:25.073 2 DEBUG oslo_concurrency.lockutils [req-9a606fde-0baa-4a3a-ac55-4effff4c64d9 req-929212d2-e81f-48a6-97cd-4a1116597f9c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:25 compute-1 nova_compute[238822]: 2025-09-30 18:14:25.074 2 DEBUG nova.compute.manager [req-9a606fde-0baa-4a3a-ac55-4effff4c64d9 req-929212d2-e81f-48a6-97cd-4a1116597f9c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] No waiting events found dispatching network-vif-plugged-2b9945fb-1c9f-4952-9d4f-176df1016c31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:14:25 compute-1 nova_compute[238822]: 2025-09-30 18:14:25.074 2 WARNING nova.compute.manager [req-9a606fde-0baa-4a3a-ac55-4effff4c64d9 req-929212d2-e81f-48a6-97cd-4a1116597f9c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Received unexpected event network-vif-plugged-2b9945fb-1c9f-4952-9d4f-176df1016c31 for instance with vm_state active and task_state None.
Sep 30 18:14:25 compute-1 ceph-mon[75484]: pgmap v1070: 353 pgs: 353 active+clean; 486 MiB data, 447 MiB used, 40 GiB / 40 GiB avail; 261 KiB/s rd, 3.9 MiB/s wr, 99 op/s
Sep 30 18:14:25 compute-1 nova_compute[238822]: 2025-09-30 18:14:25.200 2 DEBUG oslo_concurrency.lockutils [None req-4f617e17-0179-43ba-87d8-20f601e09e1c dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.348s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:25 compute-1 sshd-session[272535]: Received disconnect from 175.126.165.170 port 33014:11: Bye Bye [preauth]
Sep 30 18:14:25 compute-1 sshd-session[272535]: Disconnected from authenticating user root 175.126.165.170 port 33014 [preauth]
Sep 30 18:14:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:14:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:25.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:14:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:25.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:25 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:26 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1198616533' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:14:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:26 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:26 compute-1 nova_compute[238822]: 2025-09-30 18:14:26.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:27 compute-1 sshd-session[272673]: Invalid user agent from 194.107.115.65 port 48152
Sep 30 18:14:27 compute-1 sshd-session[272673]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:14:27 compute-1 sshd-session[272673]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 18:14:27 compute-1 ceph-mon[75484]: pgmap v1071: 353 pgs: 353 active+clean; 486 MiB data, 447 MiB used, 40 GiB / 40 GiB avail; 230 KiB/s rd, 2.7 MiB/s wr, 77 op/s
Sep 30 18:14:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:27.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:27.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:27 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:28 compute-1 nova_compute[238822]: 2025-09-30 18:14:28.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:14:28 compute-1 nova_compute[238822]: 2025-09-30 18:14:28.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:28 compute-1 nova_compute[238822]: 2025-09-30 18:14:28.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:28 compute-1 nova_compute[238822]: 2025-09-30 18:14:28.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:28 compute-1 nova_compute[238822]: 2025-09-30 18:14:28.575 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:14:28 compute-1 nova_compute[238822]: 2025-09-30 18:14:28.575 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:14:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:14:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:28 compute-1 sshd-session[272673]: Failed password for invalid user agent from 194.107.115.65 port 48152 ssh2
Sep 30 18:14:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:14:29 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/919828019' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.074 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:29 compute-1 ceph-mon[75484]: pgmap v1072: 353 pgs: 353 active+clean; 486 MiB data, 447 MiB used, 40 GiB / 40 GiB avail; 230 KiB/s rd, 2.7 MiB/s wr, 77 op/s
Sep 30 18:14:29 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/817885264' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:14:29 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/919828019' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:14:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:29.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.677 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.678 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:14:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.684 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.685 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.690 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000005 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.690 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000005 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:14:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:29.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:29 compute-1 sshd-session[272673]: Received disconnect from 194.107.115.65 port 48152:11: Bye Bye [preauth]
Sep 30 18:14:29 compute-1 sshd-session[272673]: Disconnected from invalid user agent 194.107.115.65 port 48152 [preauth]
Sep 30 18:14:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:29 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.924 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.925 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.947 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.948 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4173MB free_disk=39.74360275268555GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.949 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:29 compute-1 nova_compute[238822]: 2025-09-30 18:14:29.949 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:30 compute-1 sshd[170789]: drop connection #1 from [110.42.70.108]:43932 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 18:14:30 compute-1 sudo[272705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:14:30 compute-1 sudo[272705]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:14:30 compute-1 sudo[272705]: pam_unix(sudo:session): session closed for user root
Sep 30 18:14:30 compute-1 podman[272729]: 2025-09-30 18:14:30.308952954 +0000 UTC m=+0.094028247 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:14:30 compute-1 podman[272736]: 2025-09-30 18:14:30.315173842 +0000 UTC m=+0.082847854 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:14:30 compute-1 podman[272730]: 2025-09-30 18:14:30.337926548 +0000 UTC m=+0.102824265 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:14:30 compute-1 sshd-session[272701]: Invalid user wifi from 167.172.43.167 port 37840
Sep 30 18:14:30 compute-1 sshd-session[272701]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:14:30 compute-1 sshd-session[272701]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167
Sep 30 18:14:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:30 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:31 compute-1 nova_compute[238822]: 2025-09-30 18:14:31.009 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance dadc55d4-1578-4dc1-880a-08098fba63ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:14:31 compute-1 nova_compute[238822]: 2025-09-30 18:14:31.010 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance aa43d689-5cfc-489b-9635-36978f36b08c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:14:31 compute-1 nova_compute[238822]: 2025-09-30 18:14:31.010 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance a17c77e1-0404-4b3e-b04a-7d5a03566e47 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:14:31 compute-1 nova_compute[238822]: 2025-09-30 18:14:31.011 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:14:31 compute-1 nova_compute[238822]: 2025-09-30 18:14:31.011 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=39GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:14:29 up  3:51,  0 user,  load average: 0.67, 1.01, 1.20\n', 'num_instances': '3', 'num_vm_active': '3', 'num_task_None': '3', 'num_os_type_None': '3', 'num_proj_ddd1f985d8b64b449c79d55b0cbd6422': '3', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:14:31 compute-1 nova_compute[238822]: 2025-09-30 18:14:31.077 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:14:31 compute-1 ceph-mon[75484]: pgmap v1073: 353 pgs: 353 active+clean; 486 MiB data, 447 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 2.7 MiB/s wr, 141 op/s
Sep 30 18:14:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:31.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:31 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:14:31 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2989069023' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:14:31 compute-1 nova_compute[238822]: 2025-09-30 18:14:31.588 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:14:31 compute-1 nova_compute[238822]: 2025-09-30 18:14:31.596 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:14:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:31.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:31 compute-1 nova_compute[238822]: 2025-09-30 18:14:31.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:31 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:32 compute-1 sshd-session[272701]: Failed password for invalid user wifi from 167.172.43.167 port 37840 ssh2
Sep 30 18:14:32 compute-1 nova_compute[238822]: 2025-09-30 18:14:32.104 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:14:32 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2989069023' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:14:32 compute-1 nova_compute[238822]: 2025-09-30 18:14:32.616 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:14:32 compute-1 nova_compute[238822]: 2025-09-30 18:14:32.617 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.668s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:32 compute-1 nova_compute[238822]: 2025-09-30 18:14:32.617 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:14:32 compute-1 nova_compute[238822]: 2025-09-30 18:14:32.617 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 18:14:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:32 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:32 compute-1 sshd-session[272701]: Received disconnect from 167.172.43.167 port 37840:11: Bye Bye [preauth]
Sep 30 18:14:32 compute-1 sshd-session[272701]: Disconnected from invalid user wifi 167.172.43.167 port 37840 [preauth]
Sep 30 18:14:33 compute-1 nova_compute[238822]: 2025-09-30 18:14:33.126 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 18:14:33 compute-1 nova_compute[238822]: 2025-09-30 18:14:33.127 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:14:33 compute-1 nova_compute[238822]: 2025-09-30 18:14:33.127 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 18:14:33 compute-1 ceph-mon[75484]: pgmap v1074: 353 pgs: 353 active+clean; 486 MiB data, 447 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 31 KiB/s wr, 75 op/s
Sep 30 18:14:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:33.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:14:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:33.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:33 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:34 compute-1 nova_compute[238822]: 2025-09-30 18:14:34.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:34 compute-1 ceph-mon[75484]: pgmap v1075: 353 pgs: 353 active+clean; 486 MiB data, 447 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 32 KiB/s wr, 75 op/s
Sep 30 18:14:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:34 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:35.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:35 compute-1 podman[249638]: time="2025-09-30T18:14:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:14:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:14:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39665 "" "Go-http-client/1.1"
Sep 30 18:14:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:14:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9279 "" "Go-http-client/1.1"
Sep 30 18:14:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:35.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:35 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:36 compute-1 nova_compute[238822]: 2025-09-30 18:14:36.097 2 DEBUG nova.compute.manager [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6169
Sep 30 18:14:36 compute-1 nova_compute[238822]: 2025-09-30 18:14:36.630 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:14:36 compute-1 nova_compute[238822]: 2025-09-30 18:14:36.630 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:14:36 compute-1 nova_compute[238822]: 2025-09-30 18:14:36.630 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:14:36 compute-1 nova_compute[238822]: 2025-09-30 18:14:36.630 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:14:36 compute-1 nova_compute[238822]: 2025-09-30 18:14:36.634 2 DEBUG oslo_concurrency.lockutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:36 compute-1 nova_compute[238822]: 2025-09-30 18:14:36.634 2 DEBUG oslo_concurrency.lockutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:36 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:36 compute-1 nova_compute[238822]: 2025-09-30 18:14:36.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:36 compute-1 ovn_controller[135204]: 2025-09-30T18:14:36Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:33:09:fa 10.100.0.4
Sep 30 18:14:36 compute-1 ovn_controller[135204]: 2025-09-30T18:14:36Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:33:09:fa 10.100.0.4
Sep 30 18:14:37 compute-1 nova_compute[238822]: 2025-09-30 18:14:37.146 2 DEBUG nova.objects.instance [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lazy-loading 'pci_requests' on Instance uuid 28ad2702-2baf-4865-be24-c468842cee03 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:14:37 compute-1 ceph-mon[75484]: pgmap v1076: 353 pgs: 353 active+clean; 486 MiB data, 447 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 4.7 KiB/s wr, 64 op/s
Sep 30 18:14:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1819809264' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:14:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1819809264' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:14:37 compute-1 nova_compute[238822]: 2025-09-30 18:14:37.344 2 DEBUG nova.virt.libvirt.driver [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Creating tmpfile /var/lib/nova/instances/tmphwjwr5f6 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:14:37 compute-1 nova_compute[238822]: 2025-09-30 18:14:37.346 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:37.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:37 compute-1 nova_compute[238822]: 2025-09-30 18:14:37.658 2 DEBUG nova.virt.hardware [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:14:37 compute-1 nova_compute[238822]: 2025-09-30 18:14:37.659 2 INFO nova.compute.claims [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:14:37 compute-1 nova_compute[238822]: 2025-09-30 18:14:37.659 2 DEBUG nova.objects.instance [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lazy-loading 'resources' on Instance uuid 28ad2702-2baf-4865-be24-c468842cee03 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:14:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:37 compute-1 nova_compute[238822]: 2025-09-30 18:14:37.702 2 DEBUG nova.compute.manager [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphwjwr5f6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:14:37 compute-1 nova_compute[238822]: 2025-09-30 18:14:37.720 2 DEBUG oslo_concurrency.lockutils [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:14:37 compute-1 nova_compute[238822]: 2025-09-30 18:14:37.720 2 DEBUG oslo_concurrency.lockutils [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:14:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:14:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:37.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:14:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:37 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:38 compute-1 nova_compute[238822]: 2025-09-30 18:14:38.170 2 DEBUG nova.objects.base [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Object Instance<28ad2702-2baf-4865-be24-c468842cee03> lazy-loaded attributes: pci_requests,resources wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 18:14:38 compute-1 nova_compute[238822]: 2025-09-30 18:14:38.171 2 DEBUG nova.objects.instance [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lazy-loading 'numa_topology' on Instance uuid 28ad2702-2baf-4865-be24-c468842cee03 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:14:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:14:38 compute-1 nova_compute[238822]: 2025-09-30 18:14:38.227 2 INFO nova.compute.rpcapi [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Automatically selected compute RPC version 6.4 from minimum service version 70
Sep 30 18:14:38 compute-1 nova_compute[238822]: 2025-09-30 18:14:38.228 2 DEBUG oslo_concurrency.lockutils [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:14:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:14:38 compute-1 nova_compute[238822]: 2025-09-30 18:14:38.678 2 DEBUG nova.objects.base [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Object Instance<28ad2702-2baf-4865-be24-c468842cee03> lazy-loaded attributes: pci_requests,resources,numa_topology wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 18:14:38 compute-1 nova_compute[238822]: 2025-09-30 18:14:38.679 2 DEBUG nova.objects.instance [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lazy-loading 'pci_devices' on Instance uuid 28ad2702-2baf-4865-be24-c468842cee03 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:14:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:38 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:39 compute-1 nova_compute[238822]: 2025-09-30 18:14:39.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:39 compute-1 nova_compute[238822]: 2025-09-30 18:14:39.185 2 DEBUG nova.objects.base [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Object Instance<28ad2702-2baf-4865-be24-c468842cee03> lazy-loaded attributes: pci_requests,resources,numa_topology,pci_devices wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 18:14:39 compute-1 ceph-mon[75484]: pgmap v1077: 353 pgs: 353 active+clean; 486 MiB data, 447 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 4.7 KiB/s wr, 64 op/s
Sep 30 18:14:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:39.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:39 compute-1 nova_compute[238822]: 2025-09-30 18:14:39.700 2 INFO nova.compute.resource_tracker [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Updating resource usage from migration 1ba9a402-a52e-4c0a-9289-488387639d69
Sep 30 18:14:39 compute-1 nova_compute[238822]: 2025-09-30 18:14:39.702 2 DEBUG nova.compute.resource_tracker [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Starting to track incoming migration 1ba9a402-a52e-4c0a-9289-488387639d69 with flavor c83dc7f1-0795-47db-adcb-fb90be11684a _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 18:14:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:39.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:39 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:40 compute-1 nova_compute[238822]: 2025-09-30 18:14:40.243 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:40 compute-1 nova_compute[238822]: 2025-09-30 18:14:40.320 2 DEBUG oslo_concurrency.processutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:14:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:40 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:14:40 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3615147311' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:14:40 compute-1 nova_compute[238822]: 2025-09-30 18:14:40.845 2 DEBUG oslo_concurrency.processutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.524s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:14:40 compute-1 nova_compute[238822]: 2025-09-30 18:14:40.855 2 DEBUG nova.compute.provider_tree [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:14:41 compute-1 ceph-mon[75484]: pgmap v1078: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 130 op/s
Sep 30 18:14:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3615147311' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:14:41 compute-1 nova_compute[238822]: 2025-09-30 18:14:41.369 2 DEBUG nova.scheduler.client.report [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:14:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:41.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:41.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:41 compute-1 nova_compute[238822]: 2025-09-30 18:14:41.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:41 compute-1 nova_compute[238822]: 2025-09-30 18:14:41.884 2 DEBUG oslo_concurrency.lockutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 5.250s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:41 compute-1 nova_compute[238822]: 2025-09-30 18:14:41.885 2 INFO nova.compute.manager [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Migrating
Sep 30 18:14:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:41 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094004550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:42 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:43 compute-1 ceph-mon[75484]: pgmap v1079: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 334 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Sep 30 18:14:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:43.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:14:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:43.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:43 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:44 compute-1 unix_chkpwd[272851]: password check failed for user (root)
Sep 30 18:14:44 compute-1 sshd-session[272848]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104  user=root
Sep 30 18:14:44 compute-1 nova_compute[238822]: 2025-09-30 18:14:44.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:44 compute-1 ceph-mon[75484]: pgmap v1080: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 334 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Sep 30 18:14:44 compute-1 nova_compute[238822]: 2025-09-30 18:14:44.271 2 DEBUG nova.compute.manager [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphwjwr5f6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='23ad643b-d29f-4fe8-a347-92df178ae0cd',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:14:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:44 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:45 compute-1 nova_compute[238822]: 2025-09-30 18:14:45.293 2 DEBUG oslo_concurrency.lockutils [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-23ad643b-d29f-4fe8-a347-92df178ae0cd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:14:45 compute-1 nova_compute[238822]: 2025-09-30 18:14:45.293 2 DEBUG oslo_concurrency.lockutils [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-23ad643b-d29f-4fe8-a347-92df178ae0cd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:14:45 compute-1 nova_compute[238822]: 2025-09-30 18:14:45.294 2 DEBUG nova.network.neutron [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:14:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:45.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:45 compute-1 sshd-session[272856]: Accepted publickey for nova from 192.168.122.100 port 35148 ssh2: ECDSA SHA256:O32sZJKX4Ovm79rbNdR7aUmA1e585fwvW0v+8EpVfIo
Sep 30 18:14:45 compute-1 nova_compute[238822]: 2025-09-30 18:14:45.800 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:45.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:45 compute-1 systemd-logind[789]: New session 58 of user nova.
Sep 30 18:14:45 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 18:14:45 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 18:14:45 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 18:14:45 compute-1 systemd[1]: Starting User Manager for UID 42436...
Sep 30 18:14:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:45 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:45 compute-1 systemd[272860]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 18:14:46 compute-1 systemd[272860]: Queued start job for default target Main User Target.
Sep 30 18:14:46 compute-1 systemd[272860]: Created slice User Application Slice.
Sep 30 18:14:46 compute-1 systemd[272860]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 18:14:46 compute-1 systemd[272860]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 18:14:46 compute-1 systemd[272860]: Reached target Paths.
Sep 30 18:14:46 compute-1 systemd[272860]: Reached target Timers.
Sep 30 18:14:46 compute-1 systemd[272860]: Starting D-Bus User Message Bus Socket...
Sep 30 18:14:46 compute-1 systemd[272860]: Starting Create User's Volatile Files and Directories...
Sep 30 18:14:46 compute-1 sshd-session[272848]: Failed password for root from 107.172.146.104 port 57600 ssh2
Sep 30 18:14:46 compute-1 systemd[272860]: Finished Create User's Volatile Files and Directories.
Sep 30 18:14:46 compute-1 systemd[272860]: Listening on D-Bus User Message Bus Socket.
Sep 30 18:14:46 compute-1 systemd[272860]: Reached target Sockets.
Sep 30 18:14:46 compute-1 systemd[272860]: Reached target Basic System.
Sep 30 18:14:46 compute-1 systemd[272860]: Reached target Main User Target.
Sep 30 18:14:46 compute-1 systemd[272860]: Startup finished in 213ms.
Sep 30 18:14:46 compute-1 systemd[1]: Started User Manager for UID 42436.
Sep 30 18:14:46 compute-1 systemd[1]: Started Session 58 of User nova.
Sep 30 18:14:46 compute-1 sshd-session[272856]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 18:14:46 compute-1 sshd-session[272876]: Received disconnect from 192.168.122.100 port 35148:11: disconnected by user
Sep 30 18:14:46 compute-1 sshd-session[272876]: Disconnected from user nova 192.168.122.100 port 35148
Sep 30 18:14:46 compute-1 sshd-session[272856]: pam_unix(sshd:session): session closed for user nova
Sep 30 18:14:46 compute-1 systemd[1]: session-58.scope: Deactivated successfully.
Sep 30 18:14:46 compute-1 systemd-logind[789]: Session 58 logged out. Waiting for processes to exit.
Sep 30 18:14:46 compute-1 systemd-logind[789]: Removed session 58.
Sep 30 18:14:46 compute-1 sshd-session[272878]: Accepted publickey for nova from 192.168.122.100 port 35150 ssh2: ECDSA SHA256:O32sZJKX4Ovm79rbNdR7aUmA1e585fwvW0v+8EpVfIo
Sep 30 18:14:46 compute-1 systemd-logind[789]: New session 60 of user nova.
Sep 30 18:14:46 compute-1 systemd[1]: Started Session 60 of User nova.
Sep 30 18:14:46 compute-1 sshd-session[272878]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 18:14:46 compute-1 sshd-session[272881]: Received disconnect from 192.168.122.100 port 35150:11: disconnected by user
Sep 30 18:14:46 compute-1 sshd-session[272881]: Disconnected from user nova 192.168.122.100 port 35150
Sep 30 18:14:46 compute-1 sshd-session[272878]: pam_unix(sshd:session): session closed for user nova
Sep 30 18:14:46 compute-1 systemd[1]: session-60.scope: Deactivated successfully.
Sep 30 18:14:46 compute-1 systemd-logind[789]: Session 60 logged out. Waiting for processes to exit.
Sep 30 18:14:46 compute-1 systemd-logind[789]: Removed session 60.
Sep 30 18:14:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:46 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:46 compute-1 sshd-session[272848]: Received disconnect from 107.172.146.104 port 57600:11: Bye Bye [preauth]
Sep 30 18:14:46 compute-1 sshd-session[272848]: Disconnected from authenticating user root 107.172.146.104 port 57600 [preauth]
Sep 30 18:14:46 compute-1 nova_compute[238822]: 2025-09-30 18:14:46.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.044 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:47 compute-1 unix_chkpwd[272884]: password check failed for user (root)
Sep 30 18:14:47 compute-1 sshd-session[272853]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:14:47 compute-1 ceph-mon[75484]: pgmap v1081: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 334 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.342 2 DEBUG nova.network.neutron [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Updating instance_info_cache with network_info: [{"id": "50f7398a-769c-4636-b498-5162fce10f7d", "address": "fa:16:3e:d1:86:73", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f7398a-76", "ovs_interfaceid": "50f7398a-769c-4636-b498-5162fce10f7d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:14:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:47.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:47 compute-1 podman[272886]: 2025-09-30 18:14:47.569716353 +0000 UTC m=+0.099830774 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:14:47 compute-1 podman[272885]: 2025-09-30 18:14:47.61613072 +0000 UTC m=+0.145351286 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 18:14:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:47.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.849 2 DEBUG oslo_concurrency.lockutils [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-23ad643b-d29f-4fe8-a347-92df178ae0cd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.867 2 DEBUG nova.virt.libvirt.driver [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphwjwr5f6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='23ad643b-d29f-4fe8-a347-92df178ae0cd',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.868 2 DEBUG nova.virt.libvirt.driver [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Creating instance directory: /var/lib/nova/instances/23ad643b-d29f-4fe8-a347-92df178ae0cd pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.869 2 DEBUG nova.virt.libvirt.driver [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Ensure instance console log exists: /var/lib/nova/instances/23ad643b-d29f-4fe8-a347-92df178ae0cd/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.870 2 DEBUG nova.virt.libvirt.driver [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.871 2 DEBUG nova.virt.libvirt.vif [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:12:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-19459247',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-19459247',id=6,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:13:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-i5u830kg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteActionsViaActuator-837729328-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:13:15Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=23ad643b-d29f-4fe8-a347-92df178ae0cd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50f7398a-769c-4636-b498-5162fce10f7d", "address": "fa:16:3e:d1:86:73", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap50f7398a-76", "ovs_interfaceid": "50f7398a-769c-4636-b498-5162fce10f7d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.872 2 DEBUG nova.network.os_vif_util [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "50f7398a-769c-4636-b498-5162fce10f7d", "address": "fa:16:3e:d1:86:73", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap50f7398a-76", "ovs_interfaceid": "50f7398a-769c-4636-b498-5162fce10f7d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.873 2 DEBUG nova.network.os_vif_util [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:86:73,bridge_name='br-int',has_traffic_filtering=True,id=50f7398a-769c-4636-b498-5162fce10f7d,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f7398a-76') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.874 2 DEBUG os_vif [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:86:73,bridge_name='br-int',has_traffic_filtering=True,id=50f7398a-769c-4636-b498-5162fce10f7d,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f7398a-76') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.876 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.876 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.880 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '288f77af-2421-5067-9a5e-6eb1b0752a1f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.891 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50f7398a-76, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.892 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap50f7398a-76, col_values=(('qos', UUID('bd7b8687-8b9b-4058-9a1c-e9acc8c03536')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.892 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap50f7398a-76, col_values=(('external_ids', {'iface-id': '50f7398a-769c-4636-b498-5162fce10f7d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:86:73', 'vm-uuid': '23ad643b-d29f-4fe8-a347-92df178ae0cd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:47 compute-1 NetworkManager[45549]: <info>  [1759256087.8949] manager: (tap50f7398a-76): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.903 2 INFO os_vif [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:86:73,bridge_name='br-int',has_traffic_filtering=True,id=50f7398a-769c-4636-b498-5162fce10f7d,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f7398a-76')
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.904 2 DEBUG nova.virt.libvirt.driver [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.904 2 DEBUG nova.compute.manager [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphwjwr5f6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='23ad643b-d29f-4fe8-a347-92df178ae0cd',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:14:47 compute-1 nova_compute[238822]: 2025-09-30 18:14:47.905 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:47 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:48 compute-1 nova_compute[238822]: 2025-09-30 18:14:48.015 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:14:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:48 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:48 compute-1 nova_compute[238822]: 2025-09-30 18:14:48.974 2 DEBUG nova.compute.manager [req-31a36d08-c04e-4f5d-9128-2ff261e5cf94 req-3b487a49-d9f9-4c35-9086-288a159d79ea 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received event network-vif-unplugged-b4130889-fd6e-44b4-8184-b79693b30d78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:14:48 compute-1 nova_compute[238822]: 2025-09-30 18:14:48.975 2 DEBUG oslo_concurrency.lockutils [req-31a36d08-c04e-4f5d-9128-2ff261e5cf94 req-3b487a49-d9f9-4c35-9086-288a159d79ea 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "28ad2702-2baf-4865-be24-c468842cee03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:48 compute-1 nova_compute[238822]: 2025-09-30 18:14:48.975 2 DEBUG oslo_concurrency.lockutils [req-31a36d08-c04e-4f5d-9128-2ff261e5cf94 req-3b487a49-d9f9-4c35-9086-288a159d79ea 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:48 compute-1 nova_compute[238822]: 2025-09-30 18:14:48.976 2 DEBUG oslo_concurrency.lockutils [req-31a36d08-c04e-4f5d-9128-2ff261e5cf94 req-3b487a49-d9f9-4c35-9086-288a159d79ea 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:48 compute-1 nova_compute[238822]: 2025-09-30 18:14:48.976 2 DEBUG nova.compute.manager [req-31a36d08-c04e-4f5d-9128-2ff261e5cf94 req-3b487a49-d9f9-4c35-9086-288a159d79ea 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] No waiting events found dispatching network-vif-unplugged-b4130889-fd6e-44b4-8184-b79693b30d78 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:14:48 compute-1 nova_compute[238822]: 2025-09-30 18:14:48.976 2 WARNING nova.compute.manager [req-31a36d08-c04e-4f5d-9128-2ff261e5cf94 req-3b487a49-d9f9-4c35-9086-288a159d79ea 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received unexpected event network-vif-unplugged-b4130889-fd6e-44b4-8184-b79693b30d78 for instance with vm_state active and task_state resize_migrating.
Sep 30 18:14:49 compute-1 nova_compute[238822]: 2025-09-30 18:14:49.005 2 DEBUG nova.network.neutron [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Port 50f7398a-769c-4636-b498-5162fce10f7d updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:14:49 compute-1 nova_compute[238822]: 2025-09-30 18:14:49.017 2 DEBUG nova.compute.manager [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphwjwr5f6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='23ad643b-d29f-4fe8-a347-92df178ae0cd',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:14:49 compute-1 ceph-mon[75484]: pgmap v1082: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 334 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Sep 30 18:14:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:49.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:49 compute-1 openstack_network_exporter[251957]: ERROR   18:14:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:14:49 compute-1 openstack_network_exporter[251957]: ERROR   18:14:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:14:49 compute-1 openstack_network_exporter[251957]: ERROR   18:14:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:14:49 compute-1 openstack_network_exporter[251957]: ERROR   18:14:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:14:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:14:49 compute-1 openstack_network_exporter[251957]: ERROR   18:14:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:14:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:14:49 compute-1 sshd-session[272853]: Failed password for root from 192.210.160.141 port 34664 ssh2
Sep 30 18:14:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:49.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:49 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:50 compute-1 sshd-session[272853]: Connection closed by authenticating user root 192.210.160.141 port 34664 [preauth]
Sep 30 18:14:50 compute-1 sudo[272939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:14:50 compute-1 sudo[272939]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:14:50 compute-1 sudo[272939]: pam_unix(sudo:session): session closed for user root
Sep 30 18:14:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:50 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:51 compute-1 nova_compute[238822]: 2025-09-30 18:14:51.041 2 DEBUG nova.compute.manager [req-510962e9-b680-4da1-a234-ed63023f8ee6 req-00ff36e3-3b94-41b8-947d-d1c33a44b1b8 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received event network-vif-unplugged-b4130889-fd6e-44b4-8184-b79693b30d78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:14:51 compute-1 nova_compute[238822]: 2025-09-30 18:14:51.041 2 DEBUG oslo_concurrency.lockutils [req-510962e9-b680-4da1-a234-ed63023f8ee6 req-00ff36e3-3b94-41b8-947d-d1c33a44b1b8 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "28ad2702-2baf-4865-be24-c468842cee03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:51 compute-1 nova_compute[238822]: 2025-09-30 18:14:51.041 2 DEBUG oslo_concurrency.lockutils [req-510962e9-b680-4da1-a234-ed63023f8ee6 req-00ff36e3-3b94-41b8-947d-d1c33a44b1b8 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:51 compute-1 nova_compute[238822]: 2025-09-30 18:14:51.042 2 DEBUG oslo_concurrency.lockutils [req-510962e9-b680-4da1-a234-ed63023f8ee6 req-00ff36e3-3b94-41b8-947d-d1c33a44b1b8 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:51 compute-1 nova_compute[238822]: 2025-09-30 18:14:51.042 2 DEBUG nova.compute.manager [req-510962e9-b680-4da1-a234-ed63023f8ee6 req-00ff36e3-3b94-41b8-947d-d1c33a44b1b8 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] No waiting events found dispatching network-vif-unplugged-b4130889-fd6e-44b4-8184-b79693b30d78 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:14:51 compute-1 nova_compute[238822]: 2025-09-30 18:14:51.042 2 WARNING nova.compute.manager [req-510962e9-b680-4da1-a234-ed63023f8ee6 req-00ff36e3-3b94-41b8-947d-d1c33a44b1b8 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received unexpected event network-vif-unplugged-b4130889-fd6e-44b4-8184-b79693b30d78 for instance with vm_state active and task_state resize_migrated.
Sep 30 18:14:51 compute-1 ceph-mon[75484]: pgmap v1083: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 336 KiB/s rd, 2.2 MiB/s wr, 69 op/s
Sep 30 18:14:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:51.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:51.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:51 compute-1 nova_compute[238822]: 2025-09-30 18:14:51.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:51 compute-1 systemd[1]: Starting libvirt proxy daemon...
Sep 30 18:14:51 compute-1 nova_compute[238822]: 2025-09-30 18:14:51.877 2 WARNING neutronclient.v2_0.client [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:51 compute-1 systemd[1]: Started libvirt proxy daemon.
Sep 30 18:14:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:51 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:52 compute-1 nova_compute[238822]: 2025-09-30 18:14:52.017 2 INFO nova.network.neutron [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Updating port b4130889-fd6e-44b4-8184-b79693b30d78 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Sep 30 18:14:52 compute-1 kernel: tap50f7398a-76: entered promiscuous mode
Sep 30 18:14:52 compute-1 NetworkManager[45549]: <info>  [1759256092.0532] manager: (tap50f7398a-76): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Sep 30 18:14:52 compute-1 ovn_controller[135204]: 2025-09-30T18:14:52Z|00062|binding|INFO|Claiming lport 50f7398a-769c-4636-b498-5162fce10f7d for this additional chassis.
Sep 30 18:14:52 compute-1 ovn_controller[135204]: 2025-09-30T18:14:52Z|00063|binding|INFO|50f7398a-769c-4636-b498-5162fce10f7d: Claiming fa:16:3e:d1:86:73 10.100.0.9
Sep 30 18:14:52 compute-1 nova_compute[238822]: 2025-09-30 18:14:52.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.063 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:86:73 10.100.0.9'], port_security=['fa:16:3e:d1:86:73 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '23ad643b-d29f-4fe8-a347-92df178ae0cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddd1f985d8b64b449c79d55b0cbd6422', 'neutron:revision_number': '10', 'neutron:security_group_ids': '34f3cf7b-94cf-408f-b3dc-ae0b57c009fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12c18a77-b252-4a3e-a181-b42644879446, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[], logical_port=50f7398a-769c-4636-b498-5162fce10f7d) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.065 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 50f7398a-769c-4636-b498-5162fce10f7d in datapath 5fff1904-159a-4b76-8c46-feabf17f29ab unbound from our chassis
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.067 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5fff1904-159a-4b76-8c46-feabf17f29ab
Sep 30 18:14:52 compute-1 ovn_controller[135204]: 2025-09-30T18:14:52Z|00064|binding|INFO|Setting lport 50f7398a-769c-4636-b498-5162fce10f7d ovn-installed in OVS
Sep 30 18:14:52 compute-1 nova_compute[238822]: 2025-09-30 18:14:52.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.091 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3beccf20-10b6-4fa4-a198-e9721b250b36]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:52 compute-1 systemd-machined[195911]: New machine qemu-5-instance-00000006.
Sep 30 18:14:52 compute-1 systemd-udevd[272998]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:14:52 compute-1 systemd[1]: Started Virtual Machine qemu-5-instance-00000006.
Sep 30 18:14:52 compute-1 NetworkManager[45549]: <info>  [1759256092.1255] device (tap50f7398a-76): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:14:52 compute-1 NetworkManager[45549]: <info>  [1759256092.1265] device (tap50f7398a-76): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.131 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[a4fa812a-9e87-4911-9c62-bc8629257061]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.136 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[14102bb4-7ffe-4c20-8e55-c4387bf3f1eb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.166 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[fd0d525b-f08e-47df-a8db-073b1dd843fe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.185 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[74682ec8-50a4-4fb9-ac2c-4afa170a5594]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5fff1904-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:07:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 10, 'rx_bytes': 1084, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 10, 'rx_bytes': 1084, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1377603, 'reachable_time': 37514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273008, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.204 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ae494a13-6bd1-49cc-8f8e-a7423bca6f17]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377619, 'tstamp': 1377619}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273011, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377623, 'tstamp': 1377623}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273011, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.207 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5fff1904-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:52 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 18:14:52 compute-1 nova_compute[238822]: 2025-09-30 18:14:52.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:52 compute-1 nova_compute[238822]: 2025-09-30 18:14:52.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.211 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5fff1904-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.211 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.212 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5fff1904-10, col_values=(('external_ids', {'iface-id': '3a8ea0a0-c179-4516-9404-04b68a17e79e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.212 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:14:52 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 18:14:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:52.215 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[771c3460-4fc1-46e8-8f7b-4a23e6c9cee4]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-5fff1904-159a-4b76-8c46-feabf17f29ab\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 5fff1904-159a-4b76-8c46-feabf17f29ab\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:52 compute-1 nova_compute[238822]: 2025-09-30 18:14:52.587 2 DEBUG oslo_concurrency.lockutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-28ad2702-2baf-4865-be24-c468842cee03" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:14:52 compute-1 nova_compute[238822]: 2025-09-30 18:14:52.587 2 DEBUG oslo_concurrency.lockutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-28ad2702-2baf-4865-be24-c468842cee03" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:14:52 compute-1 nova_compute[238822]: 2025-09-30 18:14:52.588 2 DEBUG nova.network.neutron [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:14:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:52 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:52 compute-1 nova_compute[238822]: 2025-09-30 18:14:52.838 2 DEBUG nova.compute.manager [req-04de9f4b-aed9-4064-8434-ddc84bf4162b req-25378f8f-2948-4324-9b75-830fd7b7bf2e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received event network-changed-b4130889-fd6e-44b4-8184-b79693b30d78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:14:52 compute-1 nova_compute[238822]: 2025-09-30 18:14:52.838 2 DEBUG nova.compute.manager [req-04de9f4b-aed9-4064-8434-ddc84bf4162b req-25378f8f-2948-4324-9b75-830fd7b7bf2e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Refreshing instance network info cache due to event network-changed-b4130889-fd6e-44b4-8184-b79693b30d78. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:14:52 compute-1 nova_compute[238822]: 2025-09-30 18:14:52.839 2 DEBUG oslo_concurrency.lockutils [req-04de9f4b-aed9-4064-8434-ddc84bf4162b req-25378f8f-2948-4324-9b75-830fd7b7bf2e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-28ad2702-2baf-4865-be24-c468842cee03" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:14:52 compute-1 nova_compute[238822]: 2025-09-30 18:14:52.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:53 compute-1 nova_compute[238822]: 2025-09-30 18:14:53.094 2 WARNING neutronclient.v2_0.client [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:53 compute-1 ceph-mon[75484]: pgmap v1084: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 2.2 KiB/s rd, 29 KiB/s wr, 3 op/s
Sep 30 18:14:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:14:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:53.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:14:53 compute-1 nova_compute[238822]: 2025-09-30 18:14:53.603 2 WARNING neutronclient.v2_0.client [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:53 compute-1 nova_compute[238822]: 2025-09-30 18:14:53.819 2 DEBUG nova.network.neutron [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Updating instance_info_cache with network_info: [{"id": "b4130889-fd6e-44b4-8184-b79693b30d78", "address": "fa:16:3e:f3:96:49", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4130889-fd", "ovs_interfaceid": "b4130889-fd6e-44b4-8184-b79693b30d78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:14:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:53.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:53 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:54 compute-1 nova_compute[238822]: 2025-09-30 18:14:54.326 2 DEBUG oslo_concurrency.lockutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-28ad2702-2baf-4865-be24-c468842cee03" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:14:54 compute-1 nova_compute[238822]: 2025-09-30 18:14:54.331 2 DEBUG oslo_concurrency.lockutils [req-04de9f4b-aed9-4064-8434-ddc84bf4162b req-25378f8f-2948-4324-9b75-830fd7b7bf2e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-28ad2702-2baf-4865-be24-c468842cee03" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:14:54 compute-1 nova_compute[238822]: 2025-09-30 18:14:54.331 2 DEBUG nova.network.neutron [req-04de9f4b-aed9-4064-8434-ddc84bf4162b req-25378f8f-2948-4324-9b75-830fd7b7bf2e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Refreshing network info cache for port b4130889-fd6e-44b4-8184-b79693b30d78 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:14:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:54.360 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:54.361 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:54.361 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:54 compute-1 ovn_controller[135204]: 2025-09-30T18:14:54Z|00065|binding|INFO|Claiming lport 50f7398a-769c-4636-b498-5162fce10f7d for this chassis.
Sep 30 18:14:54 compute-1 ovn_controller[135204]: 2025-09-30T18:14:54Z|00066|binding|INFO|50f7398a-769c-4636-b498-5162fce10f7d: Claiming fa:16:3e:d1:86:73 10.100.0.9
Sep 30 18:14:54 compute-1 ovn_controller[135204]: 2025-09-30T18:14:54Z|00067|binding|INFO|Setting lport 50f7398a-769c-4636-b498-5162fce10f7d up in Southbound
Sep 30 18:14:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:54 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:54 compute-1 nova_compute[238822]: 2025-09-30 18:14:54.843 2 WARNING neutronclient.v2_0.client [req-04de9f4b-aed9-4064-8434-ddc84bf4162b req-25378f8f-2948-4324-9b75-830fd7b7bf2e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:54 compute-1 nova_compute[238822]: 2025-09-30 18:14:54.886 2 DEBUG nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Sep 30 18:14:54 compute-1 nova_compute[238822]: 2025-09-30 18:14:54.888 2 DEBUG nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Sep 30 18:14:54 compute-1 nova_compute[238822]: 2025-09-30 18:14:54.888 2 INFO nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Creating image(s)
Sep 30 18:14:54 compute-1 nova_compute[238822]: 2025-09-30 18:14:54.931 2 DEBUG nova.storage.rbd_utils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] creating snapshot(nova-resize) on rbd image(28ad2702-2baf-4865-be24-c468842cee03_disk) create_snap /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:462
Sep 30 18:14:55 compute-1 ceph-mon[75484]: pgmap v1085: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 6.7 KiB/s rd, 29 KiB/s wr, 9 op/s
Sep 30 18:14:55 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e141 e141: 2 total, 2 up, 2 in
Sep 30 18:14:55 compute-1 nova_compute[238822]: 2025-09-30 18:14:55.326 2 DEBUG nova.objects.instance [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 28ad2702-2baf-4865-be24-c468842cee03 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:14:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:55.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:55 compute-1 nova_compute[238822]: 2025-09-30 18:14:55.517 2 INFO nova.compute.manager [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Post operation of migration started
Sep 30 18:14:55 compute-1 nova_compute[238822]: 2025-09-30 18:14:55.519 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:55 compute-1 podman[273096]: 2025-09-30 18:14:55.552117891 +0000 UTC m=+0.093650727 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 18:14:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:55.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:55 compute-1 nova_compute[238822]: 2025-09-30 18:14:55.865 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:55 compute-1 nova_compute[238822]: 2025-09-30 18:14:55.865 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:55 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:55 compute-1 nova_compute[238822]: 2025-09-30 18:14:55.988 2 WARNING neutronclient.v2_0.client [req-04de9f4b-aed9-4064-8434-ddc84bf4162b req-25378f8f-2948-4324-9b75-830fd7b7bf2e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.009 2 DEBUG oslo_concurrency.lockutils [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-23ad643b-d29f-4fe8-a347-92df178ae0cd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.010 2 DEBUG oslo_concurrency.lockutils [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-23ad643b-d29f-4fe8-a347-92df178ae0cd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.010 2 DEBUG nova.network.neutron [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:14:56 compute-1 ceph-mon[75484]: osdmap e141: 2 total, 2 up, 2 in
Sep 30 18:14:56 compute-1 ceph-mon[75484]: pgmap v1087: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 7.9 KiB/s rd, 35 KiB/s wr, 11 op/s
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.414 2 DEBUG nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.416 2 DEBUG nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Ensure instance console log exists: /var/lib/nova/instances/28ad2702-2baf-4865-be24-c468842cee03/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.418 2 DEBUG oslo_concurrency.lockutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.418 2 DEBUG oslo_concurrency.lockutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.419 2 DEBUG oslo_concurrency.lockutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.423 2 DEBUG nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Start _get_guest_xml network_info=[{"id": "b4130889-fd6e-44b4-8184-b79693b30d78", "address": "fa:16:3e:f3:96:49", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "vif_mac": "fa:16:3e:f3:96:49"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4130889-fd", "ovs_interfaceid": "b4130889-fd6e-44b4-8184-b79693b30d78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.431 2 WARNING nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.434 2 DEBUG nova.virt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1899978059', uuid='28ad2702-2baf-4865-be24-c468842cee03'), owner=OwnerMeta(userid='dc3bb71c425f484fbc46f90978029403', username='tempest-TestExecuteActionsViaActuator-837729328-project-admin', projectid='ddd1f985d8b64b449c79d55b0cbd6422', projectname='tempest-TestExecuteActionsViaActuator-837729328'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "b4130889-fd6e-44b4-8184-b79693b30d78", "address": "fa:16:3e:f3:96:49", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "vif_mac": "fa:16:3e:f3:96:49"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4130889-fd", "ovs_interfaceid": "b4130889-fd6e-44b4-8184-b79693b30d78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759256096.4341412) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.440 2 DEBUG nova.virt.libvirt.host [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.441 2 DEBUG nova.virt.libvirt.host [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.445 2 DEBUG nova.virt.libvirt.host [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.446 2 DEBUG nova.virt.libvirt.host [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.447 2 DEBUG nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.447 2 DEBUG nova.virt.hardware [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.448 2 DEBUG nova.virt.hardware [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.449 2 DEBUG nova.virt.hardware [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.449 2 DEBUG nova.virt.hardware [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.450 2 DEBUG nova.virt.hardware [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.450 2 DEBUG nova.virt.hardware [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.451 2 DEBUG nova.virt.hardware [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.451 2 DEBUG nova.virt.hardware [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.452 2 DEBUG nova.virt.hardware [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.452 2 DEBUG nova.virt.hardware [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.453 2 DEBUG nova.virt.hardware [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.453 2 DEBUG nova.objects.instance [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 28ad2702-2baf-4865-be24-c468842cee03 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.516 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:56 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 18:14:56 compute-1 systemd[272860]: Activating special unit Exit the Session...
Sep 30 18:14:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:56 compute-1 systemd[272860]: Stopped target Main User Target.
Sep 30 18:14:56 compute-1 systemd[272860]: Stopped target Basic System.
Sep 30 18:14:56 compute-1 systemd[272860]: Stopped target Paths.
Sep 30 18:14:56 compute-1 systemd[272860]: Stopped target Sockets.
Sep 30 18:14:56 compute-1 systemd[272860]: Stopped target Timers.
Sep 30 18:14:56 compute-1 systemd[272860]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 18:14:56 compute-1 systemd[272860]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 18:14:56 compute-1 systemd[272860]: Closed D-Bus User Message Bus Socket.
Sep 30 18:14:56 compute-1 systemd[272860]: Stopped Create User's Volatile Files and Directories.
Sep 30 18:14:56 compute-1 systemd[272860]: Removed slice User Application Slice.
Sep 30 18:14:56 compute-1 systemd[272860]: Reached target Shutdown.
Sep 30 18:14:56 compute-1 systemd[272860]: Finished Exit the Session.
Sep 30 18:14:56 compute-1 systemd[272860]: Reached target Exit the Session.
Sep 30 18:14:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:56 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:56 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 18:14:56 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 18:14:56 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 18:14:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:56 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 18:14:56 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 18:14:56 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 18:14:56 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.820 2 DEBUG nova.network.neutron [req-04de9f4b-aed9-4064-8434-ddc84bf4162b req-25378f8f-2948-4324-9b75-830fd7b7bf2e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Updated VIF entry in instance network info cache for port b4130889-fd6e-44b4-8184-b79693b30d78. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.821 2 DEBUG nova.network.neutron [req-04de9f4b-aed9-4064-8434-ddc84bf4162b req-25378f8f-2948-4324-9b75-830fd7b7bf2e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Updating instance_info_cache with network_info: [{"id": "b4130889-fd6e-44b4-8184-b79693b30d78", "address": "fa:16:3e:f3:96:49", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4130889-fd", "ovs_interfaceid": "b4130889-fd6e-44b4-8184-b79693b30d78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.963 2 DEBUG nova.objects.base [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Object Instance<28ad2702-2baf-4865-be24-c468842cee03> lazy-loaded attributes: trusted_certs,vcpu_model wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 18:14:56 compute-1 nova_compute[238822]: 2025-09-30 18:14:56.965 2 DEBUG oslo_concurrency.processutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:14:57 compute-1 nova_compute[238822]: 2025-09-30 18:14:57.233 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:14:57 compute-1 nova_compute[238822]: 2025-09-30 18:14:57.328 2 DEBUG oslo_concurrency.lockutils [req-04de9f4b-aed9-4064-8434-ddc84bf4162b req-25378f8f-2948-4324-9b75-830fd7b7bf2e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-28ad2702-2baf-4865-be24-c468842cee03" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:14:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:57.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:57 compute-1 nova_compute[238822]: 2025-09-30 18:14:57.432 2 DEBUG nova.network.neutron [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Updating instance_info_cache with network_info: [{"id": "50f7398a-769c-4636-b498-5162fce10f7d", "address": "fa:16:3e:d1:86:73", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f7398a-76", "ovs_interfaceid": "50f7398a-769c-4636-b498-5162fce10f7d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:14:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:14:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3833903488' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:14:57 compute-1 nova_compute[238822]: 2025-09-30 18:14:57.490 2 DEBUG oslo_concurrency.processutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:14:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3833903488' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:14:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Sep 30 18:14:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/809413651' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:14:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Sep 30 18:14:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/809413651' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:14:57 compute-1 nova_compute[238822]: 2025-09-30 18:14:57.555 2 DEBUG oslo_concurrency.processutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:14:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:14:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:57.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:14:57 compute-1 nova_compute[238822]: 2025-09-30 18:14:57.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:57 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:57 compute-1 nova_compute[238822]: 2025-09-30 18:14:57.940 2 DEBUG oslo_concurrency.lockutils [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-23ad643b-d29f-4fe8-a347-92df178ae0cd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:14:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:14:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3447825556' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.022 2 DEBUG oslo_concurrency.processutils [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.024 2 DEBUG nova.virt.libvirt.vif [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:13:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1899978059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1899978059',id=8,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:14:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-1d71qtf9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteActionsViaActuator-837729328-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:14:50Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=28ad2702-2baf-4865-be24-c468842cee03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b4130889-fd6e-44b4-8184-b79693b30d78", "address": "fa:16:3e:f3:96:49", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "vif_mac": "fa:16:3e:f3:96:49"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4130889-fd", "ovs_interfaceid": "b4130889-fd6e-44b4-8184-b79693b30d78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.024 2 DEBUG nova.network.os_vif_util [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "b4130889-fd6e-44b4-8184-b79693b30d78", "address": "fa:16:3e:f3:96:49", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "vif_mac": "fa:16:3e:f3:96:49"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4130889-fd", "ovs_interfaceid": "b4130889-fd6e-44b4-8184-b79693b30d78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.025 2 DEBUG nova.network.os_vif_util [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:96:49,bridge_name='br-int',has_traffic_filtering=True,id=b4130889-fd6e-44b4-8184-b79693b30d78,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4130889-fd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.029 2 DEBUG nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:14:58 compute-1 nova_compute[238822]:   <uuid>28ad2702-2baf-4865-be24-c468842cee03</uuid>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   <name>instance-00000008</name>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1899978059</nova:name>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:14:56</nova:creationTime>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:14:58 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:14:58 compute-1 nova_compute[238822]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Sep 30 18:14:58 compute-1 nova_compute[238822]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Sep 30 18:14:58 compute-1 nova_compute[238822]:           <nova:property name="hw_input_bus">usb</nova:property>
Sep 30 18:14:58 compute-1 nova_compute[238822]:           <nova:property name="hw_machine_type">q35</nova:property>
Sep 30 18:14:58 compute-1 nova_compute[238822]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Sep 30 18:14:58 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:14:58 compute-1 nova_compute[238822]:           <nova:property name="hw_video_model">virtio</nova:property>
Sep 30 18:14:58 compute-1 nova_compute[238822]:           <nova:property name="hw_vif_model">virtio</nova:property>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:user uuid="dc3bb71c425f484fbc46f90978029403">tempest-TestExecuteActionsViaActuator-837729328-project-admin</nova:user>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:project uuid="ddd1f985d8b64b449c79d55b0cbd6422">tempest-TestExecuteActionsViaActuator-837729328</nova:project>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <nova:port uuid="b4130889-fd6e-44b4-8184-b79693b30d78">
Sep 30 18:14:58 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <system>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <entry name="serial">28ad2702-2baf-4865-be24-c468842cee03</entry>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <entry name="uuid">28ad2702-2baf-4865-be24-c468842cee03</entry>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     </system>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   <os>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   </os>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   <features>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   </features>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/28ad2702-2baf-4865-be24-c468842cee03_disk">
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       </source>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/28ad2702-2baf-4865-be24-c468842cee03_disk.config">
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       </source>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:14:58 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:f3:96:49"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <target dev="tapb4130889-fd"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/28ad2702-2baf-4865-be24-c468842cee03/console.log" append="off"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <video>
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     </video>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:14:58 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:14:58 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:14:58 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:14:58 compute-1 nova_compute[238822]: </domain>
Sep 30 18:14:58 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.030 2 DEBUG nova.virt.libvirt.vif [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:13:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1899978059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1899978059',id=8,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:14:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-1d71qtf9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteActionsViaActuator-837729328-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:14:50Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=28ad2702-2baf-4865-be24-c468842cee03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b4130889-fd6e-44b4-8184-b79693b30d78", "address": "fa:16:3e:f3:96:49", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "vif_mac": "fa:16:3e:f3:96:49"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4130889-fd", "ovs_interfaceid": "b4130889-fd6e-44b4-8184-b79693b30d78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.030 2 DEBUG nova.network.os_vif_util [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "b4130889-fd6e-44b4-8184-b79693b30d78", "address": "fa:16:3e:f3:96:49", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "vif_mac": "fa:16:3e:f3:96:49"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4130889-fd", "ovs_interfaceid": "b4130889-fd6e-44b4-8184-b79693b30d78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.031 2 DEBUG nova.network.os_vif_util [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:96:49,bridge_name='br-int',has_traffic_filtering=True,id=b4130889-fd6e-44b4-8184-b79693b30d78,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4130889-fd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.031 2 DEBUG os_vif [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:96:49,bridge_name='br-int',has_traffic_filtering=True,id=b4130889-fd6e-44b4-8184-b79693b30d78,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4130889-fd') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.032 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.032 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.034 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e5ecaaf6-2a32-542c-8ce0-24a04b92abe1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.040 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb4130889-fd, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.040 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapb4130889-fd, col_values=(('qos', UUID('01fc1660-7c7f-4cb1-ba75-cb152b1a151f')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.040 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapb4130889-fd, col_values=(('external_ids', {'iface-id': 'b4130889-fd6e-44b4-8184-b79693b30d78', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:96:49', 'vm-uuid': '28ad2702-2baf-4865-be24-c468842cee03'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:58 compute-1 NetworkManager[45549]: <info>  [1759256098.0430] manager: (tapb4130889-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.050 2 INFO os_vif [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:96:49,bridge_name='br-int',has_traffic_filtering=True,id=b4130889-fd6e-44b4-8184-b79693b30d78,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4130889-fd')
Sep 30 18:14:58 compute-1 unix_chkpwd[273223]: password check failed for user (root)
Sep 30 18:14:58 compute-1 sshd-session[273153]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105  user=root
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.470 2 DEBUG oslo_concurrency.lockutils [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.471 2 DEBUG oslo_concurrency.lockutils [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.472 2 DEBUG oslo_concurrency.lockutils [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:14:58 compute-1 nova_compute[238822]: 2025-09-30 18:14:58.479 2 INFO nova.virt.libvirt.driver [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:14:58 compute-1 virtqemud[239124]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Sep 30 18:14:58 compute-1 virtqemud[239124]: hostname: compute-1
Sep 30 18:14:58 compute-1 virtqemud[239124]: Domain id=5 name='instance-00000006' uuid=23ad643b-d29f-4fe8-a347-92df178ae0cd is tainted: custom-monitor
Sep 30 18:14:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/809413651' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:14:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/809413651' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:14:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3447825556' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:14:58 compute-1 ceph-mon[75484]: pgmap v1088: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 7.9 KiB/s rd, 35 KiB/s wr, 11 op/s
Sep 30 18:14:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:14:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:58 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:14:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:14:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:14:59.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:14:59 compute-1 nova_compute[238822]: 2025-09-30 18:14:59.489 2 INFO nova.virt.libvirt.driver [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:14:59 compute-1 nova_compute[238822]: 2025-09-30 18:14:59.595 2 DEBUG nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:14:59 compute-1 nova_compute[238822]: 2025-09-30 18:14:59.595 2 DEBUG nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:14:59 compute-1 nova_compute[238822]: 2025-09-30 18:14:59.595 2 DEBUG nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No VIF found with MAC fa:16:3e:f3:96:49, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:14:59 compute-1 nova_compute[238822]: 2025-09-30 18:14:59.596 2 INFO nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Using config drive
Sep 30 18:14:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:14:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:59 compute-1 kernel: tapb4130889-fd: entered promiscuous mode
Sep 30 18:14:59 compute-1 NetworkManager[45549]: <info>  [1759256099.7154] manager: (tapb4130889-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Sep 30 18:14:59 compute-1 ovn_controller[135204]: 2025-09-30T18:14:59Z|00068|binding|INFO|Claiming lport b4130889-fd6e-44b4-8184-b79693b30d78 for this chassis.
Sep 30 18:14:59 compute-1 ovn_controller[135204]: 2025-09-30T18:14:59Z|00069|binding|INFO|b4130889-fd6e-44b4-8184-b79693b30d78: Claiming fa:16:3e:f3:96:49 10.100.0.6
Sep 30 18:14:59 compute-1 nova_compute[238822]: 2025-09-30 18:14:59.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.728 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:96:49 10.100.0.6'], port_security=['fa:16:3e:f3:96:49 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '28ad2702-2baf-4865-be24-c468842cee03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddd1f985d8b64b449c79d55b0cbd6422', 'neutron:revision_number': '9', 'neutron:security_group_ids': '34f3cf7b-94cf-408f-b3dc-ae0b57c009fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12c18a77-b252-4a3e-a181-b42644879446, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=b4130889-fd6e-44b4-8184-b79693b30d78) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:14:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:14:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.729 144543 INFO neutron.agent.ovn.metadata.agent [-] Port b4130889-fd6e-44b4-8184-b79693b30d78 in datapath 5fff1904-159a-4b76-8c46-feabf17f29ab bound to our chassis
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.731 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5fff1904-159a-4b76-8c46-feabf17f29ab
Sep 30 18:14:59 compute-1 systemd-udevd[273255]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:14:59 compute-1 ovn_controller[135204]: 2025-09-30T18:14:59Z|00070|binding|INFO|Setting lport b4130889-fd6e-44b4-8184-b79693b30d78 ovn-installed in OVS
Sep 30 18:14:59 compute-1 ovn_controller[135204]: 2025-09-30T18:14:59Z|00071|binding|INFO|Setting lport b4130889-fd6e-44b4-8184-b79693b30d78 up in Southbound
Sep 30 18:14:59 compute-1 nova_compute[238822]: 2025-09-30 18:14:59.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:59 compute-1 nova_compute[238822]: 2025-09-30 18:14:59.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.760 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c66d54-7562-4465-8554-333062008c1f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:59 compute-1 NetworkManager[45549]: <info>  [1759256099.7695] device (tapb4130889-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:14:59 compute-1 NetworkManager[45549]: <info>  [1759256099.7702] device (tapb4130889-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:14:59 compute-1 systemd-machined[195911]: New machine qemu-6-instance-00000008.
Sep 30 18:14:59 compute-1 systemd[1]: Started Virtual Machine qemu-6-instance-00000008.
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.805 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[cb73aeee-e095-4012-97f6-eb2bcd57f440]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.808 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[dde69836-634d-4179-b862-c0a4b753a6eb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:14:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:14:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:14:59.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.848 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[6c421b4b-ebae-4406-806a-f35f1efe32a0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.869 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[cf8296e5-a2c2-42dc-b39e-22d96a53270a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5fff1904-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:07:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 27, 'tx_packets': 12, 'rx_bytes': 1630, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 27, 'tx_packets': 12, 'rx_bytes': 1630, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1377603, 'reachable_time': 37514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273269, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.893 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[60e17715-5c1d-4514-a72e-9f8e1b6d17f6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377619, 'tstamp': 1377619}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273270, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377623, 'tstamp': 1377623}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273270, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.896 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5fff1904-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:59 compute-1 nova_compute[238822]: 2025-09-30 18:14:59.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:59 compute-1 nova_compute[238822]: 2025-09-30 18:14:59.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.901 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5fff1904-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.902 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.902 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5fff1904-10, col_values=(('external_ids', {'iface-id': '3a8ea0a0-c179-4516-9404-04b68a17e79e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.902 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:14:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:14:59.904 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5451620d-a3aa-48dd-9f76-728f98d8fc2b]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-5fff1904-159a-4b76-8c46-feabf17f29ab\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 5fff1904-159a-4b76-8c46-feabf17f29ab\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:14:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:14:59 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:00 compute-1 nova_compute[238822]: 2025-09-30 18:15:00.160 2 DEBUG nova.compute.manager [req-7ef3efbc-d7af-4cf9-9d1f-26fa4543b221 req-29b25f26-a82b-4576-a599-34ef0558ca6c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received event network-vif-plugged-b4130889-fd6e-44b4-8184-b79693b30d78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:00 compute-1 nova_compute[238822]: 2025-09-30 18:15:00.161 2 DEBUG oslo_concurrency.lockutils [req-7ef3efbc-d7af-4cf9-9d1f-26fa4543b221 req-29b25f26-a82b-4576-a599-34ef0558ca6c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "28ad2702-2baf-4865-be24-c468842cee03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:00 compute-1 nova_compute[238822]: 2025-09-30 18:15:00.161 2 DEBUG oslo_concurrency.lockutils [req-7ef3efbc-d7af-4cf9-9d1f-26fa4543b221 req-29b25f26-a82b-4576-a599-34ef0558ca6c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:00 compute-1 nova_compute[238822]: 2025-09-30 18:15:00.161 2 DEBUG oslo_concurrency.lockutils [req-7ef3efbc-d7af-4cf9-9d1f-26fa4543b221 req-29b25f26-a82b-4576-a599-34ef0558ca6c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:00 compute-1 nova_compute[238822]: 2025-09-30 18:15:00.168 2 DEBUG nova.compute.manager [req-7ef3efbc-d7af-4cf9-9d1f-26fa4543b221 req-29b25f26-a82b-4576-a599-34ef0558ca6c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] No waiting events found dispatching network-vif-plugged-b4130889-fd6e-44b4-8184-b79693b30d78 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:15:00 compute-1 nova_compute[238822]: 2025-09-30 18:15:00.169 2 WARNING nova.compute.manager [req-7ef3efbc-d7af-4cf9-9d1f-26fa4543b221 req-29b25f26-a82b-4576-a599-34ef0558ca6c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received unexpected event network-vif-plugged-b4130889-fd6e-44b4-8184-b79693b30d78 for instance with vm_state active and task_state resize_finish.
Sep 30 18:15:00 compute-1 sshd-session[273153]: Failed password for root from 103.153.190.105 port 37514 ssh2
Sep 30 18:15:00 compute-1 nova_compute[238822]: 2025-09-30 18:15:00.495 2 INFO nova.virt.libvirt.driver [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:15:00 compute-1 nova_compute[238822]: 2025-09-30 18:15:00.502 2 DEBUG nova.compute.manager [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:15:00 compute-1 podman[273273]: 2025-09-30 18:15:00.54815594 +0000 UTC m=+0.083107561 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 18:15:00 compute-1 podman[273282]: 2025-09-30 18:15:00.551749877 +0000 UTC m=+0.084293213 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 18:15:00 compute-1 podman[273281]: 2025-09-30 18:15:00.554267846 +0000 UTC m=+0.088854837 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Sep 30 18:15:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:00 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:01 compute-1 nova_compute[238822]: 2025-09-30 18:15:01.017 2 DEBUG nova.objects.instance [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:15:01 compute-1 nova_compute[238822]: 2025-09-30 18:15:01.100 2 DEBUG nova.compute.manager [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:15:01 compute-1 nova_compute[238822]: 2025-09-30 18:15:01.106 2 INFO nova.virt.libvirt.driver [-] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Instance running successfully.
Sep 30 18:15:01 compute-1 virtqemud[239124]: argument unsupported: QEMU guest agent is not configured
Sep 30 18:15:01 compute-1 nova_compute[238822]: 2025-09-30 18:15:01.109 2 DEBUG nova.virt.libvirt.guest [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 18:15:01 compute-1 nova_compute[238822]: 2025-09-30 18:15:01.109 2 DEBUG nova.virt.libvirt.driver [None req-9b5e76aa-f784-4d18-9b10-b48d08620a4e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Sep 30 18:15:01 compute-1 sshd-session[273153]: Received disconnect from 103.153.190.105 port 37514:11: Bye Bye [preauth]
Sep 30 18:15:01 compute-1 sshd-session[273153]: Disconnected from authenticating user root 103.153.190.105 port 37514 [preauth]
Sep 30 18:15:01 compute-1 ceph-mon[75484]: pgmap v1089: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.8 KiB/s wr, 25 op/s
Sep 30 18:15:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:01.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:01.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:01 compute-1 nova_compute[238822]: 2025-09-30 18:15:01.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:01 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:02 compute-1 nova_compute[238822]: 2025-09-30 18:15:02.037 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:15:02 compute-1 nova_compute[238822]: 2025-09-30 18:15:02.301 2 DEBUG nova.compute.manager [req-8d956e17-8bf7-4b5c-9958-64a48d45da32 req-631c0d69-65b2-49d8-a6c8-895441989a06 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received event network-vif-plugged-b4130889-fd6e-44b4-8184-b79693b30d78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:02 compute-1 nova_compute[238822]: 2025-09-30 18:15:02.302 2 DEBUG oslo_concurrency.lockutils [req-8d956e17-8bf7-4b5c-9958-64a48d45da32 req-631c0d69-65b2-49d8-a6c8-895441989a06 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "28ad2702-2baf-4865-be24-c468842cee03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:02 compute-1 nova_compute[238822]: 2025-09-30 18:15:02.302 2 DEBUG oslo_concurrency.lockutils [req-8d956e17-8bf7-4b5c-9958-64a48d45da32 req-631c0d69-65b2-49d8-a6c8-895441989a06 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:02 compute-1 nova_compute[238822]: 2025-09-30 18:15:02.302 2 DEBUG oslo_concurrency.lockutils [req-8d956e17-8bf7-4b5c-9958-64a48d45da32 req-631c0d69-65b2-49d8-a6c8-895441989a06 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:02 compute-1 nova_compute[238822]: 2025-09-30 18:15:02.302 2 DEBUG nova.compute.manager [req-8d956e17-8bf7-4b5c-9958-64a48d45da32 req-631c0d69-65b2-49d8-a6c8-895441989a06 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] No waiting events found dispatching network-vif-plugged-b4130889-fd6e-44b4-8184-b79693b30d78 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:15:02 compute-1 nova_compute[238822]: 2025-09-30 18:15:02.303 2 WARNING nova.compute.manager [req-8d956e17-8bf7-4b5c-9958-64a48d45da32 req-631c0d69-65b2-49d8-a6c8-895441989a06 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received unexpected event network-vif-plugged-b4130889-fd6e-44b4-8184-b79693b30d78 for instance with vm_state resized and task_state None.
Sep 30 18:15:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:02 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:02 compute-1 nova_compute[238822]: 2025-09-30 18:15:02.824 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:15:02 compute-1 nova_compute[238822]: 2025-09-30 18:15:02.824 2 WARNING neutronclient.v2_0.client [None req-130589fd-3f3b-4e0f-9ec5-747583c89756 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:15:03 compute-1 nova_compute[238822]: 2025-09-30 18:15:03.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:03 compute-1 ceph-mon[75484]: pgmap v1090: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.8 KiB/s wr, 25 op/s
Sep 30 18:15:03 compute-1 sudo[273376]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:15:03 compute-1 sudo[273376]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:15:03 compute-1 sudo[273376]: pam_unix(sudo:session): session closed for user root
Sep 30 18:15:03 compute-1 sudo[273401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:15:03 compute-1 sudo[273401]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:15:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:03.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:15:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:03.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:03 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:04 compute-1 sudo[273401]: pam_unix(sudo:session): session closed for user root
Sep 30 18:15:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:15:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:15:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:15:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:15:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:15:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:15:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:15:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:04 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:05 compute-1 ceph-mon[75484]: pgmap v1091: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 2.3 MiB/s rd, 1.8 KiB/s wr, 105 op/s
Sep 30 18:15:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3556645831' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:05.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:05 compute-1 podman[249638]: time="2025-09-30T18:15:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:15:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:15:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39665 "" "Go-http-client/1.1"
Sep 30 18:15:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:15:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9287 "" "Go-http-client/1.1"
Sep 30 18:15:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:05.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:05 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:06 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:06 compute-1 nova_compute[238822]: 2025-09-30 18:15:06.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:07 compute-1 ceph-mon[75484]: pgmap v1092: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 1.7 KiB/s wr, 96 op/s
Sep 30 18:15:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:07.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:07.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:07 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:08 compute-1 nova_compute[238822]: 2025-09-30 18:15:08.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:15:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:15:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:08 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:08 compute-1 sshd-session[273463]: Invalid user hex from 216.10.242.161 port 56662
Sep 30 18:15:08 compute-1 sshd-session[273463]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:15:08 compute-1 sshd-session[273463]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:15:09 compute-1 sudo[273467]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:15:09 compute-1 sudo[273467]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:15:09 compute-1 sudo[273467]: pam_unix(sudo:session): session closed for user root
Sep 30 18:15:09 compute-1 ceph-mon[75484]: pgmap v1093: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.5 KiB/s wr, 87 op/s
Sep 30 18:15:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3333474161' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:09 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:15:09 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:15:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:09.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:09.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:09 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e142 e142: 2 total, 2 up, 2 in
Sep 30 18:15:10 compute-1 ceph-mon[75484]: pgmap v1094: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 3.8 KiB/s wr, 88 op/s
Sep 30 18:15:10 compute-1 ceph-mon[75484]: osdmap e142: 2 total, 2 up, 2 in
Sep 30 18:15:10 compute-1 sudo[273494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:15:10 compute-1 sudo[273494]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:15:10 compute-1 sudo[273494]: pam_unix(sudo:session): session closed for user root
Sep 30 18:15:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:10 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:11 compute-1 sshd-session[273463]: Failed password for invalid user hex from 216.10.242.161 port 56662 ssh2
Sep 30 18:15:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3685291908' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:11.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:11.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:11 compute-1 nova_compute[238822]: 2025-09-30 18:15:11.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:11 compute-1 sshd-session[273463]: Received disconnect from 216.10.242.161 port 56662:11: Bye Bye [preauth]
Sep 30 18:15:11 compute-1 sshd-session[273463]: Disconnected from invalid user hex 216.10.242.161 port 56662 [preauth]
Sep 30 18:15:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:11 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:11 compute-1 unix_chkpwd[273521]: password check failed for user (root)
Sep 30 18:15:11 compute-1 sshd-session[273492]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:15:12 compute-1 ceph-mon[75484]: pgmap v1096: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 2.3 MiB/s rd, 3.0 KiB/s wr, 87 op/s
Sep 30 18:15:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:12 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:13 compute-1 nova_compute[238822]: 2025-09-30 18:15:13.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:13.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:15:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:15:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:13.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:15:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:13 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:14 compute-1 sshd-session[273492]: Failed password for root from 192.210.160.141 port 37228 ssh2
Sep 30 18:15:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:14 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:14 compute-1 ceph-mon[75484]: pgmap v1097: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 421 KiB/s rd, 17 KiB/s wr, 51 op/s
Sep 30 18:15:14 compute-1 ovn_controller[135204]: 2025-09-30T18:15:14Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f3:96:49 10.100.0.6
Sep 30 18:15:15 compute-1 sshd-session[273492]: Connection closed by authenticating user root 192.210.160.141 port 37228 [preauth]
Sep 30 18:15:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:15.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:15 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 e143: 2 total, 2 up, 2 in
Sep 30 18:15:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:15:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:15.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:15:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:15 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:16 compute-1 ceph-mon[75484]: osdmap e143: 2 total, 2 up, 2 in
Sep 30 18:15:16 compute-1 ceph-mon[75484]: pgmap v1099: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 526 KiB/s rd, 22 KiB/s wr, 64 op/s
Sep 30 18:15:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:16 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:16 compute-1 nova_compute[238822]: 2025-09-30 18:15:16.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:17.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:17.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:17 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:18 compute-1 nova_compute[238822]: 2025-09-30 18:15:18.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:18 compute-1 sshd-session[273530]: Invalid user sales from 84.51.43.58 port 43992
Sep 30 18:15:18 compute-1 sshd-session[273530]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:15:18 compute-1 sshd-session[273530]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:15:18 compute-1 sshd-session[273533]: Invalid user hex from 14.225.167.110 port 52308
Sep 30 18:15:18 compute-1 sshd-session[273533]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:15:18 compute-1 sshd-session[273533]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:15:18 compute-1 podman[273537]: 2025-09-30 18:15:18.317242503 +0000 UTC m=+0.106774932 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:15:18 compute-1 podman[273536]: 2025-09-30 18:15:18.385027058 +0000 UTC m=+0.179096630 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2)
Sep 30 18:15:18 compute-1 nova_compute[238822]: 2025-09-30 18:15:18.477 2 DEBUG oslo_concurrency.lockutils [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:18 compute-1 nova_compute[238822]: 2025-09-30 18:15:18.477 2 DEBUG oslo_concurrency.lockutils [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:18 compute-1 nova_compute[238822]: 2025-09-30 18:15:18.478 2 DEBUG oslo_concurrency.lockutils [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:18 compute-1 nova_compute[238822]: 2025-09-30 18:15:18.478 2 DEBUG oslo_concurrency.lockutils [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:18 compute-1 nova_compute[238822]: 2025-09-30 18:15:18.478 2 DEBUG oslo_concurrency.lockutils [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:18 compute-1 nova_compute[238822]: 2025-09-30 18:15:18.502 2 INFO nova.compute.manager [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Terminating instance
Sep 30 18:15:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:15:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:18 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.022 2 DEBUG nova.compute.manager [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:15:19 compute-1 kernel: tap2b9945fb-1c (unregistering): left promiscuous mode
Sep 30 18:15:19 compute-1 NetworkManager[45549]: <info>  [1759256119.0868] device (tap2b9945fb-1c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:15:19 compute-1 ovn_controller[135204]: 2025-09-30T18:15:19Z|00072|binding|INFO|Releasing lport 2b9945fb-1c9f-4952-9d4f-176df1016c31 from this chassis (sb_readonly=0)
Sep 30 18:15:19 compute-1 ovn_controller[135204]: 2025-09-30T18:15:19Z|00073|binding|INFO|Setting lport 2b9945fb-1c9f-4952-9d4f-176df1016c31 down in Southbound
Sep 30 18:15:19 compute-1 ovn_controller[135204]: 2025-09-30T18:15:19Z|00074|binding|INFO|Removing iface tap2b9945fb-1c ovn-installed in OVS
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.148 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:09:fa 10.100.0.4'], port_security=['fa:16:3e:33:09:fa 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a17c77e1-0404-4b3e-b04a-7d5a03566e47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddd1f985d8b64b449c79d55b0cbd6422', 'neutron:revision_number': '5', 'neutron:security_group_ids': '34f3cf7b-94cf-408f-b3dc-ae0b57c009fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12c18a77-b252-4a3e-a181-b42644879446, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=2b9945fb-1c9f-4952-9d4f-176df1016c31) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.150 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 2b9945fb-1c9f-4952-9d4f-176df1016c31 in datapath 5fff1904-159a-4b76-8c46-feabf17f29ab unbound from our chassis
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.152 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5fff1904-159a-4b76-8c46-feabf17f29ab
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:19 compute-1 ceph-mon[75484]: pgmap v1100: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 526 KiB/s rd, 18 KiB/s wr, 63 op/s
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.186 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ba650f2a-3fd2-4e9c-91dd-d026a2d6155f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:19 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Deactivated successfully.
Sep 30 18:15:19 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Consumed 16.131s CPU time.
Sep 30 18:15:19 compute-1 systemd-machined[195911]: Machine qemu-4-instance-00000009 terminated.
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.236 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[325830c8-e957-4b7c-b7f8-77beb6958823]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.242 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[63850f56-800d-4ee2-bef1-1854b361625b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.268 2 INFO nova.virt.libvirt.driver [-] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Instance destroyed successfully.
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.269 2 DEBUG nova.objects.instance [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lazy-loading 'resources' on Instance uuid a17c77e1-0404-4b3e-b04a-7d5a03566e47 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.280 2 DEBUG nova.compute.manager [req-cbb72d8c-b3a2-4521-8e89-de3d62389aba req-861043e1-b102-44c3-81a4-c223c4fc4689 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Received event network-vif-unplugged-2b9945fb-1c9f-4952-9d4f-176df1016c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.281 2 DEBUG oslo_concurrency.lockutils [req-cbb72d8c-b3a2-4521-8e89-de3d62389aba req-861043e1-b102-44c3-81a4-c223c4fc4689 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.281 2 DEBUG oslo_concurrency.lockutils [req-cbb72d8c-b3a2-4521-8e89-de3d62389aba req-861043e1-b102-44c3-81a4-c223c4fc4689 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.281 2 DEBUG oslo_concurrency.lockutils [req-cbb72d8c-b3a2-4521-8e89-de3d62389aba req-861043e1-b102-44c3-81a4-c223c4fc4689 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.282 2 DEBUG nova.compute.manager [req-cbb72d8c-b3a2-4521-8e89-de3d62389aba req-861043e1-b102-44c3-81a4-c223c4fc4689 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] No waiting events found dispatching network-vif-unplugged-2b9945fb-1c9f-4952-9d4f-176df1016c31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.282 2 DEBUG nova.compute.manager [req-cbb72d8c-b3a2-4521-8e89-de3d62389aba req-861043e1-b102-44c3-81a4-c223c4fc4689 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Received event network-vif-unplugged-2b9945fb-1c9f-4952-9d4f-176df1016c31 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.285 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[8cc18287-f9bb-401d-aa4c-70bd81e20ecc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.320 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c73f1f-f507-48f3-a9d7-bf4bb2b8cfba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5fff1904-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:07:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 34, 'tx_packets': 14, 'rx_bytes': 1924, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 34, 'tx_packets': 14, 'rx_bytes': 1924, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1377603, 'reachable_time': 37514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273607, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.344 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b469fd-748e-4b01-8461-4da737d3e2e2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377619, 'tstamp': 1377619}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273608, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377623, 'tstamp': 1377623}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273608, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.345 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5fff1904-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.355 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5fff1904-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.355 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.355 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5fff1904-10, col_values=(('external_ids', {'iface-id': '3a8ea0a0-c179-4516-9404-04b68a17e79e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.355 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:15:19 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:19.357 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ba91beff-800b-44ca-b7e7-58617cf7204f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-5fff1904-159a-4b76-8c46-feabf17f29ab\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 5fff1904-159a-4b76-8c46-feabf17f29ab\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:19 compute-1 openstack_network_exporter[251957]: ERROR   18:15:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:15:19 compute-1 openstack_network_exporter[251957]: ERROR   18:15:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:15:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:15:19 compute-1 openstack_network_exporter[251957]: ERROR   18:15:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:15:19 compute-1 openstack_network_exporter[251957]: ERROR   18:15:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:15:19 compute-1 openstack_network_exporter[251957]: ERROR   18:15:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:15:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:15:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:19.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:19 compute-1 sshd-session[273530]: Failed password for invalid user sales from 84.51.43.58 port 43992 ssh2
Sep 30 18:15:19 compute-1 sshd-session[273533]: Failed password for invalid user hex from 14.225.167.110 port 52308 ssh2
Sep 30 18:15:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/181519 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 18:15:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:19 compute-1 sshd-session[273530]: Received disconnect from 84.51.43.58 port 43992:11: Bye Bye [preauth]
Sep 30 18:15:19 compute-1 sshd-session[273530]: Disconnected from invalid user sales 84.51.43.58 port 43992 [preauth]
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.778 2 DEBUG nova.virt.libvirt.vif [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:14:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-673327438',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-673327438',id=9,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:14:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-pjsyk5hs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteActionsViaActuator-837729328-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:14:24Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=a17c77e1-0404-4b3e-b04a-7d5a03566e47,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "address": "fa:16:3e:33:09:fa", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b9945fb-1c", "ovs_interfaceid": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.779 2 DEBUG nova.network.os_vif_util [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converting VIF {"id": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "address": "fa:16:3e:33:09:fa", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b9945fb-1c", "ovs_interfaceid": "2b9945fb-1c9f-4952-9d4f-176df1016c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.780 2 DEBUG nova.network.os_vif_util [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:09:fa,bridge_name='br-int',has_traffic_filtering=True,id=2b9945fb-1c9f-4952-9d4f-176df1016c31,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b9945fb-1c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.781 2 DEBUG os_vif [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:09:fa,bridge_name='br-int',has_traffic_filtering=True,id=2b9945fb-1c9f-4952-9d4f-176df1016c31,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b9945fb-1c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.785 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2b9945fb-1c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.791 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=04a1b748-1af6-4086-b282-a5287206f5be) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:19 compute-1 nova_compute[238822]: 2025-09-30 18:15:19.797 2 INFO os_vif [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:09:fa,bridge_name='br-int',has_traffic_filtering=True,id=2b9945fb-1c9f-4952-9d4f-176df1016c31,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b9945fb-1c')
Sep 30 18:15:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:19.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:19 compute-1 sshd-session[273533]: Received disconnect from 14.225.167.110 port 52308:11: Bye Bye [preauth]
Sep 30 18:15:19 compute-1 sshd-session[273533]: Disconnected from invalid user hex 14.225.167.110 port 52308 [preauth]
Sep 30 18:15:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:19 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:20 compute-1 nova_compute[238822]: 2025-09-30 18:15:20.263 2 INFO nova.virt.libvirt.driver [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Deleting instance files /var/lib/nova/instances/a17c77e1-0404-4b3e-b04a-7d5a03566e47_del
Sep 30 18:15:20 compute-1 nova_compute[238822]: 2025-09-30 18:15:20.264 2 INFO nova.virt.libvirt.driver [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Deletion of /var/lib/nova/instances/a17c77e1-0404-4b3e-b04a-7d5a03566e47_del complete
Sep 30 18:15:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:20 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:20 compute-1 nova_compute[238822]: 2025-09-30 18:15:20.777 2 INFO nova.compute.manager [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Took 1.75 seconds to destroy the instance on the hypervisor.
Sep 30 18:15:20 compute-1 nova_compute[238822]: 2025-09-30 18:15:20.777 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:15:20 compute-1 nova_compute[238822]: 2025-09-30 18:15:20.778 2 DEBUG nova.compute.manager [-] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:15:20 compute-1 nova_compute[238822]: 2025-09-30 18:15:20.778 2 DEBUG nova.network.neutron [-] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:15:20 compute-1 nova_compute[238822]: 2025-09-30 18:15:20.778 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:15:21 compute-1 ceph-mon[75484]: pgmap v1101: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 659 KiB/s rd, 15 KiB/s wr, 66 op/s
Sep 30 18:15:21 compute-1 nova_compute[238822]: 2025-09-30 18:15:21.360 2 DEBUG nova.compute.manager [req-2c475462-dd5d-4845-a219-5c31978849df req-b7b1a4d6-1bd6-425f-a296-b9c603a91cd5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Received event network-vif-unplugged-2b9945fb-1c9f-4952-9d4f-176df1016c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:21 compute-1 nova_compute[238822]: 2025-09-30 18:15:21.361 2 DEBUG oslo_concurrency.lockutils [req-2c475462-dd5d-4845-a219-5c31978849df req-b7b1a4d6-1bd6-425f-a296-b9c603a91cd5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:21 compute-1 nova_compute[238822]: 2025-09-30 18:15:21.361 2 DEBUG oslo_concurrency.lockutils [req-2c475462-dd5d-4845-a219-5c31978849df req-b7b1a4d6-1bd6-425f-a296-b9c603a91cd5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:21 compute-1 nova_compute[238822]: 2025-09-30 18:15:21.361 2 DEBUG oslo_concurrency.lockutils [req-2c475462-dd5d-4845-a219-5c31978849df req-b7b1a4d6-1bd6-425f-a296-b9c603a91cd5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:21 compute-1 nova_compute[238822]: 2025-09-30 18:15:21.361 2 DEBUG nova.compute.manager [req-2c475462-dd5d-4845-a219-5c31978849df req-b7b1a4d6-1bd6-425f-a296-b9c603a91cd5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] No waiting events found dispatching network-vif-unplugged-2b9945fb-1c9f-4952-9d4f-176df1016c31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:15:21 compute-1 nova_compute[238822]: 2025-09-30 18:15:21.362 2 DEBUG nova.compute.manager [req-2c475462-dd5d-4845-a219-5c31978849df req-b7b1a4d6-1bd6-425f-a296-b9c603a91cd5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Received event network-vif-unplugged-2b9945fb-1c9f-4952-9d4f-176df1016c31 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:15:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:21.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:21 compute-1 nova_compute[238822]: 2025-09-30 18:15:21.527 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:15:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:21.834 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:15:21 compute-1 nova_compute[238822]: 2025-09-30 18:15:21.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:21.835 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:15:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:21.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:21 compute-1 nova_compute[238822]: 2025-09-30 18:15:21.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:21 compute-1 ovn_controller[135204]: 2025-09-30T18:15:21Z|00075|binding|INFO|Releasing lport 3a8ea0a0-c179-4516-9404-04b68a17e79e from this chassis (sb_readonly=0)
Sep 30 18:15:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:21 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:21 compute-1 nova_compute[238822]: 2025-09-30 18:15:21.941 2 DEBUG nova.compute.manager [req-f84e97d8-0bbd-4643-ab99-6b2355505848 req-d66d69d0-ab30-4783-a516-e6c93614a270 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Received event network-vif-deleted-2b9945fb-1c9f-4952-9d4f-176df1016c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:21 compute-1 nova_compute[238822]: 2025-09-30 18:15:21.942 2 INFO nova.compute.manager [req-f84e97d8-0bbd-4643-ab99-6b2355505848 req-d66d69d0-ab30-4783-a516-e6c93614a270 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Neutron deleted interface 2b9945fb-1c9f-4952-9d4f-176df1016c31; detaching it from the instance and deleting it from the info cache
Sep 30 18:15:21 compute-1 nova_compute[238822]: 2025-09-30 18:15:21.942 2 DEBUG nova.network.neutron [req-f84e97d8-0bbd-4643-ab99-6b2355505848 req-d66d69d0-ab30-4783-a516-e6c93614a270 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:15:22 compute-1 nova_compute[238822]: 2025-09-30 18:15:22.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:15:22 compute-1 nova_compute[238822]: 2025-09-30 18:15:22.410 2 DEBUG nova.network.neutron [-] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:15:22 compute-1 nova_compute[238822]: 2025-09-30 18:15:22.455 2 DEBUG nova.compute.manager [req-f84e97d8-0bbd-4643-ab99-6b2355505848 req-d66d69d0-ab30-4783-a516-e6c93614a270 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Detach interface failed, port_id=2b9945fb-1c9f-4952-9d4f-176df1016c31, reason: Instance a17c77e1-0404-4b3e-b04a-7d5a03566e47 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:15:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:22 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:22 compute-1 nova_compute[238822]: 2025-09-30 18:15:22.917 2 INFO nova.compute.manager [-] [instance: a17c77e1-0404-4b3e-b04a-7d5a03566e47] Took 2.14 seconds to deallocate network for instance.
Sep 30 18:15:23 compute-1 ceph-mon[75484]: pgmap v1102: 353 pgs: 353 active+clean; 519 MiB data, 472 MiB used, 40 GiB / 40 GiB avail; 648 KiB/s rd, 15 KiB/s wr, 65 op/s
Sep 30 18:15:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:15:23 compute-1 nova_compute[238822]: 2025-09-30 18:15:23.443 2 DEBUG oslo_concurrency.lockutils [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:23 compute-1 nova_compute[238822]: 2025-09-30 18:15:23.444 2 DEBUG oslo_concurrency.lockutils [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:23.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:23 compute-1 nova_compute[238822]: 2025-09-30 18:15:23.555 2 DEBUG oslo_concurrency.processutils [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:15:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:15:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:23.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:23 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:15:24 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2844608554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:24 compute-1 nova_compute[238822]: 2025-09-30 18:15:24.028 2 DEBUG oslo_concurrency.processutils [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:15:24 compute-1 nova_compute[238822]: 2025-09-30 18:15:24.036 2 DEBUG nova.compute.provider_tree [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:15:24 compute-1 nova_compute[238822]: 2025-09-30 18:15:24.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:15:24 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2844608554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:24 compute-1 nova_compute[238822]: 2025-09-30 18:15:24.553 2 DEBUG nova.scheduler.client.report [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:15:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:24 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:24 compute-1 nova_compute[238822]: 2025-09-30 18:15:24.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:25 compute-1 nova_compute[238822]: 2025-09-30 18:15:25.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:15:25 compute-1 nova_compute[238822]: 2025-09-30 18:15:25.056 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:15:25 compute-1 nova_compute[238822]: 2025-09-30 18:15:25.065 2 DEBUG oslo_concurrency.lockutils [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.621s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:25 compute-1 nova_compute[238822]: 2025-09-30 18:15:25.091 2 INFO nova.scheduler.client.report [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Deleted allocations for instance a17c77e1-0404-4b3e-b04a-7d5a03566e47
Sep 30 18:15:25 compute-1 ceph-mon[75484]: pgmap v1103: 353 pgs: 353 active+clean; 442 MiB data, 434 MiB used, 40 GiB / 40 GiB avail; 250 KiB/s rd, 13 KiB/s wr, 48 op/s
Sep 30 18:15:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:25.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:25.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:25 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:26 compute-1 nova_compute[238822]: 2025-09-30 18:15:26.123 2 DEBUG oslo_concurrency.lockutils [None req-ef8fa380-c240-49d3-9327-79f8be69f50d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "a17c77e1-0404-4b3e-b04a-7d5a03566e47" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.646s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:26 compute-1 podman[273660]: 2025-09-30 18:15:26.529110233 +0000 UTC m=+0.070016176 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 18:15:26 compute-1 nova_compute[238822]: 2025-09-30 18:15:26.603 2 DEBUG oslo_concurrency.lockutils [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "28ad2702-2baf-4865-be24-c468842cee03" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:26 compute-1 nova_compute[238822]: 2025-09-30 18:15:26.603 2 DEBUG oslo_concurrency.lockutils [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:26 compute-1 nova_compute[238822]: 2025-09-30 18:15:26.604 2 DEBUG oslo_concurrency.lockutils [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "28ad2702-2baf-4865-be24-c468842cee03-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:26 compute-1 nova_compute[238822]: 2025-09-30 18:15:26.604 2 DEBUG oslo_concurrency.lockutils [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:26 compute-1 nova_compute[238822]: 2025-09-30 18:15:26.605 2 DEBUG oslo_concurrency.lockutils [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:26 compute-1 nova_compute[238822]: 2025-09-30 18:15:26.618 2 INFO nova.compute.manager [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Terminating instance
Sep 30 18:15:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:26 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:26 compute-1 nova_compute[238822]: 2025-09-30 18:15:26.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.134 2 DEBUG nova.compute.manager [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:15:27 compute-1 kernel: tapb4130889-fd (unregistering): left promiscuous mode
Sep 30 18:15:27 compute-1 NetworkManager[45549]: <info>  [1759256127.1902] device (tapb4130889-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:27 compute-1 ovn_controller[135204]: 2025-09-30T18:15:27Z|00076|binding|INFO|Releasing lport b4130889-fd6e-44b4-8184-b79693b30d78 from this chassis (sb_readonly=0)
Sep 30 18:15:27 compute-1 ovn_controller[135204]: 2025-09-30T18:15:27Z|00077|binding|INFO|Setting lport b4130889-fd6e-44b4-8184-b79693b30d78 down in Southbound
Sep 30 18:15:27 compute-1 ovn_controller[135204]: 2025-09-30T18:15:27Z|00078|binding|INFO|Removing iface tapb4130889-fd ovn-installed in OVS
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.212 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:96:49 10.100.0.6'], port_security=['fa:16:3e:f3:96:49 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '28ad2702-2baf-4865-be24-c468842cee03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddd1f985d8b64b449c79d55b0cbd6422', 'neutron:revision_number': '10', 'neutron:security_group_ids': '34f3cf7b-94cf-408f-b3dc-ae0b57c009fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12c18a77-b252-4a3e-a181-b42644879446, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=b4130889-fd6e-44b4-8184-b79693b30d78) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.213 144543 INFO neutron.agent.ovn.metadata.agent [-] Port b4130889-fd6e-44b4-8184-b79693b30d78 in datapath 5fff1904-159a-4b76-8c46-feabf17f29ab unbound from our chassis
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.215 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5fff1904-159a-4b76-8c46-feabf17f29ab
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.235 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b42dbff3-e8cd-466d-8e43-ad3c4e235e76]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:27 compute-1 ceph-mon[75484]: pgmap v1104: 353 pgs: 353 active+clean; 442 MiB data, 434 MiB used, 40 GiB / 40 GiB avail; 240 KiB/s rd, 12 KiB/s wr, 46 op/s
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.286 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[92ace95b-5a1a-431f-a3b4-8b8d55b82a13]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:27 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Deactivated successfully.
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.290 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[76e32c2e-5335-42db-b9c6-2afdb3fe39ec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:27 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Consumed 14.991s CPU time.
Sep 30 18:15:27 compute-1 systemd-machined[195911]: Machine qemu-6-instance-00000008 terminated.
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.339 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[d07ec753-9283-46bc-bbc8-d04548340145]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.360 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d7c381cf-d0be-4b4e-bf3d-108720ea84c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5fff1904-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:07:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 16, 'rx_bytes': 2008, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 16, 'rx_bytes': 2008, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1377603, 'reachable_time': 37514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273692, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.377 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b03d6635-215d-41d3-99e1-b6db4077b3f1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377619, 'tstamp': 1377619}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273697, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377623, 'tstamp': 1377623}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273697, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.378 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5fff1904-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.385 2 INFO nova.virt.libvirt.driver [-] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Instance destroyed successfully.
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.385 2 DEBUG nova.objects.instance [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lazy-loading 'resources' on Instance uuid 28ad2702-2baf-4865-be24-c468842cee03 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.387 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5fff1904-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.388 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.388 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5fff1904-10, col_values=(('external_ids', {'iface-id': '3a8ea0a0-c179-4516-9404-04b68a17e79e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.389 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:15:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:27.390 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5ac311c0-ea4f-425b-b7a2-a0ced3b882a3]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-5fff1904-159a-4b76-8c46-feabf17f29ab\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 5fff1904-159a-4b76-8c46-feabf17f29ab\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:15:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:27.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:15:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:27.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.894 2 DEBUG nova.virt.libvirt.vif [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:13:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1899978059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1899978059',id=8,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:15:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-1d71qtf9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteActionsViaActuator-837729328-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:15:13Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=28ad2702-2baf-4865-be24-c468842cee03,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b4130889-fd6e-44b4-8184-b79693b30d78", "address": "fa:16:3e:f3:96:49", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4130889-fd", "ovs_interfaceid": "b4130889-fd6e-44b4-8184-b79693b30d78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.895 2 DEBUG nova.network.os_vif_util [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converting VIF {"id": "b4130889-fd6e-44b4-8184-b79693b30d78", "address": "fa:16:3e:f3:96:49", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4130889-fd", "ovs_interfaceid": "b4130889-fd6e-44b4-8184-b79693b30d78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.896 2 DEBUG nova.network.os_vif_util [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f3:96:49,bridge_name='br-int',has_traffic_filtering=True,id=b4130889-fd6e-44b4-8184-b79693b30d78,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4130889-fd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.896 2 DEBUG os_vif [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:96:49,bridge_name='br-int',has_traffic_filtering=True,id=b4130889-fd6e-44b4-8184-b79693b30d78,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4130889-fd') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.899 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb4130889-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.906 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=01fc1660-7c7f-4cb1-ba75-cb152b1a151f) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.911 2 INFO os_vif [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:96:49,bridge_name='br-int',has_traffic_filtering=True,id=b4130889-fd6e-44b4-8184-b79693b30d78,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4130889-fd')
Sep 30 18:15:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:27 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.947 2 DEBUG nova.compute.manager [req-70ff7a31-0525-4f5e-8ace-09889d7a93c6 req-28b851ae-a38f-46ba-9571-717ea4cf11ad 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received event network-vif-unplugged-b4130889-fd6e-44b4-8184-b79693b30d78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.948 2 DEBUG oslo_concurrency.lockutils [req-70ff7a31-0525-4f5e-8ace-09889d7a93c6 req-28b851ae-a38f-46ba-9571-717ea4cf11ad 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "28ad2702-2baf-4865-be24-c468842cee03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.948 2 DEBUG oslo_concurrency.lockutils [req-70ff7a31-0525-4f5e-8ace-09889d7a93c6 req-28b851ae-a38f-46ba-9571-717ea4cf11ad 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.949 2 DEBUG oslo_concurrency.lockutils [req-70ff7a31-0525-4f5e-8ace-09889d7a93c6 req-28b851ae-a38f-46ba-9571-717ea4cf11ad 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.949 2 DEBUG nova.compute.manager [req-70ff7a31-0525-4f5e-8ace-09889d7a93c6 req-28b851ae-a38f-46ba-9571-717ea4cf11ad 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] No waiting events found dispatching network-vif-unplugged-b4130889-fd6e-44b4-8184-b79693b30d78 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:15:27 compute-1 nova_compute[238822]: 2025-09-30 18:15:27.950 2 DEBUG nova.compute.manager [req-70ff7a31-0525-4f5e-8ace-09889d7a93c6 req-28b851ae-a38f-46ba-9571-717ea4cf11ad 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received event network-vif-unplugged-b4130889-fd6e-44b4-8184-b79693b30d78 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:15:28 compute-1 nova_compute[238822]: 2025-09-30 18:15:28.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:15:28 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4047398325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:28 compute-1 nova_compute[238822]: 2025-09-30 18:15:28.491 2 INFO nova.virt.libvirt.driver [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Deleting instance files /var/lib/nova/instances/28ad2702-2baf-4865-be24-c468842cee03_del
Sep 30 18:15:28 compute-1 nova_compute[238822]: 2025-09-30 18:15:28.493 2 INFO nova.virt.libvirt.driver [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Deletion of /var/lib/nova/instances/28ad2702-2baf-4865-be24-c468842cee03_del complete
Sep 30 18:15:28 compute-1 nova_compute[238822]: 2025-09-30 18:15:28.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:28 compute-1 nova_compute[238822]: 2025-09-30 18:15:28.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:28 compute-1 nova_compute[238822]: 2025-09-30 18:15:28.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:28 compute-1 nova_compute[238822]: 2025-09-30 18:15:28.575 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:15:28 compute-1 nova_compute[238822]: 2025-09-30 18:15:28.575 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:15:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:15:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:28.836 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:29 compute-1 nova_compute[238822]: 2025-09-30 18:15:29.012 2 INFO nova.compute.manager [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Took 1.88 seconds to destroy the instance on the hypervisor.
Sep 30 18:15:29 compute-1 nova_compute[238822]: 2025-09-30 18:15:29.013 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:15:29 compute-1 nova_compute[238822]: 2025-09-30 18:15:29.013 2 DEBUG nova.compute.manager [-] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:15:29 compute-1 nova_compute[238822]: 2025-09-30 18:15:29.013 2 DEBUG nova.network.neutron [-] [instance: 28ad2702-2baf-4865-be24-c468842cee03] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:15:29 compute-1 nova_compute[238822]: 2025-09-30 18:15:29.014 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:15:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:15:29 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2554084427' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:29 compute-1 nova_compute[238822]: 2025-09-30 18:15:29.097 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:15:29 compute-1 ceph-mon[75484]: pgmap v1105: 353 pgs: 353 active+clean; 442 MiB data, 434 MiB used, 40 GiB / 40 GiB avail; 208 KiB/s rd, 11 KiB/s wr, 40 op/s
Sep 30 18:15:29 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2554084427' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:29 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 18:15:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:29.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:29 compute-1 nova_compute[238822]: 2025-09-30 18:15:29.461 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:15:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:29.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:29 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.006 2 DEBUG nova.compute.manager [req-be87338c-ff14-442b-a900-1642bd7089d2 req-b612c9fe-a6df-4cbc-8201-db9d3bb670eb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received event network-vif-unplugged-b4130889-fd6e-44b4-8184-b79693b30d78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.006 2 DEBUG oslo_concurrency.lockutils [req-be87338c-ff14-442b-a900-1642bd7089d2 req-b612c9fe-a6df-4cbc-8201-db9d3bb670eb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "28ad2702-2baf-4865-be24-c468842cee03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.007 2 DEBUG oslo_concurrency.lockutils [req-be87338c-ff14-442b-a900-1642bd7089d2 req-b612c9fe-a6df-4cbc-8201-db9d3bb670eb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.007 2 DEBUG oslo_concurrency.lockutils [req-be87338c-ff14-442b-a900-1642bd7089d2 req-b612c9fe-a6df-4cbc-8201-db9d3bb670eb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.007 2 DEBUG nova.compute.manager [req-be87338c-ff14-442b-a900-1642bd7089d2 req-b612c9fe-a6df-4cbc-8201-db9d3bb670eb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] No waiting events found dispatching network-vif-unplugged-b4130889-fd6e-44b4-8184-b79693b30d78 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.007 2 DEBUG nova.compute.manager [req-be87338c-ff14-442b-a900-1642bd7089d2 req-b612c9fe-a6df-4cbc-8201-db9d3bb670eb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received event network-vif-unplugged-b4130889-fd6e-44b4-8184-b79693b30d78 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.159 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.159 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.164 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000005 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.164 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000005 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.168 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.168 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:15:30 compute-1 ceph-mon[75484]: pgmap v1106: 353 pgs: 353 active+clean; 361 MiB data, 429 MiB used, 40 GiB / 40 GiB avail; 221 KiB/s rd, 14 KiB/s wr, 59 op/s
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.450 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.452 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.490 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.491 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4121MB free_disk=39.76438522338867GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.491 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.491 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:30 compute-1 sudo[273751]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:15:30 compute-1 sudo[273751]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:15:30 compute-1 sudo[273751]: pam_unix(sudo:session): session closed for user root
Sep 30 18:15:30 compute-1 nova_compute[238822]: 2025-09-30 18:15:30.564 2 DEBUG nova.network.neutron [-] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:15:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:30 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:31 compute-1 nova_compute[238822]: 2025-09-30 18:15:31.071 2 INFO nova.compute.manager [-] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Took 2.06 seconds to deallocate network for instance.
Sep 30 18:15:31 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2181834735' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:31.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:31 compute-1 podman[273780]: 2025-09-30 18:15:31.577519741 +0000 UTC m=+0.101290293 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Sep 30 18:15:31 compute-1 podman[273778]: 2025-09-30 18:15:31.583996336 +0000 UTC m=+0.109229888 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 18:15:31 compute-1 nova_compute[238822]: 2025-09-30 18:15:31.599 2 DEBUG oslo_concurrency.lockutils [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:31 compute-1 podman[273779]: 2025-09-30 18:15:31.619848907 +0000 UTC m=+0.139979811 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Sep 30 18:15:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:31.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:31 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:31 compute-1 nova_compute[238822]: 2025-09-30 18:15:31.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.118 2 DEBUG nova.compute.manager [req-b92cbc5b-a49e-432b-8bd5-f34840df693f req-e88454df-a13c-47f9-ab50-c783c797a903 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 28ad2702-2baf-4865-be24-c468842cee03] Received event network-vif-deleted-b4130889-fd6e-44b4-8184-b79693b30d78 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.141 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance dadc55d4-1578-4dc1-880a-08098fba63ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.141 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance aa43d689-5cfc-489b-9635-36978f36b08c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.142 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 23ad643b-d29f-4fe8-a347-92df178ae0cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.142 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 28ad2702-2baf-4865-be24-c468842cee03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.143 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.144 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=39GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:15:30 up  3:52,  0 user,  load average: 0.69, 0.97, 1.18\n', 'num_instances': '4', 'num_vm_active': '4', 'num_task_None': '3', 'num_os_type_None': '4', 'num_proj_ddd1f985d8b64b449c79d55b0cbd6422': '4', 'io_workload': '0', 'num_task_deleting': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.195 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing inventories for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.242 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating ProviderTree inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.242 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.261 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing aggregate associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.284 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing trait associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE41,HW_CPU_X86_ABM,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_TIS,HW_ARCH_X86_64,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 18:15:32 compute-1 ceph-mon[75484]: pgmap v1107: 353 pgs: 353 active+clean; 361 MiB data, 429 MiB used, 40 GiB / 40 GiB avail; 31 KiB/s rd, 14 KiB/s wr, 47 op/s
Sep 30 18:15:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:32 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 18:15:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:32 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.431 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:15:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:32 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:15:32 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1674352271' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.878 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.884 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:15:32 compute-1 nova_compute[238822]: 2025-09-30 18:15:32.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1674352271' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:33 compute-1 nova_compute[238822]: 2025-09-30 18:15:33.393 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:15:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:33.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:15:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:15:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:33.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:15:33 compute-1 nova_compute[238822]: 2025-09-30 18:15:33.905 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:15:33 compute-1 nova_compute[238822]: 2025-09-30 18:15:33.905 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.414s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:33 compute-1 nova_compute[238822]: 2025-09-30 18:15:33.906 2 DEBUG oslo_concurrency.lockutils [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 2.307s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:33 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:34 compute-1 nova_compute[238822]: 2025-09-30 18:15:34.009 2 DEBUG oslo_concurrency.processutils [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:15:34 compute-1 ceph-mon[75484]: pgmap v1108: 353 pgs: 353 active+clean; 360 MiB data, 383 MiB used, 40 GiB / 40 GiB avail; 39 KiB/s rd, 16 KiB/s wr, 58 op/s
Sep 30 18:15:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:15:34 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4161356519' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:34 compute-1 nova_compute[238822]: 2025-09-30 18:15:34.475 2 DEBUG oslo_concurrency.processutils [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:15:34 compute-1 nova_compute[238822]: 2025-09-30 18:15:34.483 2 DEBUG nova.compute.provider_tree [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:15:34 compute-1 sshd-session[273657]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:15:34 compute-1 sshd-session[273657]: banner exchange: Connection from 113.249.93.94 port 11080: Connection timed out
Sep 30 18:15:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:34 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:34 compute-1 nova_compute[238822]: 2025-09-30 18:15:34.995 2 DEBUG nova.scheduler.client.report [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:15:35 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4161356519' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:35 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 18:15:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:35.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:35 compute-1 nova_compute[238822]: 2025-09-30 18:15:35.512 2 DEBUG oslo_concurrency.lockutils [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.606s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:35 compute-1 nova_compute[238822]: 2025-09-30 18:15:35.568 2 INFO nova.scheduler.client.report [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Deleted allocations for instance 28ad2702-2baf-4865-be24-c468842cee03
Sep 30 18:15:35 compute-1 podman[249638]: time="2025-09-30T18:15:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:15:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:15:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39665 "" "Go-http-client/1.1"
Sep 30 18:15:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:15:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9288 "" "Go-http-client/1.1"
Sep 30 18:15:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:35.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:35 compute-1 nova_compute[238822]: 2025-09-30 18:15:35.903 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:15:35 compute-1 nova_compute[238822]: 2025-09-30 18:15:35.904 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:15:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:35 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:36 compute-1 ceph-mon[75484]: pgmap v1109: 353 pgs: 353 active+clean; 360 MiB data, 383 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 4.8 KiB/s wr, 30 op/s
Sep 30 18:15:36 compute-1 nova_compute[238822]: 2025-09-30 18:15:36.418 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:15:36 compute-1 nova_compute[238822]: 2025-09-30 18:15:36.419 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:15:36 compute-1 nova_compute[238822]: 2025-09-30 18:15:36.419 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:15:36 compute-1 nova_compute[238822]: 2025-09-30 18:15:36.605 2 DEBUG oslo_concurrency.lockutils [None req-ed61c4b0-e5c4-45a7-b245-a0fb9c377a8d dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "28ad2702-2baf-4865-be24-c468842cee03" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:36 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:36 compute-1 nova_compute[238822]: 2025-09-30 18:15:36.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:37 compute-1 sshd-session[273888]: Invalid user old from 194.107.115.65 port 16128
Sep 30 18:15:37 compute-1 sshd-session[273888]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:15:37 compute-1 sshd-session[273888]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 18:15:37 compute-1 sshd-session[273884]: Invalid user dev from 175.126.165.170 port 40296
Sep 30 18:15:37 compute-1 sshd-session[273884]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:15:37 compute-1 sshd-session[273884]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 18:15:37 compute-1 unix_chkpwd[273891]: password check failed for user (root)
Sep 30 18:15:37 compute-1 sshd-session[273883]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:15:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3076294399' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:15:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3076294399' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:15:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:15:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:37.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:37.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:37 compute-1 nova_compute[238822]: 2025-09-30 18:15:37.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:37 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:38 compute-1 ceph-mon[75484]: pgmap v1110: 353 pgs: 353 active+clean; 360 MiB data, 383 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 4.8 KiB/s wr, 30 op/s
Sep 30 18:15:38 compute-1 nova_compute[238822]: 2025-09-30 18:15:38.468 2 DEBUG oslo_concurrency.lockutils [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "aa43d689-5cfc-489b-9635-36978f36b08c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:38 compute-1 nova_compute[238822]: 2025-09-30 18:15:38.469 2 DEBUG oslo_concurrency.lockutils [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:38 compute-1 nova_compute[238822]: 2025-09-30 18:15:38.470 2 DEBUG oslo_concurrency.lockutils [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:38 compute-1 nova_compute[238822]: 2025-09-30 18:15:38.470 2 DEBUG oslo_concurrency.lockutils [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:38 compute-1 nova_compute[238822]: 2025-09-30 18:15:38.470 2 DEBUG oslo_concurrency.lockutils [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:38 compute-1 nova_compute[238822]: 2025-09-30 18:15:38.485 2 INFO nova.compute.manager [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Terminating instance
Sep 30 18:15:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:15:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:38 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:38 compute-1 sshd-session[273888]: Failed password for invalid user old from 194.107.115.65 port 16128 ssh2
Sep 30 18:15:38 compute-1 sshd-session[273884]: Failed password for invalid user dev from 175.126.165.170 port 40296 ssh2
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.005 2 DEBUG nova.compute.manager [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:15:39 compute-1 sshd-session[273883]: Failed password for root from 192.210.160.141 port 33534 ssh2
Sep 30 18:15:39 compute-1 kernel: tap9e86d507-89 (unregistering): left promiscuous mode
Sep 30 18:15:39 compute-1 NetworkManager[45549]: <info>  [1759256139.0825] device (tap9e86d507-89): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:39 compute-1 ovn_controller[135204]: 2025-09-30T18:15:39Z|00079|binding|INFO|Releasing lport 9e86d507-897e-4992-a0d4-ef24306047ab from this chassis (sb_readonly=0)
Sep 30 18:15:39 compute-1 ovn_controller[135204]: 2025-09-30T18:15:39Z|00080|binding|INFO|Setting lport 9e86d507-897e-4992-a0d4-ef24306047ab down in Southbound
Sep 30 18:15:39 compute-1 ovn_controller[135204]: 2025-09-30T18:15:39Z|00081|binding|INFO|Removing iface tap9e86d507-89 ovn-installed in OVS
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.105 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:eb:b4 10.100.0.10'], port_security=['fa:16:3e:f4:eb:b4 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'aa43d689-5cfc-489b-9635-36978f36b08c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddd1f985d8b64b449c79d55b0cbd6422', 'neutron:revision_number': '5', 'neutron:security_group_ids': '34f3cf7b-94cf-408f-b3dc-ae0b57c009fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12c18a77-b252-4a3e-a181-b42644879446, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=9e86d507-897e-4992-a0d4-ef24306047ab) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.106 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 9e86d507-897e-4992-a0d4-ef24306047ab in datapath 5fff1904-159a-4b76-8c46-feabf17f29ab unbound from our chassis
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.108 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5fff1904-159a-4b76-8c46-feabf17f29ab
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.130 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8a22cc8a-f131-4298-abd8-e1cacf25fb43]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:39 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Sep 30 18:15:39 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 19.615s CPU time.
Sep 30 18:15:39 compute-1 systemd-machined[195911]: Machine qemu-3-instance-00000007 terminated.
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.178 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[e688ea33-6ef3-49aa-afd1-b29519214b10]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.182 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[dfea6af1-696a-4eef-b382-f09ca9689081]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.239 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[3118d168-d2f8-4d35-9408-692a23f7ca5f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.251 2 INFO nova.virt.libvirt.driver [-] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Instance destroyed successfully.
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.252 2 DEBUG nova.objects.instance [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lazy-loading 'resources' on Instance uuid aa43d689-5cfc-489b-9635-36978f36b08c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.273 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a306dbf8-3f14-4071-bbf6-6eabbecf2f98]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5fff1904-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:07:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 18, 'rx_bytes': 2008, 'tx_bytes': 948, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 18, 'rx_bytes': 2008, 'tx_bytes': 948, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1377603, 'reachable_time': 37514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273913, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.294 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[bd94fc49-4428-4b15-8eba-126f9f008763]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377619, 'tstamp': 1377619}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273917, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377623, 'tstamp': 1377623}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273917, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.296 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5fff1904-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.304 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5fff1904-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.304 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.304 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5fff1904-10, col_values=(('external_ids', {'iface-id': '3a8ea0a0-c179-4516-9404-04b68a17e79e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.305 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:15:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:39.307 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[063ba04e-3059-46b9-8a50-855b855256a6]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-5fff1904-159a-4b76-8c46-feabf17f29ab\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 5fff1904-159a-4b76-8c46-feabf17f29ab\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:39 compute-1 sshd-session[273884]: Received disconnect from 175.126.165.170 port 40296:11: Bye Bye [preauth]
Sep 30 18:15:39 compute-1 sshd-session[273884]: Disconnected from invalid user dev 175.126.165.170 port 40296 [preauth]
Sep 30 18:15:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:39.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.571 2 DEBUG nova.compute.manager [req-f3931569-19d6-4c82-b6ea-e192959a01f2 req-3f0d400e-2e18-45a7-89eb-def71803ee59 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Received event network-vif-unplugged-9e86d507-897e-4992-a0d4-ef24306047ab external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.572 2 DEBUG oslo_concurrency.lockutils [req-f3931569-19d6-4c82-b6ea-e192959a01f2 req-3f0d400e-2e18-45a7-89eb-def71803ee59 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.572 2 DEBUG oslo_concurrency.lockutils [req-f3931569-19d6-4c82-b6ea-e192959a01f2 req-3f0d400e-2e18-45a7-89eb-def71803ee59 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.572 2 DEBUG oslo_concurrency.lockutils [req-f3931569-19d6-4c82-b6ea-e192959a01f2 req-3f0d400e-2e18-45a7-89eb-def71803ee59 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.573 2 DEBUG nova.compute.manager [req-f3931569-19d6-4c82-b6ea-e192959a01f2 req-3f0d400e-2e18-45a7-89eb-def71803ee59 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] No waiting events found dispatching network-vif-unplugged-9e86d507-897e-4992-a0d4-ef24306047ab pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.573 2 DEBUG nova.compute.manager [req-f3931569-19d6-4c82-b6ea-e192959a01f2 req-3f0d400e-2e18-45a7-89eb-def71803ee59 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Received event network-vif-unplugged-9e86d507-897e-4992-a0d4-ef24306047ab for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:15:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.759 2 DEBUG nova.virt.libvirt.vif [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:13:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1527130227',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1527130227',id=7,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:13:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-yipxsciv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteActionsViaActuator-837729328-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:13:37Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=aa43d689-5cfc-489b-9635-36978f36b08c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9e86d507-897e-4992-a0d4-ef24306047ab", "address": "fa:16:3e:f4:eb:b4", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e86d507-89", "ovs_interfaceid": "9e86d507-897e-4992-a0d4-ef24306047ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.760 2 DEBUG nova.network.os_vif_util [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converting VIF {"id": "9e86d507-897e-4992-a0d4-ef24306047ab", "address": "fa:16:3e:f4:eb:b4", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e86d507-89", "ovs_interfaceid": "9e86d507-897e-4992-a0d4-ef24306047ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.761 2 DEBUG nova.network.os_vif_util [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:eb:b4,bridge_name='br-int',has_traffic_filtering=True,id=9e86d507-897e-4992-a0d4-ef24306047ab,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e86d507-89') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.762 2 DEBUG os_vif [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:eb:b4,bridge_name='br-int',has_traffic_filtering=True,id=9e86d507-897e-4992-a0d4-ef24306047ab,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e86d507-89') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.765 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e86d507-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.771 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=7cc5d503-2af9-4952-b7a1-828049921236) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:39 compute-1 nova_compute[238822]: 2025-09-30 18:15:39.777 2 INFO os_vif [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:eb:b4,bridge_name='br-int',has_traffic_filtering=True,id=9e86d507-897e-4992-a0d4-ef24306047ab,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e86d507-89')
Sep 30 18:15:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:39.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:39 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:40 compute-1 sshd-session[273883]: Connection closed by authenticating user root 192.210.160.141 port 33534 [preauth]
Sep 30 18:15:40 compute-1 nova_compute[238822]: 2025-09-30 18:15:40.303 2 INFO nova.virt.libvirt.driver [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Deleting instance files /var/lib/nova/instances/aa43d689-5cfc-489b-9635-36978f36b08c_del
Sep 30 18:15:40 compute-1 nova_compute[238822]: 2025-09-30 18:15:40.304 2 INFO nova.virt.libvirt.driver [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Deletion of /var/lib/nova/instances/aa43d689-5cfc-489b-9635-36978f36b08c_del complete
Sep 30 18:15:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:40 compute-1 nova_compute[238822]: 2025-09-30 18:15:40.817 2 INFO nova.compute.manager [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Took 1.81 seconds to destroy the instance on the hypervisor.
Sep 30 18:15:40 compute-1 nova_compute[238822]: 2025-09-30 18:15:40.817 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:15:40 compute-1 nova_compute[238822]: 2025-09-30 18:15:40.817 2 DEBUG nova.compute.manager [-] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:15:40 compute-1 nova_compute[238822]: 2025-09-30 18:15:40.818 2 DEBUG nova.network.neutron [-] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:15:40 compute-1 nova_compute[238822]: 2025-09-30 18:15:40.818 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:15:40 compute-1 sshd-session[273888]: Received disconnect from 194.107.115.65 port 16128:11: Bye Bye [preauth]
Sep 30 18:15:40 compute-1 sshd-session[273888]: Disconnected from invalid user old 194.107.115.65 port 16128 [preauth]
Sep 30 18:15:40 compute-1 nova_compute[238822]: 2025-09-30 18:15:40.961 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:15:41 compute-1 ceph-mon[75484]: pgmap v1111: 353 pgs: 353 active+clean; 360 MiB data, 383 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 4.9 KiB/s wr, 30 op/s
Sep 30 18:15:41 compute-1 nova_compute[238822]: 2025-09-30 18:15:41.288 2 DEBUG nova.compute.manager [req-14c402a9-b31f-47a1-b756-134cd6186413 req-d77c65a8-9b18-4276-b9b0-f95df5cdeb7b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Received event network-vif-deleted-9e86d507-897e-4992-a0d4-ef24306047ab external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:41 compute-1 nova_compute[238822]: 2025-09-30 18:15:41.288 2 INFO nova.compute.manager [req-14c402a9-b31f-47a1-b756-134cd6186413 req-d77c65a8-9b18-4276-b9b0-f95df5cdeb7b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Neutron deleted interface 9e86d507-897e-4992-a0d4-ef24306047ab; detaching it from the instance and deleting it from the info cache
Sep 30 18:15:41 compute-1 nova_compute[238822]: 2025-09-30 18:15:41.288 2 DEBUG nova.network.neutron [req-14c402a9-b31f-47a1-b756-134cd6186413 req-d77c65a8-9b18-4276-b9b0-f95df5cdeb7b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:15:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.004000108s ======
Sep 30 18:15:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:41.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000108s
Sep 30 18:15:41 compute-1 nova_compute[238822]: 2025-09-30 18:15:41.626 2 DEBUG nova.compute.manager [req-a3a53371-2409-4692-95be-a5ca09b38b93 req-5f9e9aa2-1c91-4dd9-88ba-acd5cf7881e6 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Received event network-vif-unplugged-9e86d507-897e-4992-a0d4-ef24306047ab external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:41 compute-1 nova_compute[238822]: 2025-09-30 18:15:41.627 2 DEBUG oslo_concurrency.lockutils [req-a3a53371-2409-4692-95be-a5ca09b38b93 req-5f9e9aa2-1c91-4dd9-88ba-acd5cf7881e6 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:41 compute-1 nova_compute[238822]: 2025-09-30 18:15:41.627 2 DEBUG oslo_concurrency.lockutils [req-a3a53371-2409-4692-95be-a5ca09b38b93 req-5f9e9aa2-1c91-4dd9-88ba-acd5cf7881e6 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:41 compute-1 nova_compute[238822]: 2025-09-30 18:15:41.628 2 DEBUG oslo_concurrency.lockutils [req-a3a53371-2409-4692-95be-a5ca09b38b93 req-5f9e9aa2-1c91-4dd9-88ba-acd5cf7881e6 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:41 compute-1 nova_compute[238822]: 2025-09-30 18:15:41.628 2 DEBUG nova.compute.manager [req-a3a53371-2409-4692-95be-a5ca09b38b93 req-5f9e9aa2-1c91-4dd9-88ba-acd5cf7881e6 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] No waiting events found dispatching network-vif-unplugged-9e86d507-897e-4992-a0d4-ef24306047ab pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:15:41 compute-1 nova_compute[238822]: 2025-09-30 18:15:41.628 2 DEBUG nova.compute.manager [req-a3a53371-2409-4692-95be-a5ca09b38b93 req-5f9e9aa2-1c91-4dd9-88ba-acd5cf7881e6 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Received event network-vif-unplugged-9e86d507-897e-4992-a0d4-ef24306047ab for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:15:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/181541 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 18:15:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:41 compute-1 nova_compute[238822]: 2025-09-30 18:15:41.744 2 DEBUG nova.network.neutron [-] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:15:41 compute-1 nova_compute[238822]: 2025-09-30 18:15:41.796 2 DEBUG nova.compute.manager [req-14c402a9-b31f-47a1-b756-134cd6186413 req-d77c65a8-9b18-4276-b9b0-f95df5cdeb7b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Detach interface failed, port_id=9e86d507-897e-4992-a0d4-ef24306047ab, reason: Instance aa43d689-5cfc-489b-9635-36978f36b08c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:15:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:41.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:41 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:41 compute-1 nova_compute[238822]: 2025-09-30 18:15:41.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:42 compute-1 nova_compute[238822]: 2025-09-30 18:15:42.254 2 INFO nova.compute.manager [-] [instance: aa43d689-5cfc-489b-9635-36978f36b08c] Took 1.44 seconds to deallocate network for instance.
Sep 30 18:15:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:42 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:42 compute-1 nova_compute[238822]: 2025-09-30 18:15:42.779 2 DEBUG oslo_concurrency.lockutils [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:42 compute-1 nova_compute[238822]: 2025-09-30 18:15:42.780 2 DEBUG oslo_concurrency.lockutils [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:42 compute-1 nova_compute[238822]: 2025-09-30 18:15:42.854 2 DEBUG oslo_concurrency.processutils [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:15:43 compute-1 ceph-mon[75484]: pgmap v1112: 353 pgs: 353 active+clean; 360 MiB data, 383 MiB used, 40 GiB / 40 GiB avail; 8.1 KiB/s rd, 1.2 KiB/s wr, 11 op/s
Sep 30 18:15:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:15:43 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1018186174' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:43 compute-1 nova_compute[238822]: 2025-09-30 18:15:43.341 2 DEBUG oslo_concurrency.processutils [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:15:43 compute-1 nova_compute[238822]: 2025-09-30 18:15:43.348 2 DEBUG nova.compute.provider_tree [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:15:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:15:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:43.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:15:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:15:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:43 compute-1 nova_compute[238822]: 2025-09-30 18:15:43.859 2 DEBUG nova.scheduler.client.report [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:15:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:43.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:43 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:44 compute-1 unix_chkpwd[273967]: password check failed for user (root)
Sep 30 18:15:44 compute-1 sshd-session[273964]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104  user=root
Sep 30 18:15:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1018186174' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:44 compute-1 nova_compute[238822]: 2025-09-30 18:15:44.373 2 DEBUG oslo_concurrency.lockutils [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.593s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:44 compute-1 nova_compute[238822]: 2025-09-30 18:15:44.410 2 INFO nova.scheduler.client.report [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Deleted allocations for instance aa43d689-5cfc-489b-9635-36978f36b08c
Sep 30 18:15:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:44 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:44 compute-1 nova_compute[238822]: 2025-09-30 18:15:44.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:45 compute-1 ceph-mon[75484]: pgmap v1113: 353 pgs: 353 active+clean; 281 MiB data, 338 MiB used, 40 GiB / 40 GiB avail; 27 KiB/s rd, 2.4 KiB/s wr, 39 op/s
Sep 30 18:15:45 compute-1 nova_compute[238822]: 2025-09-30 18:15:45.443 2 DEBUG oslo_concurrency.lockutils [None req-84c70074-82f2-4270-a521-8bf434b9f2e1 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "aa43d689-5cfc-489b-9635-36978f36b08c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.973s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:45.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:45.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:45 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.205 2 DEBUG oslo_concurrency.lockutils [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "23ad643b-d29f-4fe8-a347-92df178ae0cd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.206 2 DEBUG oslo_concurrency.lockutils [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "23ad643b-d29f-4fe8-a347-92df178ae0cd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.206 2 DEBUG oslo_concurrency.lockutils [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "23ad643b-d29f-4fe8-a347-92df178ae0cd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.207 2 DEBUG oslo_concurrency.lockutils [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "23ad643b-d29f-4fe8-a347-92df178ae0cd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.207 2 DEBUG oslo_concurrency.lockutils [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "23ad643b-d29f-4fe8-a347-92df178ae0cd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.220 2 INFO nova.compute.manager [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Terminating instance
Sep 30 18:15:46 compute-1 sshd-session[273964]: Failed password for root from 107.172.146.104 port 55118 ssh2
Sep 30 18:15:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.737 2 DEBUG nova.compute.manager [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:15:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:46 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:46 compute-1 kernel: tap50f7398a-76 (unregistering): left promiscuous mode
Sep 30 18:15:46 compute-1 NetworkManager[45549]: <info>  [1759256146.7971] device (tap50f7398a-76): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:15:46 compute-1 ovn_controller[135204]: 2025-09-30T18:15:46Z|00082|binding|INFO|Releasing lport 50f7398a-769c-4636-b498-5162fce10f7d from this chassis (sb_readonly=0)
Sep 30 18:15:46 compute-1 ovn_controller[135204]: 2025-09-30T18:15:46Z|00083|binding|INFO|Setting lport 50f7398a-769c-4636-b498-5162fce10f7d down in Southbound
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:46 compute-1 ovn_controller[135204]: 2025-09-30T18:15:46Z|00084|binding|INFO|Removing iface tap50f7398a-76 ovn-installed in OVS
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.818 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:86:73 10.100.0.9'], port_security=['fa:16:3e:d1:86:73 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '23ad643b-d29f-4fe8-a347-92df178ae0cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddd1f985d8b64b449c79d55b0cbd6422', 'neutron:revision_number': '15', 'neutron:security_group_ids': '34f3cf7b-94cf-408f-b3dc-ae0b57c009fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12c18a77-b252-4a3e-a181-b42644879446, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=50f7398a-769c-4636-b498-5162fce10f7d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.819 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 50f7398a-769c-4636-b498-5162fce10f7d in datapath 5fff1904-159a-4b76-8c46-feabf17f29ab unbound from our chassis
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.820 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5fff1904-159a-4b76-8c46-feabf17f29ab
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.838 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c4d8a742-8804-4419-bf6c-f9c722d500dd]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:46 compute-1 sshd-session[273964]: Received disconnect from 107.172.146.104 port 55118:11: Bye Bye [preauth]
Sep 30 18:15:46 compute-1 sshd-session[273964]: Disconnected from authenticating user root 107.172.146.104 port 55118 [preauth]
Sep 30 18:15:46 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Deactivated successfully.
Sep 30 18:15:46 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Consumed 4.015s CPU time.
Sep 30 18:15:46 compute-1 systemd-machined[195911]: Machine qemu-5-instance-00000006 terminated.
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.885 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[27bee7f7-9966-464d-a6d6-eac2555255e5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.889 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[ce14a445-a604-474b-b476-06c908ee49b8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.927 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[6a482f66-ed15-414e-acca-825d9423a4da]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.951 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e1697371-599f-4cce-b1f7-d8944961164d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5fff1904-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:07:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 20, 'rx_bytes': 2008, 'tx_bytes': 1032, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 20, 'rx_bytes': 2008, 'tx_bytes': 1032, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1377603, 'reachable_time': 37514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273982, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.975 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[00f9f4f2-49b0-4fcb-b601-ec5ff9d50eef]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377619, 'tstamp': 1377619}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273985, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5fff1904-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1377623, 'tstamp': 1377623}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273985, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.977 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5fff1904-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.978 2 INFO nova.virt.libvirt.driver [-] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Instance destroyed successfully.
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.978 2 DEBUG nova.objects.instance [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lazy-loading 'resources' on Instance uuid 23ad643b-d29f-4fe8-a347-92df178ae0cd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:46 compute-1 nova_compute[238822]: 2025-09-30 18:15:46.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.984 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5fff1904-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.985 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.985 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5fff1904-10, col_values=(('external_ids', {'iface-id': '3a8ea0a0-c179-4516-9404-04b68a17e79e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.985 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:15:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:46.986 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[386851e1-91b3-40cd-a132-a8b5d5dc53dc]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-5fff1904-159a-4b76-8c46-feabf17f29ab\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 5fff1904-159a-4b76-8c46-feabf17f29ab\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.100 2 DEBUG nova.compute.manager [req-dd7a21ea-9b7d-4cae-9c96-13176a90e8f2 req-12b5b750-ca73-4ac8-b0a8-ff007e8c9f58 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Received event network-vif-unplugged-50f7398a-769c-4636-b498-5162fce10f7d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.100 2 DEBUG oslo_concurrency.lockutils [req-dd7a21ea-9b7d-4cae-9c96-13176a90e8f2 req-12b5b750-ca73-4ac8-b0a8-ff007e8c9f58 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "23ad643b-d29f-4fe8-a347-92df178ae0cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.101 2 DEBUG oslo_concurrency.lockutils [req-dd7a21ea-9b7d-4cae-9c96-13176a90e8f2 req-12b5b750-ca73-4ac8-b0a8-ff007e8c9f58 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "23ad643b-d29f-4fe8-a347-92df178ae0cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.101 2 DEBUG oslo_concurrency.lockutils [req-dd7a21ea-9b7d-4cae-9c96-13176a90e8f2 req-12b5b750-ca73-4ac8-b0a8-ff007e8c9f58 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "23ad643b-d29f-4fe8-a347-92df178ae0cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.101 2 DEBUG nova.compute.manager [req-dd7a21ea-9b7d-4cae-9c96-13176a90e8f2 req-12b5b750-ca73-4ac8-b0a8-ff007e8c9f58 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] No waiting events found dispatching network-vif-unplugged-50f7398a-769c-4636-b498-5162fce10f7d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.101 2 DEBUG nova.compute.manager [req-dd7a21ea-9b7d-4cae-9c96-13176a90e8f2 req-12b5b750-ca73-4ac8-b0a8-ff007e8c9f58 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Received event network-vif-unplugged-50f7398a-769c-4636-b498-5162fce10f7d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:15:47 compute-1 ceph-mon[75484]: pgmap v1114: 353 pgs: 353 active+clean; 281 MiB data, 338 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:15:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.486 2 DEBUG nova.virt.libvirt.vif [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:12:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-19459247',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-19459247',id=6,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:13:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-i5u830kg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteActionsViaActuator-837729328-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:15:01Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=23ad643b-d29f-4fe8-a347-92df178ae0cd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50f7398a-769c-4636-b498-5162fce10f7d", "address": "fa:16:3e:d1:86:73", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f7398a-76", "ovs_interfaceid": "50f7398a-769c-4636-b498-5162fce10f7d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.486 2 DEBUG nova.network.os_vif_util [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converting VIF {"id": "50f7398a-769c-4636-b498-5162fce10f7d", "address": "fa:16:3e:d1:86:73", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f7398a-76", "ovs_interfaceid": "50f7398a-769c-4636-b498-5162fce10f7d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:15:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:47.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.487 2 DEBUG nova.network.os_vif_util [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d1:86:73,bridge_name='br-int',has_traffic_filtering=True,id=50f7398a-769c-4636-b498-5162fce10f7d,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f7398a-76') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.487 2 DEBUG os_vif [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:86:73,bridge_name='br-int',has_traffic_filtering=True,id=50f7398a-769c-4636-b498-5162fce10f7d,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f7398a-76') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50f7398a-76, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.494 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=bd7b8687-8b9b-4058-9a1c-e9acc8c03536) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.499 2 INFO os_vif [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:86:73,bridge_name='br-int',has_traffic_filtering=True,id=50f7398a-769c-4636-b498-5162fce10f7d,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f7398a-76')
Sep 30 18:15:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:47.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:47 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.952 2 INFO nova.virt.libvirt.driver [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Deleting instance files /var/lib/nova/instances/23ad643b-d29f-4fe8-a347-92df178ae0cd_del
Sep 30 18:15:47 compute-1 nova_compute[238822]: 2025-09-30 18:15:47.953 2 INFO nova.virt.libvirt.driver [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Deletion of /var/lib/nova/instances/23ad643b-d29f-4fe8-a347-92df178ae0cd_del complete
Sep 30 18:15:48 compute-1 nova_compute[238822]: 2025-09-30 18:15:48.469 2 INFO nova.compute.manager [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Took 1.73 seconds to destroy the instance on the hypervisor.
Sep 30 18:15:48 compute-1 nova_compute[238822]: 2025-09-30 18:15:48.469 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:15:48 compute-1 nova_compute[238822]: 2025-09-30 18:15:48.470 2 DEBUG nova.compute.manager [-] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:15:48 compute-1 nova_compute[238822]: 2025-09-30 18:15:48.470 2 DEBUG nova.network.neutron [-] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:15:48 compute-1 nova_compute[238822]: 2025-09-30 18:15:48.470 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:15:48 compute-1 podman[274020]: 2025-09-30 18:15:48.541538386 +0000 UTC m=+0.073779428 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:15:48 compute-1 podman[274019]: 2025-09-30 18:15:48.57974439 +0000 UTC m=+0.106864853 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:15:48 compute-1 nova_compute[238822]: 2025-09-30 18:15:48.586 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:15:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:15:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:48 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:49 compute-1 nova_compute[238822]: 2025-09-30 18:15:49.183 2 DEBUG nova.compute.manager [req-2de00d21-3cf6-46b9-9de2-c23e94f8563a req-80c3a0b2-b08e-4047-8328-67cb1b52f3d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Received event network-vif-unplugged-50f7398a-769c-4636-b498-5162fce10f7d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:49 compute-1 nova_compute[238822]: 2025-09-30 18:15:49.184 2 DEBUG oslo_concurrency.lockutils [req-2de00d21-3cf6-46b9-9de2-c23e94f8563a req-80c3a0b2-b08e-4047-8328-67cb1b52f3d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "23ad643b-d29f-4fe8-a347-92df178ae0cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:49 compute-1 nova_compute[238822]: 2025-09-30 18:15:49.184 2 DEBUG oslo_concurrency.lockutils [req-2de00d21-3cf6-46b9-9de2-c23e94f8563a req-80c3a0b2-b08e-4047-8328-67cb1b52f3d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "23ad643b-d29f-4fe8-a347-92df178ae0cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:49 compute-1 nova_compute[238822]: 2025-09-30 18:15:49.185 2 DEBUG oslo_concurrency.lockutils [req-2de00d21-3cf6-46b9-9de2-c23e94f8563a req-80c3a0b2-b08e-4047-8328-67cb1b52f3d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "23ad643b-d29f-4fe8-a347-92df178ae0cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:49 compute-1 nova_compute[238822]: 2025-09-30 18:15:49.185 2 DEBUG nova.compute.manager [req-2de00d21-3cf6-46b9-9de2-c23e94f8563a req-80c3a0b2-b08e-4047-8328-67cb1b52f3d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] No waiting events found dispatching network-vif-unplugged-50f7398a-769c-4636-b498-5162fce10f7d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:15:49 compute-1 nova_compute[238822]: 2025-09-30 18:15:49.185 2 DEBUG nova.compute.manager [req-2de00d21-3cf6-46b9-9de2-c23e94f8563a req-80c3a0b2-b08e-4047-8328-67cb1b52f3d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Received event network-vif-unplugged-50f7398a-769c-4636-b498-5162fce10f7d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:15:49 compute-1 nova_compute[238822]: 2025-09-30 18:15:49.185 2 DEBUG nova.compute.manager [req-2de00d21-3cf6-46b9-9de2-c23e94f8563a req-80c3a0b2-b08e-4047-8328-67cb1b52f3d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Received event network-vif-deleted-50f7398a-769c-4636-b498-5162fce10f7d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:49 compute-1 nova_compute[238822]: 2025-09-30 18:15:49.186 2 INFO nova.compute.manager [req-2de00d21-3cf6-46b9-9de2-c23e94f8563a req-80c3a0b2-b08e-4047-8328-67cb1b52f3d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Neutron deleted interface 50f7398a-769c-4636-b498-5162fce10f7d; detaching it from the instance and deleting it from the info cache
Sep 30 18:15:49 compute-1 nova_compute[238822]: 2025-09-30 18:15:49.186 2 DEBUG nova.network.neutron [req-2de00d21-3cf6-46b9-9de2-c23e94f8563a req-80c3a0b2-b08e-4047-8328-67cb1b52f3d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:15:49 compute-1 ceph-mon[75484]: pgmap v1115: 353 pgs: 353 active+clean; 281 MiB data, 338 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:15:49 compute-1 nova_compute[238822]: 2025-09-30 18:15:49.344 2 DEBUG nova.network.neutron [-] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:15:49 compute-1 openstack_network_exporter[251957]: ERROR   18:15:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:15:49 compute-1 openstack_network_exporter[251957]: ERROR   18:15:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:15:49 compute-1 openstack_network_exporter[251957]: ERROR   18:15:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:15:49 compute-1 openstack_network_exporter[251957]: ERROR   18:15:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:15:49 compute-1 openstack_network_exporter[251957]: ERROR   18:15:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:15:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:49.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:49 compute-1 nova_compute[238822]: 2025-09-30 18:15:49.696 2 DEBUG nova.compute.manager [req-2de00d21-3cf6-46b9-9de2-c23e94f8563a req-80c3a0b2-b08e-4047-8328-67cb1b52f3d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Detach interface failed, port_id=50f7398a-769c-4636-b498-5162fce10f7d, reason: Instance 23ad643b-d29f-4fe8-a347-92df178ae0cd could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:15:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:49 compute-1 nova_compute[238822]: 2025-09-30 18:15:49.851 2 INFO nova.compute.manager [-] [instance: 23ad643b-d29f-4fe8-a347-92df178ae0cd] Took 1.38 seconds to deallocate network for instance.
Sep 30 18:15:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:49.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:49 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:50 compute-1 ceph-mon[75484]: pgmap v1116: 353 pgs: 353 active+clean; 202 MiB data, 319 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.4 KiB/s wr, 55 op/s
Sep 30 18:15:50 compute-1 nova_compute[238822]: 2025-09-30 18:15:50.385 2 DEBUG oslo_concurrency.lockutils [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:50 compute-1 nova_compute[238822]: 2025-09-30 18:15:50.386 2 DEBUG oslo_concurrency.lockutils [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:50 compute-1 nova_compute[238822]: 2025-09-30 18:15:50.484 2 DEBUG oslo_concurrency.processutils [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:15:50 compute-1 sudo[274072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:15:50 compute-1 sudo[274072]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:15:50 compute-1 sudo[274072]: pam_unix(sudo:session): session closed for user root
Sep 30 18:15:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:50 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:50 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:15:50 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1071548273' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:51 compute-1 nova_compute[238822]: 2025-09-30 18:15:51.019 2 DEBUG oslo_concurrency.processutils [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.535s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:15:51 compute-1 nova_compute[238822]: 2025-09-30 18:15:51.029 2 DEBUG nova.compute.provider_tree [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:15:51 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1071548273' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:15:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:15:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:51.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:15:51 compute-1 nova_compute[238822]: 2025-09-30 18:15:51.541 2 DEBUG nova.scheduler.client.report [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:15:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:51.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:51 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:51 compute-1 nova_compute[238822]: 2025-09-30 18:15:51.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:52 compute-1 nova_compute[238822]: 2025-09-30 18:15:52.058 2 DEBUG oslo_concurrency.lockutils [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.672s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:52 compute-1 nova_compute[238822]: 2025-09-30 18:15:52.087 2 INFO nova.scheduler.client.report [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Deleted allocations for instance 23ad643b-d29f-4fe8-a347-92df178ae0cd
Sep 30 18:15:52 compute-1 ceph-mon[75484]: pgmap v1117: 353 pgs: 353 active+clean; 202 MiB data, 319 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:15:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:15:52 compute-1 nova_compute[238822]: 2025-09-30 18:15:52.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:52 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:53 compute-1 nova_compute[238822]: 2025-09-30 18:15:53.132 2 DEBUG oslo_concurrency.lockutils [None req-5e2efc90-11ed-4568-9995-ea0da3d6843b dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "23ad643b-d29f-4fe8-a347-92df178ae0cd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.926s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:53.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:15:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:53.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:53 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:54.363 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:54.363 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:54.364 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:54 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:55 compute-1 ceph-mon[75484]: pgmap v1118: 353 pgs: 353 active+clean; 202 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:15:55 compute-1 nova_compute[238822]: 2025-09-30 18:15:55.495 2 DEBUG oslo_concurrency.lockutils [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "dadc55d4-1578-4dc1-880a-08098fba63ea" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:55.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:55 compute-1 nova_compute[238822]: 2025-09-30 18:15:55.496 2 DEBUG oslo_concurrency.lockutils [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:55 compute-1 nova_compute[238822]: 2025-09-30 18:15:55.497 2 DEBUG oslo_concurrency.lockutils [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:55 compute-1 nova_compute[238822]: 2025-09-30 18:15:55.497 2 DEBUG oslo_concurrency.lockutils [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:55 compute-1 nova_compute[238822]: 2025-09-30 18:15:55.498 2 DEBUG oslo_concurrency.lockutils [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:55 compute-1 nova_compute[238822]: 2025-09-30 18:15:55.513 2 INFO nova.compute.manager [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Terminating instance
Sep 30 18:15:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:55.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:55 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.031 2 DEBUG nova.compute.manager [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:15:56 compute-1 kernel: tap93146cef-46 (unregistering): left promiscuous mode
Sep 30 18:15:56 compute-1 NetworkManager[45549]: <info>  [1759256156.1296] device (tap93146cef-46): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:56 compute-1 ovn_controller[135204]: 2025-09-30T18:15:56Z|00085|binding|INFO|Releasing lport 93146cef-46ad-4383-892d-3ec355af507c from this chassis (sb_readonly=0)
Sep 30 18:15:56 compute-1 ovn_controller[135204]: 2025-09-30T18:15:56Z|00086|binding|INFO|Setting lport 93146cef-46ad-4383-892d-3ec355af507c down in Southbound
Sep 30 18:15:56 compute-1 ovn_controller[135204]: 2025-09-30T18:15:56Z|00087|binding|INFO|Removing iface tap93146cef-46 ovn-installed in OVS
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.157 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:61:22 10.100.0.8'], port_security=['fa:16:3e:0a:61:22 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'dadc55d4-1578-4dc1-880a-08098fba63ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fff1904-159a-4b76-8c46-feabf17f29ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddd1f985d8b64b449c79d55b0cbd6422', 'neutron:revision_number': '5', 'neutron:security_group_ids': '34f3cf7b-94cf-408f-b3dc-ae0b57c009fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12c18a77-b252-4a3e-a181-b42644879446, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=93146cef-46ad-4383-892d-3ec355af507c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.159 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 93146cef-46ad-4383-892d-3ec355af507c in datapath 5fff1904-159a-4b76-8c46-feabf17f29ab unbound from our chassis
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.160 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5fff1904-159a-4b76-8c46-feabf17f29ab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.162 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[473fe298-310a-4269-816a-70894f187dfe]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.162 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab namespace which is not needed anymore
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:56 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Deactivated successfully.
Sep 30 18:15:56 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Consumed 25.812s CPU time.
Sep 30 18:15:56 compute-1 systemd-machined[195911]: Machine qemu-2-instance-00000005 terminated.
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.274 2 INFO nova.virt.libvirt.driver [-] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Instance destroyed successfully.
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.275 2 DEBUG nova.objects.instance [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lazy-loading 'resources' on Instance uuid dadc55d4-1578-4dc1-880a-08098fba63ea obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:15:56 compute-1 neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab[270716]: [NOTICE]   (270720) : haproxy version is 3.0.5-8e879a5
Sep 30 18:15:56 compute-1 neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab[270716]: [NOTICE]   (270720) : path to executable is /usr/sbin/haproxy
Sep 30 18:15:56 compute-1 neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab[270716]: [WARNING]  (270720) : Exiting Master process...
Sep 30 18:15:56 compute-1 podman[274160]: 2025-09-30 18:15:56.320875466 +0000 UTC m=+0.029851179 container kill 73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Sep 30 18:15:56 compute-1 neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab[270716]: [ALERT]    (270720) : Current worker (270722) exited with code 143 (Terminated)
Sep 30 18:15:56 compute-1 neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab[270716]: [WARNING]  (270720) : All workers exited. Exiting... (0)
Sep 30 18:15:56 compute-1 systemd[1]: libpod-73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e.scope: Deactivated successfully.
Sep 30 18:15:56 compute-1 podman[274176]: 2025-09-30 18:15:56.375377001 +0000 UTC m=+0.034945077 container died 73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930)
Sep 30 18:15:56 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e-userdata-shm.mount: Deactivated successfully.
Sep 30 18:15:56 compute-1 systemd[1]: var-lib-containers-storage-overlay-ec3484ab28fd4f27b738728c98c6e32496c00bfab60b199ced2df9ecb614c481-merged.mount: Deactivated successfully.
Sep 30 18:15:56 compute-1 podman[274176]: 2025-09-30 18:15:56.476742874 +0000 UTC m=+0.136310860 container cleanup 73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:15:56 compute-1 systemd[1]: libpod-conmon-73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e.scope: Deactivated successfully.
Sep 30 18:15:56 compute-1 podman[274184]: 2025-09-30 18:15:56.505163253 +0000 UTC m=+0.140854613 container remove 73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.510 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[237b7518-d729-4590-89b9-dffcf75399d6]: (4, ("Tue Sep 30 06:15:56 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab (73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e)\n73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e\nTue Sep 30 06:15:56 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab (73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e)\n73d47128e30f4a73321db1a2a638f57be9871c0dc2c6d01220babf294b45049e\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.511 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[9d9404e7-28dc-4eb1-9084-8ba59463620a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.512 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5fff1904-159a-4b76-8c46-feabf17f29ab.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.512 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[63cc32d1-467c-4d07-a266-3947db695a84]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.513 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5fff1904-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:56 compute-1 kernel: tap5fff1904-10: left promiscuous mode
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.539 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0e2ed0ec-a34a-48b6-ab84-6845f0ff6d48]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.571 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6233563c-0bef-499f-a4cb-716eddf9fe12]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.572 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2e3d5cb9-2549-49dd-baab-6637a8bd145c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.590 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c7dc5184-90dd-4d82-8ec5-530b9d06c655]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1377594, 'reachable_time': 29319, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274210, 'error': None, 'target': 'ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:56 compute-1 systemd[1]: run-netns-ovnmeta\x2d5fff1904\x2d159a\x2d4b76\x2d8c46\x2dfeabf17f29ab.mount: Deactivated successfully.
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.597 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5fff1904-159a-4b76-8c46-feabf17f29ab deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:15:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:15:56.597 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[238af974-edfc-4d2f-a643-0848140145e8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:15:56 compute-1 podman[274208]: 2025-09-30 18:15:56.647672929 +0000 UTC m=+0.062091361 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 18:15:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:56 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.782 2 DEBUG nova.virt.libvirt.vif [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:11:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1080774082',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1080774082',id=5,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:12:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ddd1f985d8b64b449c79d55b0cbd6422',ramdisk_id='',reservation_id='r-3aq7na6m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-837729328',owner_user_name='tempest-TestExecuteActionsViaActuator-837729328-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:12:16Z,user_data=None,user_id='dc3bb71c425f484fbc46f90978029403',uuid=dadc55d4-1578-4dc1-880a-08098fba63ea,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "93146cef-46ad-4383-892d-3ec355af507c", "address": "fa:16:3e:0a:61:22", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93146cef-46", "ovs_interfaceid": "93146cef-46ad-4383-892d-3ec355af507c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.783 2 DEBUG nova.network.os_vif_util [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converting VIF {"id": "93146cef-46ad-4383-892d-3ec355af507c", "address": "fa:16:3e:0a:61:22", "network": {"id": "5fff1904-159a-4b76-8c46-feabf17f29ab", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-50673167-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "250d452565a2459c8481b499c0227183", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93146cef-46", "ovs_interfaceid": "93146cef-46ad-4383-892d-3ec355af507c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.784 2 DEBUG nova.network.os_vif_util [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:61:22,bridge_name='br-int',has_traffic_filtering=True,id=93146cef-46ad-4383-892d-3ec355af507c,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93146cef-46') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.784 2 DEBUG os_vif [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:61:22,bridge_name='br-int',has_traffic_filtering=True,id=93146cef-46ad-4383-892d-3ec355af507c,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93146cef-46') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.788 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93146cef-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.814 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=43ccb678-8a03-4af0-bc81-7f1c99284116) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:56 compute-1 nova_compute[238822]: 2025-09-30 18:15:56.819 2 INFO os_vif [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:61:22,bridge_name='br-int',has_traffic_filtering=True,id=93146cef-46ad-4383-892d-3ec355af507c,network=Network(5fff1904-159a-4b76-8c46-feabf17f29ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93146cef-46')
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.117 2 DEBUG nova.compute.manager [req-4eaae897-f340-4eb6-9356-c809d62318c9 req-e5461ca7-1c54-4f04-aa42-4edce1e52081 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Received event network-vif-unplugged-93146cef-46ad-4383-892d-3ec355af507c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.119 2 DEBUG oslo_concurrency.lockutils [req-4eaae897-f340-4eb6-9356-c809d62318c9 req-e5461ca7-1c54-4f04-aa42-4edce1e52081 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.119 2 DEBUG oslo_concurrency.lockutils [req-4eaae897-f340-4eb6-9356-c809d62318c9 req-e5461ca7-1c54-4f04-aa42-4edce1e52081 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.120 2 DEBUG oslo_concurrency.lockutils [req-4eaae897-f340-4eb6-9356-c809d62318c9 req-e5461ca7-1c54-4f04-aa42-4edce1e52081 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.120 2 DEBUG nova.compute.manager [req-4eaae897-f340-4eb6-9356-c809d62318c9 req-e5461ca7-1c54-4f04-aa42-4edce1e52081 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] No waiting events found dispatching network-vif-unplugged-93146cef-46ad-4383-892d-3ec355af507c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.120 2 DEBUG nova.compute.manager [req-4eaae897-f340-4eb6-9356-c809d62318c9 req-e5461ca7-1c54-4f04-aa42-4edce1e52081 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Received event network-vif-unplugged-93146cef-46ad-4383-892d-3ec355af507c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:15:57 compute-1 ceph-mon[75484]: pgmap v1119: 353 pgs: 353 active+clean; 202 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.266 2 INFO nova.virt.libvirt.driver [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Deleting instance files /var/lib/nova/instances/dadc55d4-1578-4dc1-880a-08098fba63ea_del
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.267 2 INFO nova.virt.libvirt.driver [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Deletion of /var/lib/nova/instances/dadc55d4-1578-4dc1-880a-08098fba63ea_del complete
Sep 30 18:15:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:57.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.782 2 INFO nova.compute.manager [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Took 1.75 seconds to destroy the instance on the hypervisor.
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.782 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.783 2 DEBUG nova.compute.manager [-] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.783 2 DEBUG nova.network.neutron [-] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.783 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:15:57 compute-1 nova_compute[238822]: 2025-09-30 18:15:57.855 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:15:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:15:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:57.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:15:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:57 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2568044108' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:15:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2568044108' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:15:58 compute-1 nova_compute[238822]: 2025-09-30 18:15:58.603 2 DEBUG nova.network.neutron [-] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:15:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:15:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:58 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:15:59 compute-1 nova_compute[238822]: 2025-09-30 18:15:59.111 2 INFO nova.compute.manager [-] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Took 1.33 seconds to deallocate network for instance.
Sep 30 18:15:59 compute-1 nova_compute[238822]: 2025-09-30 18:15:59.173 2 DEBUG nova.compute.manager [req-15ee486a-a760-4027-a319-c180320c5141 req-d2842040-8266-4cf8-ba2f-635e68041b0e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Received event network-vif-unplugged-93146cef-46ad-4383-892d-3ec355af507c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:59 compute-1 nova_compute[238822]: 2025-09-30 18:15:59.174 2 DEBUG oslo_concurrency.lockutils [req-15ee486a-a760-4027-a319-c180320c5141 req-d2842040-8266-4cf8-ba2f-635e68041b0e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:59 compute-1 nova_compute[238822]: 2025-09-30 18:15:59.174 2 DEBUG oslo_concurrency.lockutils [req-15ee486a-a760-4027-a319-c180320c5141 req-d2842040-8266-4cf8-ba2f-635e68041b0e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:59 compute-1 nova_compute[238822]: 2025-09-30 18:15:59.175 2 DEBUG oslo_concurrency.lockutils [req-15ee486a-a760-4027-a319-c180320c5141 req-d2842040-8266-4cf8-ba2f-635e68041b0e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:15:59 compute-1 nova_compute[238822]: 2025-09-30 18:15:59.175 2 DEBUG nova.compute.manager [req-15ee486a-a760-4027-a319-c180320c5141 req-d2842040-8266-4cf8-ba2f-635e68041b0e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] No waiting events found dispatching network-vif-unplugged-93146cef-46ad-4383-892d-3ec355af507c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:15:59 compute-1 nova_compute[238822]: 2025-09-30 18:15:59.175 2 DEBUG nova.compute.manager [req-15ee486a-a760-4027-a319-c180320c5141 req-d2842040-8266-4cf8-ba2f-635e68041b0e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Received event network-vif-unplugged-93146cef-46ad-4383-892d-3ec355af507c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:15:59 compute-1 nova_compute[238822]: 2025-09-30 18:15:59.176 2 DEBUG nova.compute.manager [req-15ee486a-a760-4027-a319-c180320c5141 req-d2842040-8266-4cf8-ba2f-635e68041b0e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: dadc55d4-1578-4dc1-880a-08098fba63ea] Received event network-vif-deleted-93146cef-46ad-4383-892d-3ec355af507c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:15:59 compute-1 ceph-mon[75484]: pgmap v1120: 353 pgs: 353 active+clean; 202 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:15:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:15:59.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:59 compute-1 nova_compute[238822]: 2025-09-30 18:15:59.635 2 DEBUG oslo_concurrency.lockutils [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:15:59 compute-1 nova_compute[238822]: 2025-09-30 18:15:59.636 2 DEBUG oslo_concurrency.lockutils [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:15:59 compute-1 nova_compute[238822]: 2025-09-30 18:15:59.678 2 DEBUG oslo_concurrency.processutils [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:15:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:15:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:15:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:15:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:15:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:15:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:15:59.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:15:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:15:59 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:00 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:16:00 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3441962140' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:16:00 compute-1 nova_compute[238822]: 2025-09-30 18:16:00.215 2 DEBUG oslo_concurrency.processutils [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.537s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:16:00 compute-1 nova_compute[238822]: 2025-09-30 18:16:00.222 2 DEBUG nova.compute.provider_tree [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:16:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3441962140' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:16:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:00 compute-1 nova_compute[238822]: 2025-09-30 18:16:00.734 2 DEBUG nova.scheduler.client.report [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:16:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:00 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:01 compute-1 nova_compute[238822]: 2025-09-30 18:16:01.246 2 DEBUG oslo_concurrency.lockutils [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.610s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:16:01 compute-1 nova_compute[238822]: 2025-09-30 18:16:01.270 2 INFO nova.scheduler.client.report [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Deleted allocations for instance dadc55d4-1578-4dc1-880a-08098fba63ea
Sep 30 18:16:01 compute-1 ceph-mon[75484]: pgmap v1121: 353 pgs: 353 active+clean; 123 MiB data, 233 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:16:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:01.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:01 compute-1 sshd-session[274274]: Invalid user admin from 192.210.160.141 port 55876
Sep 30 18:16:01 compute-1 sshd-session[274274]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:16:01 compute-1 sshd-session[274274]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:16:01 compute-1 nova_compute[238822]: 2025-09-30 18:16:01.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:01 compute-1 podman[274277]: 2025-09-30 18:16:01.838641027 +0000 UTC m=+0.063967082 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Sep 30 18:16:01 compute-1 podman[274278]: 2025-09-30 18:16:01.873839049 +0000 UTC m=+0.081394523 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the 
minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, managed_by=edpm_ansible, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7)
Sep 30 18:16:01 compute-1 podman[274284]: 2025-09-30 18:16:01.897696335 +0000 UTC m=+0.108893288 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:16:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:01.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:01 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:02 compute-1 nova_compute[238822]: 2025-09-30 18:16:02.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:02 compute-1 ceph-mon[75484]: pgmap v1122: 353 pgs: 353 active+clean; 123 MiB data, 233 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:16:02 compute-1 nova_compute[238822]: 2025-09-30 18:16:02.311 2 DEBUG oslo_concurrency.lockutils [None req-495b4ced-e603-4144-8ebe-6f1e8e4735b9 dc3bb71c425f484fbc46f90978029403 ddd1f985d8b64b449c79d55b0cbd6422 - - default default] Lock "dadc55d4-1578-4dc1-880a-08098fba63ea" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.815s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:16:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:02 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:03.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:16:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:03.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:03 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:03 compute-1 sshd-session[274274]: Failed password for invalid user admin from 192.210.160.141 port 55876 ssh2
Sep 30 18:16:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:04 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:04 compute-1 sshd-session[274274]: Connection closed by invalid user admin 192.210.160.141 port 55876 [preauth]
Sep 30 18:16:05 compute-1 ceph-mon[75484]: pgmap v1123: 353 pgs: 353 active+clean; 123 MiB data, 233 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:16:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:05.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:05 compute-1 podman[249638]: time="2025-09-30T18:16:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:16:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:16:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38442 "" "Go-http-client/1.1"
Sep 30 18:16:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:16:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8814 "" "Go-http-client/1.1"
Sep 30 18:16:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:05.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:05 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:06 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:06 compute-1 nova_compute[238822]: 2025-09-30 18:16:06.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:07 compute-1 nova_compute[238822]: 2025-09-30 18:16:07.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:07 compute-1 ceph-mon[75484]: pgmap v1124: 353 pgs: 353 active+clean; 123 MiB data, 233 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:16:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:07.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:07.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:07 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:16:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:16:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:08 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:09 compute-1 ceph-mon[75484]: pgmap v1125: 353 pgs: 353 active+clean; 123 MiB data, 233 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:16:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2315580826' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:16:09 compute-1 sudo[274348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:16:09 compute-1 sudo[274348]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:16:09 compute-1 sudo[274348]: pam_unix(sudo:session): session closed for user root
Sep 30 18:16:09 compute-1 sudo[274373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:16:09 compute-1 sudo[274373]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:16:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:09.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:09.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:09 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:10 compute-1 sudo[274373]: pam_unix(sudo:session): session closed for user root
Sep 30 18:16:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:16:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:16:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:16:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:16:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:16:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:16:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:16:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:10 compute-1 sudo[274431]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:16:10 compute-1 sudo[274431]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:16:10 compute-1 sudo[274431]: pam_unix(sudo:session): session closed for user root
Sep 30 18:16:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:10 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:11 compute-1 ceph-mon[75484]: pgmap v1126: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:16:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:11.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:11 compute-1 nova_compute[238822]: 2025-09-30 18:16:11.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:11.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:11 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:12 compute-1 nova_compute[238822]: 2025-09-30 18:16:12.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:12 compute-1 ceph-mon[75484]: pgmap v1127: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:16:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:12 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:13 compute-1 sshd-session[274458]: Invalid user web from 216.10.242.161 port 54320
Sep 30 18:16:13 compute-1 sshd-session[274458]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:16:13 compute-1 sshd-session[274458]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:16:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:13.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:16:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:13.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:13 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:14 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:15 compute-1 ceph-mon[75484]: pgmap v1128: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:16:15 compute-1 sshd-session[274458]: Failed password for invalid user web from 216.10.242.161 port 54320 ssh2
Sep 30 18:16:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:15.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:15 compute-1 sudo[274463]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:16:15 compute-1 sudo[274463]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:16:15 compute-1 sudo[274463]: pam_unix(sudo:session): session closed for user root
Sep 30 18:16:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:15.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:15 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:16:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:16:16 compute-1 ceph-mon[75484]: pgmap v1129: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:16:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:16 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:16 compute-1 nova_compute[238822]: 2025-09-30 18:16:16.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:17 compute-1 nova_compute[238822]: 2025-09-30 18:16:17.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:17 compute-1 sshd-session[274458]: Received disconnect from 216.10.242.161 port 54320:11: Bye Bye [preauth]
Sep 30 18:16:17 compute-1 sshd-session[274458]: Disconnected from invalid user web 216.10.242.161 port 54320 [preauth]
Sep 30 18:16:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:17.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:17 compute-1 nova_compute[238822]: 2025-09-30 18:16:17.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:17.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:17 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:16:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:18 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0049c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:19 compute-1 ceph-mon[75484]: pgmap v1130: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:16:19 compute-1 openstack_network_exporter[251957]: ERROR   18:16:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:16:19 compute-1 openstack_network_exporter[251957]: ERROR   18:16:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:16:19 compute-1 openstack_network_exporter[251957]: ERROR   18:16:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:16:19 compute-1 openstack_network_exporter[251957]: ERROR   18:16:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:16:19 compute-1 openstack_network_exporter[251957]: ERROR   18:16:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:16:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:19.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:19 compute-1 podman[274494]: 2025-09-30 18:16:19.562865416 +0000 UTC m=+0.097520678 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:16:19 compute-1 podman[274493]: 2025-09-30 18:16:19.607788272 +0000 UTC m=+0.138594550 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Sep 30 18:16:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:19.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:19 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:20 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:21 compute-1 ceph-mon[75484]: pgmap v1131: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:16:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:21.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:21 compute-1 nova_compute[238822]: 2025-09-30 18:16:21.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:21.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:21 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:22 compute-1 nova_compute[238822]: 2025-09-30 18:16:22.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:22 compute-1 nova_compute[238822]: 2025-09-30 18:16:22.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:22.264 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:16:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:22.265 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:16:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:22 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0049c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:23 compute-1 ceph-mon[75484]: pgmap v1132: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:16:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:16:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:23.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:16:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:23 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:23.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:24 compute-1 nova_compute[238822]: 2025-09-30 18:16:24.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:16:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:24 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:25 compute-1 nova_compute[238822]: 2025-09-30 18:16:25.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:16:25 compute-1 ceph-mon[75484]: pgmap v1133: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:16:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:25.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:25 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:25.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:26 compute-1 nova_compute[238822]: 2025-09-30 18:16:26.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:16:26 compute-1 nova_compute[238822]: 2025-09-30 18:16:26.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:16:26 compute-1 unix_chkpwd[274553]: password check failed for user (root)
Sep 30 18:16:26 compute-1 sshd-session[274550]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:16:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:26 compute-1 sshd-session[274489]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:16:26 compute-1 sshd-session[274489]: banner exchange: Connection from 110.42.70.108 port 44356: Connection timed out
Sep 30 18:16:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:26 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0049c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:26 compute-1 nova_compute[238822]: 2025-09-30 18:16:26.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:27 compute-1 nova_compute[238822]: 2025-09-30 18:16:27.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:27.266 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:16:27 compute-1 ceph-mon[75484]: pgmap v1134: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:16:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:27.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:27 compute-1 podman[274555]: 2025-09-30 18:16:27.551806949 +0000 UTC m=+0.091467236 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true)
Sep 30 18:16:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:27.904 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:56:60 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd077ee2-d26f-4989-8ea7-4aecbac7c636', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd66b07a980744cd29ee547eb08500706', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=afc38829-13e1-4bde-91a7-790387f17ce5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=44f8f232-e480-4bd7-ad4d-10a4684c061b) old=Port_Binding(mac=['fa:16:3e:9f:56:60'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd077ee2-d26f-4989-8ea7-4aecbac7c636', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd66b07a980744cd29ee547eb08500706', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:16:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:27.905 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 44f8f232-e480-4bd7-ad4d-10a4684c061b in datapath cd077ee2-d26f-4989-8ea7-4aecbac7c636 updated
Sep 30 18:16:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:27.906 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cd077ee2-d26f-4989-8ea7-4aecbac7c636, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:16:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:27.909 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7e9f3b7c-ee66-4f91-9b5b-97232fa20d06]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:16:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:27 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0049c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:27.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:28 compute-1 sshd-session[274550]: Failed password for root from 192.210.160.141 port 60040 ssh2
Sep 30 18:16:28 compute-1 ceph-mon[75484]: pgmap v1135: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:16:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:16:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:29 compute-1 nova_compute[238822]: 2025-09-30 18:16:29.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:16:29 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4200513826' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:16:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:29.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:29 compute-1 nova_compute[238822]: 2025-09-30 18:16:29.577 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:16:29 compute-1 nova_compute[238822]: 2025-09-30 18:16:29.579 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:16:29 compute-1 nova_compute[238822]: 2025-09-30 18:16:29.579 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:16:29 compute-1 nova_compute[238822]: 2025-09-30 18:16:29.579 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:16:29 compute-1 nova_compute[238822]: 2025-09-30 18:16:29.580 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:16:29 compute-1 sshd-session[274550]: Connection closed by authenticating user root 192.210.160.141 port 60040 [preauth]
Sep 30 18:16:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:29 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:29.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:30 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:16:30 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3412734945' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:16:30 compute-1 nova_compute[238822]: 2025-09-30 18:16:30.057 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:16:30 compute-1 nova_compute[238822]: 2025-09-30 18:16:30.307 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:16:30 compute-1 nova_compute[238822]: 2025-09-30 18:16:30.309 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:16:30 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3412734945' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:16:30 compute-1 ceph-mon[75484]: pgmap v1136: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:16:30 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/261227712' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:16:30 compute-1 nova_compute[238822]: 2025-09-30 18:16:30.364 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:16:30 compute-1 nova_compute[238822]: 2025-09-30 18:16:30.365 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4735MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:16:30 compute-1 nova_compute[238822]: 2025-09-30 18:16:30.365 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:16:30 compute-1 nova_compute[238822]: 2025-09-30 18:16:30.365 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:16:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:30 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:30 compute-1 sudo[274605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:16:30 compute-1 sudo[274605]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:16:30 compute-1 sudo[274605]: pam_unix(sudo:session): session closed for user root
Sep 30 18:16:31 compute-1 unix_chkpwd[274630]: password check failed for user (root)
Sep 30 18:16:31 compute-1 sshd-session[274599]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110  user=root
Sep 30 18:16:31 compute-1 nova_compute[238822]: 2025-09-30 18:16:31.417 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:16:31 compute-1 nova_compute[238822]: 2025-09-30 18:16:31.417 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:16:30 up  3:53,  0 user,  load average: 0.40, 0.84, 1.12\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:16:31 compute-1 nova_compute[238822]: 2025-09-30 18:16:31.431 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:16:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:31.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:31 compute-1 nova_compute[238822]: 2025-09-30 18:16:31.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:31 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:16:31 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/919231995' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:16:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:31 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0049c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:31.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:31 compute-1 nova_compute[238822]: 2025-09-30 18:16:31.982 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.551s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:16:31 compute-1 nova_compute[238822]: 2025-09-30 18:16:31.991 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:16:32 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/919231995' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:16:32 compute-1 nova_compute[238822]: 2025-09-30 18:16:32.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:32 compute-1 nova_compute[238822]: 2025-09-30 18:16:32.502 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:16:32 compute-1 podman[274655]: 2025-09-30 18:16:32.581342089 +0000 UTC m=+0.113541374 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64)
Sep 30 18:16:32 compute-1 podman[274654]: 2025-09-30 18:16:32.584282488 +0000 UTC m=+0.117224273 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:16:32 compute-1 podman[274656]: 2025-09-30 18:16:32.603513249 +0000 UTC m=+0.128716624 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Sep 30 18:16:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:32 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:33 compute-1 nova_compute[238822]: 2025-09-30 18:16:33.016 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:16:33 compute-1 nova_compute[238822]: 2025-09-30 18:16:33.016 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.651s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:16:33 compute-1 ceph-mon[75484]: pgmap v1137: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:16:33 compute-1 sshd-session[274599]: Failed password for root from 14.225.167.110 port 54550 ssh2
Sep 30 18:16:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:33.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:16:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:33 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:33.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:34 compute-1 sshd-session[274599]: Received disconnect from 14.225.167.110 port 54550:11: Bye Bye [preauth]
Sep 30 18:16:34 compute-1 sshd-session[274599]: Disconnected from authenticating user root 14.225.167.110 port 54550 [preauth]
Sep 30 18:16:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:34 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:35.075 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:1f:d0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d4d535c5-48f8-49f7-a49c-3e4867725c0f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4d535c5-48f8-49f7-a49c-3e4867725c0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eddde596e2d64cec889cb4c4d3642bc5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b05dbbe6-2588-4c99-ab02-108967f64b97, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fb03bb2d-775c-44cf-a24f-e3e4a7cac8fd) old=Port_Binding(mac=['fa:16:3e:25:1f:d0'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d4d535c5-48f8-49f7-a49c-3e4867725c0f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4d535c5-48f8-49f7-a49c-3e4867725c0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eddde596e2d64cec889cb4c4d3642bc5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:16:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:35.076 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fb03bb2d-775c-44cf-a24f-e3e4a7cac8fd in datapath d4d535c5-48f8-49f7-a49c-3e4867725c0f updated
Sep 30 18:16:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:35.078 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d4d535c5-48f8-49f7-a49c-3e4867725c0f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:16:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:35.079 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7fb8aa19-7eb4-4f14-a588-7802090c2a79]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:16:35 compute-1 ceph-mon[75484]: pgmap v1138: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:16:35 compute-1 sshd-session[274713]: Invalid user fermin from 84.51.43.58 port 55409
Sep 30 18:16:35 compute-1 sshd-session[274713]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:16:35 compute-1 sshd-session[274713]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:16:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:35.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:35 compute-1 podman[249638]: time="2025-09-30T18:16:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:16:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:16:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38442 "" "Go-http-client/1.1"
Sep 30 18:16:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:16:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8831 "" "Go-http-client/1.1"
Sep 30 18:16:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:35 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0049c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:35.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Sep 30 18:16:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1462816782' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:16:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Sep 30 18:16:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1462816782' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:16:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:36 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4008f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:36 compute-1 nova_compute[238822]: 2025-09-30 18:16:36.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:37 compute-1 nova_compute[238822]: 2025-09-30 18:16:37.016 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:16:37 compute-1 nova_compute[238822]: 2025-09-30 18:16:37.017 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:16:37 compute-1 nova_compute[238822]: 2025-09-30 18:16:37.017 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:16:37 compute-1 nova_compute[238822]: 2025-09-30 18:16:37.017 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:16:37 compute-1 nova_compute[238822]: 2025-09-30 18:16:37.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:37 compute-1 sshd-session[274713]: Failed password for invalid user fermin from 84.51.43.58 port 55409 ssh2
Sep 30 18:16:37 compute-1 ceph-mon[75484]: pgmap v1139: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:16:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1462816782' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:16:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1462816782' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:16:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:37.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:37 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:37.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:16:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:16:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:38 compute-1 sshd-session[274713]: Received disconnect from 84.51.43.58 port 55409:11: Bye Bye [preauth]
Sep 30 18:16:38 compute-1 sshd-session[274713]: Disconnected from invalid user fermin 84.51.43.58 port 55409 [preauth]
Sep 30 18:16:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:38 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:39 compute-1 ceph-mon[75484]: pgmap v1140: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:16:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:39.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:39 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0049c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:39.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:40 compute-1 sshd-session[274722]: Invalid user admin from 107.172.146.104 port 60332
Sep 30 18:16:40 compute-1 sshd-session[274722]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:16:40 compute-1 sshd-session[274722]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 18:16:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400ae10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:41 compute-1 sshd-session[274603]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:16:41 compute-1 sshd-session[274603]: banner exchange: Connection from 113.249.93.94 port 25502: Connection timed out
Sep 30 18:16:41 compute-1 ceph-mon[75484]: pgmap v1141: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:16:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:41.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:41 compute-1 nova_compute[238822]: 2025-09-30 18:16:41.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:41 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:41.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:42 compute-1 nova_compute[238822]: 2025-09-30 18:16:42.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:42 compute-1 sshd-session[274722]: Failed password for invalid user admin from 107.172.146.104 port 60332 ssh2
Sep 30 18:16:42 compute-1 ceph-mon[75484]: pgmap v1142: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.357070) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256202357135, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2358, "num_deletes": 252, "total_data_size": 5596279, "memory_usage": 5676016, "flush_reason": "Manual Compaction"}
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256202381428, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 3640207, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31321, "largest_seqno": 33674, "table_properties": {"data_size": 3630944, "index_size": 5693, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19917, "raw_average_key_size": 20, "raw_value_size": 3612101, "raw_average_value_size": 3704, "num_data_blocks": 248, "num_entries": 975, "num_filter_entries": 975, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759256000, "oldest_key_time": 1759256000, "file_creation_time": 1759256202, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 24417 microseconds, and 15409 cpu microseconds.
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.381494) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 3640207 bytes OK
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.381522) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.383427) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.383448) EVENT_LOG_v1 {"time_micros": 1759256202383441, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.383473) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 5585833, prev total WAL file size 5585833, number of live WAL files 2.
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.385816) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3554KB)], [60(10MB)]
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256202385846, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 14657230, "oldest_snapshot_seqno": -1}
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 5996 keys, 12605969 bytes, temperature: kUnknown
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256202443161, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 12605969, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12566200, "index_size": 23659, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15045, "raw_key_size": 153167, "raw_average_key_size": 25, "raw_value_size": 12458319, "raw_average_value_size": 2077, "num_data_blocks": 956, "num_entries": 5996, "num_filter_entries": 5996, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759256202, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.443590) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 12605969 bytes
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.445425) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 255.1 rd, 219.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 10.5 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(7.5) write-amplify(3.5) OK, records in: 6518, records dropped: 522 output_compression: NoCompression
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.445458) EVENT_LOG_v1 {"time_micros": 1759256202445442, "job": 36, "event": "compaction_finished", "compaction_time_micros": 57451, "compaction_time_cpu_micros": 29824, "output_level": 6, "num_output_files": 1, "total_output_size": 12605969, "num_input_records": 6518, "num_output_records": 5996, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256202446771, "job": 36, "event": "table_file_deletion", "file_number": 62}
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256202450498, "job": 36, "event": "table_file_deletion", "file_number": 60}
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.385682) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.450579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.450589) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.450592) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.450594) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:16:42 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:16:42.450597) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:16:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:42 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:43 compute-1 sshd-session[274722]: Received disconnect from 107.172.146.104 port 60332:11: Bye Bye [preauth]
Sep 30 18:16:43 compute-1 sshd-session[274722]: Disconnected from invalid user admin 107.172.146.104 port 60332 [preauth]
Sep 30 18:16:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:43.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:16:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:43 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0049c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:43.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:44 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400ae10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:44 compute-1 unix_chkpwd[274731]: password check failed for user (root)
Sep 30 18:16:44 compute-1 sshd-session[274727]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65  user=root
Sep 30 18:16:45 compute-1 ceph-mon[75484]: pgmap v1143: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:16:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:45.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:45 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980048e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:46.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:46 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:46 compute-1 nova_compute[238822]: 2025-09-30 18:16:46.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:47 compute-1 nova_compute[238822]: 2025-09-30 18:16:47.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:47 compute-1 ceph-mon[75484]: pgmap v1144: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:16:47 compute-1 sshd-session[274727]: Failed password for root from 194.107.115.65 port 40602 ssh2
Sep 30 18:16:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:47.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:47 compute-1 sshd-session[274727]: Received disconnect from 194.107.115.65 port 40602:11: Bye Bye [preauth]
Sep 30 18:16:47 compute-1 sshd-session[274727]: Disconnected from authenticating user root 194.107.115.65 port 40602 [preauth]
Sep 30 18:16:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:47 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc0049c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:48.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:16:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:48 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400ae10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:49 compute-1 ceph-mon[75484]: pgmap v1145: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:16:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/722398944' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:16:49 compute-1 openstack_network_exporter[251957]: ERROR   18:16:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:16:49 compute-1 openstack_network_exporter[251957]: ERROR   18:16:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:16:49 compute-1 openstack_network_exporter[251957]: ERROR   18:16:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:16:49 compute-1 openstack_network_exporter[251957]: ERROR   18:16:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:16:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:16:49 compute-1 openstack_network_exporter[251957]: ERROR   18:16:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:16:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:16:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:49.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:49 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe09800c4c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:50.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:50 compute-1 ovn_controller[135204]: 2025-09-30T18:16:50Z|00088|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 18:16:50 compute-1 podman[274742]: 2025-09-30 18:16:50.580294123 +0000 UTC m=+0.114509849 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:16:50 compute-1 podman[274741]: 2025-09-30 18:16:50.603424849 +0000 UTC m=+0.134859340 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Sep 30 18:16:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:50 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:50 compute-1 sudo[274792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:16:51 compute-1 sudo[274792]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:16:51 compute-1 sudo[274792]: pam_unix(sudo:session): session closed for user root
Sep 30 18:16:51 compute-1 sshd-session[274736]: Invalid user developer from 175.126.165.170 port 51486
Sep 30 18:16:51 compute-1 sshd-session[274736]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:16:51 compute-1 sshd-session[274736]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 18:16:51 compute-1 ceph-mon[75484]: pgmap v1146: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:16:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:51.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:51 compute-1 nova_compute[238822]: 2025-09-30 18:16:51.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:51 compute-1 unix_chkpwd[274817]: password check failed for user (root)
Sep 30 18:16:51 compute-1 sshd-session[274740]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:16:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:51 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:52.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:52 compute-1 nova_compute[238822]: 2025-09-30 18:16:52.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:52 compute-1 ceph-mon[75484]: pgmap v1147: 353 pgs: 353 active+clean; 41 MiB data, 180 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:16:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:16:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:52 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400ae10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:53 compute-1 sshd-session[274736]: Failed password for invalid user developer from 175.126.165.170 port 51486 ssh2
Sep 30 18:16:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:16:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:53.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:16:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:16:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:53 compute-1 sshd-session[274740]: Failed password for root from 192.210.160.141 port 51964 ssh2
Sep 30 18:16:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:53 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:54.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:54 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 18:16:54 compute-1 ceph-mon[75484]: pgmap v1148: 353 pgs: 353 active+clean; 88 MiB data, 201 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:16:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:54.365 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:16:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:54.365 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:16:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:16:54.366 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:16:54 compute-1 sshd-session[274736]: Received disconnect from 175.126.165.170 port 51486:11: Bye Bye [preauth]
Sep 30 18:16:54 compute-1 sshd-session[274736]: Disconnected from invalid user developer 175.126.165.170 port 51486 [preauth]
Sep 30 18:16:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:54 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:55 compute-1 sshd-session[274740]: Connection closed by authenticating user root 192.210.160.141 port 51964 [preauth]
Sep 30 18:16:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:55.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:55 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:56.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:56 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400ae10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:56 compute-1 nova_compute[238822]: 2025-09-30 18:16:56.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:57 compute-1 nova_compute[238822]: 2025-09-30 18:16:57.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:16:57 compute-1 ceph-mon[75484]: pgmap v1149: 353 pgs: 353 active+clean; 88 MiB data, 201 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:16:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2973524139' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:16:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:57.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:57 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:16:58.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/118095084' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:16:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/118095084' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:16:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/800863491' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:16:58 compute-1 podman[274828]: 2025-09-30 18:16:58.554261109 +0000 UTC m=+0.085732250 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 18:16:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:16:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:58 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:16:59 compute-1 ceph-mon[75484]: pgmap v1150: 353 pgs: 353 active+clean; 88 MiB data, 201 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:16:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:16:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:16:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:16:59.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:16:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:16:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:59 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Sep 30 18:16:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:16:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:16:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:16:59 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:00.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:00 compute-1 ceph-mon[75484]: pgmap v1151: 353 pgs: 353 active+clean; 88 MiB data, 201 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:17:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:00 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400ae30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:01.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:01 compute-1 nova_compute[238822]: 2025-09-30 18:17:01.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:01 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:02 compute-1 nova_compute[238822]: 2025-09-30 18:17:02.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:02.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:02 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:03 compute-1 ceph-mon[75484]: pgmap v1152: 353 pgs: 353 active+clean; 88 MiB data, 201 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:17:03 compute-1 podman[274854]: 2025-09-30 18:17:03.547209579 +0000 UTC m=+0.087227281 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:17:03 compute-1 podman[274852]: 2025-09-30 18:17:03.552128562 +0000 UTC m=+0.096835951 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 18:17:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:03.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:03 compute-1 podman[274853]: 2025-09-30 18:17:03.592916726 +0000 UTC m=+0.128098857 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git)
Sep 30 18:17:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:17:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:03 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:04.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:04 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400ae50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:05 compute-1 ceph-mon[75484]: pgmap v1153: 353 pgs: 353 active+clean; 88 MiB data, 201 MiB used, 40 GiB / 40 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Sep 30 18:17:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:05.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:05 compute-1 podman[249638]: time="2025-09-30T18:17:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:17:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:17:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38442 "" "Go-http-client/1.1"
Sep 30 18:17:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:17:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8822 "" "Go-http-client/1.1"
Sep 30 18:17:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:05 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400ae50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:06.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:06 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:06 compute-1 nova_compute[238822]: 2025-09-30 18:17:06.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:07 compute-1 nova_compute[238822]: 2025-09-30 18:17:07.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:07 compute-1 ceph-mon[75484]: pgmap v1154: 353 pgs: 353 active+clean; 88 MiB data, 201 MiB used, 40 GiB / 40 GiB avail; 7.6 KiB/s rd, 13 KiB/s wr, 10 op/s
Sep 30 18:17:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:07.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:07 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:08.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:17:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:17:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:08 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:09 compute-1 ceph-mon[75484]: pgmap v1155: 353 pgs: 353 active+clean; 88 MiB data, 201 MiB used, 40 GiB / 40 GiB avail; 7.6 KiB/s rd, 13 KiB/s wr, 10 op/s
Sep 30 18:17:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:09.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:09 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400ae70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:10.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:10 compute-1 nova_compute[238822]: 2025-09-30 18:17:10.204 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:17:10 compute-1 nova_compute[238822]: 2025-09-30 18:17:10.205 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:17:10 compute-1 ceph-mon[75484]: pgmap v1156: 353 pgs: 353 active+clean; 88 MiB data, 201 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:17:10 compute-1 nova_compute[238822]: 2025-09-30 18:17:10.711 2 DEBUG nova.compute.manager [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:17:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:10 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400ae70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:11 compute-1 sudo[274920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:17:11 compute-1 sudo[274920]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:17:11 compute-1 sudo[274920]: pam_unix(sudo:session): session closed for user root
Sep 30 18:17:11 compute-1 nova_compute[238822]: 2025-09-30 18:17:11.267 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:17:11 compute-1 nova_compute[238822]: 2025-09-30 18:17:11.267 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:17:11 compute-1 nova_compute[238822]: 2025-09-30 18:17:11.278 2 DEBUG nova.virt.hardware [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:17:11 compute-1 nova_compute[238822]: 2025-09-30 18:17:11.279 2 INFO nova.compute.claims [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:17:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:11.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:11 compute-1 nova_compute[238822]: 2025-09-30 18:17:11.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:11 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:12 compute-1 nova_compute[238822]: 2025-09-30 18:17:12.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:12.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:12 compute-1 nova_compute[238822]: 2025-09-30 18:17:12.340 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:17:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:12 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:17:12 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2769658347' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:17:12 compute-1 nova_compute[238822]: 2025-09-30 18:17:12.825 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:17:12 compute-1 nova_compute[238822]: 2025-09-30 18:17:12.832 2 DEBUG nova.compute.provider_tree [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:17:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:12 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:13 compute-1 ceph-mon[75484]: pgmap v1157: 353 pgs: 353 active+clean; 88 MiB data, 201 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:17:13 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2769658347' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:17:13 compute-1 nova_compute[238822]: 2025-09-30 18:17:13.351 2 DEBUG nova.scheduler.client.report [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:17:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:13.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:17:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:13 compute-1 nova_compute[238822]: 2025-09-30 18:17:13.873 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.606s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:17:13 compute-1 nova_compute[238822]: 2025-09-30 18:17:13.874 2 DEBUG nova.compute.manager [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:17:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:13 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:14.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:14 compute-1 nova_compute[238822]: 2025-09-30 18:17:14.387 2 DEBUG nova.compute.manager [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:17:14 compute-1 nova_compute[238822]: 2025-09-30 18:17:14.389 2 DEBUG nova.network.neutron [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:17:14 compute-1 nova_compute[238822]: 2025-09-30 18:17:14.390 2 WARNING neutronclient.v2_0.client [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:17:14 compute-1 nova_compute[238822]: 2025-09-30 18:17:14.391 2 WARNING neutronclient.v2_0.client [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:17:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:14 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400ae90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:14 compute-1 nova_compute[238822]: 2025-09-30 18:17:14.906 2 INFO nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:17:15 compute-1 ceph-mon[75484]: pgmap v1158: 353 pgs: 353 active+clean; 88 MiB data, 201 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:17:15 compute-1 nova_compute[238822]: 2025-09-30 18:17:15.412 2 DEBUG nova.compute.manager [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:17:15 compute-1 sshd-session[274970]: Invalid user elena from 216.10.242.161 port 52856
Sep 30 18:17:15 compute-1 sshd-session[274970]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:17:15 compute-1 sshd-session[274970]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:17:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:15.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:15 compute-1 sudo[274975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:17:15 compute-1 sudo[274975]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:17:15 compute-1 sudo[274975]: pam_unix(sudo:session): session closed for user root
Sep 30 18:17:15 compute-1 sudo[275000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:17:15 compute-1 sudo[275000]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:17:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:16 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:16.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.129 2 DEBUG nova.network.neutron [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Successfully created port: 57967583-5fed-40cd-bc09-455163ece536 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.446 2 DEBUG nova.compute.manager [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.449 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.450 2 INFO nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Creating image(s)
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.499 2 DEBUG nova.storage.rbd_utils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] rbd image 7f660b4a-3177-4f85-985d-90a46be506e6_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.540 2 DEBUG nova.storage.rbd_utils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] rbd image 7f660b4a-3177-4f85-985d-90a46be506e6_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.579 2 DEBUG nova.storage.rbd_utils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] rbd image 7f660b4a-3177-4f85-985d-90a46be506e6_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.588 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:17:16 compute-1 sudo[275000]: pam_unix(sudo:session): session closed for user root
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.680 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.682 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.683 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.683 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:17:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.727 2 DEBUG nova.storage.rbd_utils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] rbd image 7f660b4a-3177-4f85-985d-90a46be506e6_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.734 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 7f660b4a-3177-4f85-985d-90a46be506e6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:17:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:16 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:16 compute-1 nova_compute[238822]: 2025-09-30 18:17:16.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:17 compute-1 sshd-session[274973]: Invalid user oracle from 192.210.160.141 port 59668
Sep 30 18:17:17 compute-1 sshd-session[274973]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:17:17 compute-1 sshd-session[274973]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.119 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 7f660b4a-3177-4f85-985d-90a46be506e6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.385s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:17:17 compute-1 sshd-session[274970]: Failed password for invalid user elena from 216.10.242.161 port 52856 ssh2
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.205 2 DEBUG nova.storage.rbd_utils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] resizing rbd image 7f660b4a-3177-4f85-985d-90a46be506e6_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:17:17 compute-1 ceph-mon[75484]: pgmap v1159: 353 pgs: 353 active+clean; 88 MiB data, 201 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 64 op/s
Sep 30 18:17:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:17:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:17:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:17:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:17:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:17:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:17:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.284 2 DEBUG nova.network.neutron [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Successfully updated port: 57967583-5fed-40cd-bc09-455163ece536 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.338 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Acquiring lock "refresh_cache-7f660b4a-3177-4f85-985d-90a46be506e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.339 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Acquired lock "refresh_cache-7f660b4a-3177-4f85-985d-90a46be506e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.339 2 DEBUG nova.network.neutron [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.347 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.348 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Ensure instance console log exists: /var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.349 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.349 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.350 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.382 2 DEBUG nova.compute.manager [req-b06db668-a452-4305-a190-1551196aa4ef req-3f34df14-aab7-469d-932b-47721b8aa676 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-changed-57967583-5fed-40cd-bc09-455163ece536 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.382 2 DEBUG nova.compute.manager [req-b06db668-a452-4305-a190-1551196aa4ef req-3f34df14-aab7-469d-932b-47721b8aa676 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Refreshing instance network info cache due to event network-changed-57967583-5fed-40cd-bc09-455163ece536. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:17:17 compute-1 nova_compute[238822]: 2025-09-30 18:17:17.383 2 DEBUG oslo_concurrency.lockutils [req-b06db668-a452-4305-a190-1551196aa4ef req-3f34df14-aab7-469d-932b-47721b8aa676 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-7f660b4a-3177-4f85-985d-90a46be506e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:17:17 compute-1 sshd-session[274970]: Received disconnect from 216.10.242.161 port 52856:11: Bye Bye [preauth]
Sep 30 18:17:17 compute-1 sshd-session[274970]: Disconnected from invalid user elena 216.10.242.161 port 52856 [preauth]
Sep 30 18:17:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:17:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:17.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:17:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:18 compute-1 nova_compute[238822]: 2025-09-30 18:17:18.000 2 DEBUG nova.network.neutron [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:17:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:18 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:18.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:18 compute-1 nova_compute[238822]: 2025-09-30 18:17:18.273 2 WARNING neutronclient.v2_0.client [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:17:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:17:18 compute-1 nova_compute[238822]: 2025-09-30 18:17:18.687 2 DEBUG nova.network.neutron [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Updating instance_info_cache with network_info: [{"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:17:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:18 compute-1 unix_chkpwd[275228]: password check failed for user (root)
Sep 30 18:17:18 compute-1 sshd-session[275224]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105  user=root
Sep 30 18:17:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:18 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400aeb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.194 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Releasing lock "refresh_cache-7f660b4a-3177-4f85-985d-90a46be506e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.194 2 DEBUG nova.compute.manager [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Instance network_info: |[{"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.195 2 DEBUG oslo_concurrency.lockutils [req-b06db668-a452-4305-a190-1551196aa4ef req-3f34df14-aab7-469d-932b-47721b8aa676 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-7f660b4a-3177-4f85-985d-90a46be506e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.195 2 DEBUG nova.network.neutron [req-b06db668-a452-4305-a190-1551196aa4ef req-3f34df14-aab7-469d-932b-47721b8aa676 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Refreshing network info cache for port 57967583-5fed-40cd-bc09-455163ece536 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.198 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Start _get_guest_xml network_info=[{"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.203 2 WARNING nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.204 2 DEBUG nova.virt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-1361986283', uuid='7f660b4a-3177-4f85-985d-90a46be506e6'), owner=OwnerMeta(userid='54973270e5a040c8af5ec2225e3caec8', username='tempest-TestExecuteBasicStrategy-1755756413-project-admin', projectid='eddde596e2d64cec889cb4c4d3642bc5', projectname='tempest-TestExecuteBasicStrategy-1755756413'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759256239.20422) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.210 2 DEBUG nova.virt.libvirt.host [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.210 2 DEBUG nova.virt.libvirt.host [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.213 2 DEBUG nova.virt.libvirt.host [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.214 2 DEBUG nova.virt.libvirt.host [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.214 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.214 2 DEBUG nova.virt.hardware [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.215 2 DEBUG nova.virt.hardware [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.215 2 DEBUG nova.virt.hardware [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.215 2 DEBUG nova.virt.hardware [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.215 2 DEBUG nova.virt.hardware [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.216 2 DEBUG nova.virt.hardware [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.216 2 DEBUG nova.virt.hardware [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.216 2 DEBUG nova.virt.hardware [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.216 2 DEBUG nova.virt.hardware [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.217 2 DEBUG nova.virt.hardware [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.217 2 DEBUG nova.virt.hardware [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.220 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:17:19 compute-1 ceph-mon[75484]: pgmap v1160: 353 pgs: 353 active+clean; 88 MiB data, 201 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 64 op/s
Sep 30 18:17:19 compute-1 sshd-session[274973]: Failed password for invalid user oracle from 192.210.160.141 port 59668 ssh2
Sep 30 18:17:19 compute-1 openstack_network_exporter[251957]: ERROR   18:17:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:17:19 compute-1 openstack_network_exporter[251957]: ERROR   18:17:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:17:19 compute-1 openstack_network_exporter[251957]: ERROR   18:17:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:17:19 compute-1 openstack_network_exporter[251957]: ERROR   18:17:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:17:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:17:19 compute-1 openstack_network_exporter[251957]: ERROR   18:17:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:17:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:17:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:19.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:17:19 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3550832362' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.703 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:17:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.741 2 DEBUG nova.storage.rbd_utils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] rbd image 7f660b4a-3177-4f85-985d-90a46be506e6_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.746 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:17:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:19 compute-1 nova_compute[238822]: 2025-09-30 18:17:19.760 2 WARNING neutronclient.v2_0.client [req-b06db668-a452-4305-a190-1551196aa4ef req-3f34df14-aab7-469d-932b-47721b8aa676 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:17:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:20 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:20.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:20 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:17:20 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/318708065' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.225 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.227 2 DEBUG nova.virt.libvirt.vif [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1361986283',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1361986283',id=11,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eddde596e2d64cec889cb4c4d3642bc5',ramdisk_id='',reservation_id='r-reelvu4g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-1755756413',owner_user_name='tempest-TestExecuteBasicStrategy-1755756413-
project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:17:15Z,user_data=None,user_id='54973270e5a040c8af5ec2225e3caec8',uuid=7f660b4a-3177-4f85-985d-90a46be506e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.227 2 DEBUG nova.network.os_vif_util [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Converting VIF {"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.229 2 DEBUG nova.network.os_vif_util [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:cc:b5,bridge_name='br-int',has_traffic_filtering=True,id=57967583-5fed-40cd-bc09-455163ece536,network=Network(cd077ee2-d26f-4989-8ea7-4aecbac7c636),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57967583-5f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.230 2 DEBUG nova.objects.instance [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f660b4a-3177-4f85-985d-90a46be506e6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.255 2 WARNING neutronclient.v2_0.client [req-b06db668-a452-4305-a190-1551196aa4ef req-3f34df14-aab7-469d-932b-47721b8aa676 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:17:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3550832362' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:17:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/318708065' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.424 2 DEBUG nova.network.neutron [req-b06db668-a452-4305-a190-1551196aa4ef req-3f34df14-aab7-469d-932b-47721b8aa676 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Updated VIF entry in instance network info cache for port 57967583-5fed-40cd-bc09-455163ece536. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.425 2 DEBUG nova.network.neutron [req-b06db668-a452-4305-a190-1551196aa4ef req-3f34df14-aab7-469d-932b-47721b8aa676 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Updating instance_info_cache with network_info: [{"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:17:20 compute-1 sshd-session[274973]: Connection closed by invalid user oracle 192.210.160.141 port 59668 [preauth]
Sep 30 18:17:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.742 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:17:20 compute-1 nova_compute[238822]:   <uuid>7f660b4a-3177-4f85-985d-90a46be506e6</uuid>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   <name>instance-0000000b</name>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1361986283</nova:name>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:17:19</nova:creationTime>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:17:20 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:17:20 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:user uuid="54973270e5a040c8af5ec2225e3caec8">tempest-TestExecuteBasicStrategy-1755756413-project-admin</nova:user>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:project uuid="eddde596e2d64cec889cb4c4d3642bc5">tempest-TestExecuteBasicStrategy-1755756413</nova:project>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <nova:port uuid="57967583-5fed-40cd-bc09-455163ece536">
Sep 30 18:17:20 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <system>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <entry name="serial">7f660b4a-3177-4f85-985d-90a46be506e6</entry>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <entry name="uuid">7f660b4a-3177-4f85-985d-90a46be506e6</entry>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     </system>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   <os>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   </os>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   <features>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   </features>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/7f660b4a-3177-4f85-985d-90a46be506e6_disk">
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       </source>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/7f660b4a-3177-4f85-985d-90a46be506e6_disk.config">
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       </source>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:17:20 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:1d:cc:b5"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <target dev="tap57967583-5f"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/console.log" append="off"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <video>
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     </video>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:17:20 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:17:20 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:17:20 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:17:20 compute-1 nova_compute[238822]: </domain>
Sep 30 18:17:20 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.743 2 DEBUG nova.compute.manager [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Preparing to wait for external event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.744 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.744 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.744 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.745 2 DEBUG nova.virt.libvirt.vif [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1361986283',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1361986283',id=11,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eddde596e2d64cec889cb4c4d3642bc5',ramdisk_id='',reservation_id='r-reelvu4g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-1755756413',owner_user_name='tempest-TestExecuteBasicStrategy-1755756413-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:17:15Z,user_data=None,user_id='54973270e5a040c8af5ec2225e3caec8',uuid=7f660b4a-3177-4f85-985d-90a46be506e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.745 2 DEBUG nova.network.os_vif_util [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Converting VIF {"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.746 2 DEBUG nova.network.os_vif_util [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:cc:b5,bridge_name='br-int',has_traffic_filtering=True,id=57967583-5fed-40cd-bc09-455163ece536,network=Network(cd077ee2-d26f-4989-8ea7-4aecbac7c636),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57967583-5f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.746 2 DEBUG os_vif [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:cc:b5,bridge_name='br-int',has_traffic_filtering=True,id=57967583-5fed-40cd-bc09-455163ece536,network=Network(cd077ee2-d26f-4989-8ea7-4aecbac7c636),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57967583-5f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.748 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.748 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.749 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd75364db-889d-5cdf-a369-e34f5dc79543', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.755 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57967583-5f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.755 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap57967583-5f, col_values=(('qos', UUID('e34b8e02-2189-48d8-bcda-8092db881483')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.755 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap57967583-5f, col_values=(('external_ids', {'iface-id': '57967583-5fed-40cd-bc09-455163ece536', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1d:cc:b5', 'vm-uuid': '7f660b4a-3177-4f85-985d-90a46be506e6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:20 compute-1 NetworkManager[45549]: <info>  [1759256240.7581] manager: (tap57967583-5f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:17:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.766 2 INFO os_vif [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:cc:b5,bridge_name='br-int',has_traffic_filtering=True,id=57967583-5fed-40cd-bc09-455163ece536,network=Network(cd077ee2-d26f-4989-8ea7-4aecbac7c636),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57967583-5f')
Sep 30 18:17:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:20 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_40] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b80029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:20 compute-1 sshd-session[275224]: Failed password for root from 103.153.190.105 port 41550 ssh2
Sep 30 18:17:20 compute-1 nova_compute[238822]: 2025-09-30 18:17:20.934 2 DEBUG oslo_concurrency.lockutils [req-b06db668-a452-4305-a190-1551196aa4ef req-3f34df14-aab7-469d-932b-47721b8aa676 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-7f660b4a-3177-4f85-985d-90a46be506e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:17:21 compute-1 ceph-mon[75484]: pgmap v1161: 353 pgs: 353 active+clean; 167 MiB data, 248 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 154 op/s
Sep 30 18:17:21 compute-1 podman[275296]: 2025-09-30 18:17:21.549627657 +0000 UTC m=+0.076904222 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:17:21 compute-1 podman[275295]: 2025-09-30 18:17:21.580871132 +0000 UTC m=+0.116362519 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:17:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:21.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:21 compute-1 sshd-session[275224]: Received disconnect from 103.153.190.105 port 41550:11: Bye Bye [preauth]
Sep 30 18:17:21 compute-1 sshd-session[275224]: Disconnected from authenticating user root 103.153.190.105 port 41550 [preauth]
Sep 30 18:17:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:22 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:22 compute-1 nova_compute[238822]: 2025-09-30 18:17:22.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:22.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:22 compute-1 nova_compute[238822]: 2025-09-30 18:17:22.318 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:17:22 compute-1 nova_compute[238822]: 2025-09-30 18:17:22.319 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:17:22 compute-1 nova_compute[238822]: 2025-09-30 18:17:22.319 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] No VIF found with MAC fa:16:3e:1d:cc:b5, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:17:22 compute-1 nova_compute[238822]: 2025-09-30 18:17:22.319 2 INFO nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Using config drive
Sep 30 18:17:22 compute-1 nova_compute[238822]: 2025-09-30 18:17:22.344 2 DEBUG nova.storage.rbd_utils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] rbd image 7f660b4a-3177-4f85-985d-90a46be506e6_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:17:22 compute-1 sudo[275369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:17:22 compute-1 sudo[275369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:17:22 compute-1 sudo[275369]: pam_unix(sudo:session): session closed for user root
Sep 30 18:17:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:22 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400aed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:22 compute-1 nova_compute[238822]: 2025-09-30 18:17:22.860 2 WARNING neutronclient.v2_0.client [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.061 2 INFO nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Creating config drive at /var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/disk.config
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.071 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp2vc6slpl execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:17:23 compute-1 ceph-mon[75484]: pgmap v1162: 353 pgs: 353 active+clean; 167 MiB data, 248 MiB used, 40 GiB / 40 GiB avail; 344 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:17:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:17:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:17:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.213 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp2vc6slpl" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.261 2 DEBUG nova.storage.rbd_utils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] rbd image 7f660b4a-3177-4f85-985d-90a46be506e6_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.267 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/disk.config 7f660b4a-3177-4f85-985d-90a46be506e6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.469 2 DEBUG oslo_concurrency.processutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/disk.config 7f660b4a-3177-4f85-985d-90a46be506e6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.201s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.470 2 INFO nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Deleting local config drive /var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/disk.config because it was imported into RBD.
Sep 30 18:17:23 compute-1 systemd[1]: Starting libvirt secret daemon...
Sep 30 18:17:23 compute-1 systemd[1]: Started libvirt secret daemon.
Sep 30 18:17:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:23 compute-1 kernel: tap57967583-5f: entered promiscuous mode
Sep 30 18:17:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000055s ======
Sep 30 18:17:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:23.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Sep 30 18:17:23 compute-1 NetworkManager[45549]: <info>  [1759256243.6099] manager: (tap57967583-5f): new Tun device (/org/freedesktop/NetworkManager/Devices/41)
Sep 30 18:17:23 compute-1 ovn_controller[135204]: 2025-09-30T18:17:23Z|00089|binding|INFO|Claiming lport 57967583-5fed-40cd-bc09-455163ece536 for this chassis.
Sep 30 18:17:23 compute-1 ovn_controller[135204]: 2025-09-30T18:17:23Z|00090|binding|INFO|57967583-5fed-40cd-bc09-455163ece536: Claiming fa:16:3e:1d:cc:b5 10.100.0.6
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:17:23 compute-1 systemd-udevd[275462]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:17:23 compute-1 NetworkManager[45549]: <info>  [1759256243.6912] device (tap57967583-5f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:17:23 compute-1 NetworkManager[45549]: <info>  [1759256243.6932] device (tap57967583-5f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:17:23 compute-1 ovn_controller[135204]: 2025-09-30T18:17:23Z|00091|binding|INFO|Setting lport 57967583-5fed-40cd-bc09-455163ece536 ovn-installed in OVS
Sep 30 18:17:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:23 compute-1 ovn_controller[135204]: 2025-09-30T18:17:23Z|00092|binding|INFO|Setting lport 57967583-5fed-40cd-bc09-455163ece536 up in Southbound
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.749 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:cc:b5 10.100.0.6'], port_security=['fa:16:3e:1d:cc:b5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7f660b4a-3177-4f85-985d-90a46be506e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd077ee2-d26f-4989-8ea7-4aecbac7c636', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eddde596e2d64cec889cb4c4d3642bc5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f566abc7-3fe4-4e56-86df-377c1571ec04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=afc38829-13e1-4bde-91a7-790387f17ce5, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=57967583-5fed-40cd-bc09-455163ece536) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.751 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 57967583-5fed-40cd-bc09-455163ece536 in datapath cd077ee2-d26f-4989-8ea7-4aecbac7c636 bound to our chassis
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.753 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd077ee2-d26f-4989-8ea7-4aecbac7c636
Sep 30 18:17:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:23 compute-1 systemd-machined[195911]: New machine qemu-7-instance-0000000b.
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.779 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[250495db-8d5b-47fc-9c75-6f204eff9bd5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.780 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcd077ee2-d1 in ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:17:23 compute-1 systemd[1]: Started Virtual Machine qemu-7-instance-0000000b.
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.783 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcd077ee2-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.783 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b53844e4-e599-431c-a9a3-023461889515]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.784 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5cfe3674-397f-4eb8-b8bf-4ec20d89a688]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.803 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[d4905d51-df9e-43b4-b3f4-9c8858776da9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.823 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[89d55663-7072-4cf3-acb0-95b197cef232]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.869 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[c9dd82af-ef0d-4862-a73a-fa779e4b2202]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:23 compute-1 NetworkManager[45549]: <info>  [1759256243.8824] manager: (tapcd077ee2-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/42)
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.881 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7c87189a-46a4-49f6-9a67-fac183422dc3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.926 2 DEBUG nova.compute.manager [req-1300c99c-f908-49aa-8668-eb336e00d8d6 req-74d8645e-e3c6-4801-8093-d145498dc9a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.927 2 DEBUG oslo_concurrency.lockutils [req-1300c99c-f908-49aa-8668-eb336e00d8d6 req-74d8645e-e3c6-4801-8093-d145498dc9a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.927 2 DEBUG oslo_concurrency.lockutils [req-1300c99c-f908-49aa-8668-eb336e00d8d6 req-74d8645e-e3c6-4801-8093-d145498dc9a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.928 2 DEBUG oslo_concurrency.lockutils [req-1300c99c-f908-49aa-8668-eb336e00d8d6 req-74d8645e-e3c6-4801-8093-d145498dc9a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:17:23 compute-1 nova_compute[238822]: 2025-09-30 18:17:23.928 2 DEBUG nova.compute.manager [req-1300c99c-f908-49aa-8668-eb336e00d8d6 req-74d8645e-e3c6-4801-8093-d145498dc9a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Processing event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.943 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[7c4183e6-9577-4e1a-a8a2-856b889a6f92]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.948 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[5c2283f3-e376-47af-bf02-1c6a9e52c991]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:23 compute-1 NetworkManager[45549]: <info>  [1759256243.9759] device (tapcd077ee2-d0): carrier: link connected
Sep 30 18:17:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:23.981 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[051df338-1586-437c-8c39-beb2c8e5b1d6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:24 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.008 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:17:24 compute-1 nova_compute[238822]: 2025-09-30 18:17:24.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.011 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2fce0aad-5948-4395-a9f5-6a27235bb98b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd077ee2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:56:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1408555, 'reachable_time': 30934, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275498, 'error': None, 'target': 'ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.037 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c813b02a-b56f-4f91-8cd9-1b2eda0e1c61]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9f:5660'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1408555, 'tstamp': 1408555}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275500, 'error': None, 'target': 'ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:24 compute-1 nova_compute[238822]: 2025-09-30 18:17:24.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:17:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:24.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.062 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e8be0d30-1ae6-4825-8a1f-34c76d111772]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd077ee2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:56:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1408555, 'reachable_time': 30934, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 275502, 'error': None, 'target': 'ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.121 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c5aad223-a76a-497b-a005-0bd3a614c50e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.211 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3e6e3d15-60ea-42fa-80a5-01909a04b686]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.213 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd077ee2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.213 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.214 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd077ee2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:17:24 compute-1 nova_compute[238822]: 2025-09-30 18:17:24.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:24 compute-1 NetworkManager[45549]: <info>  [1759256244.2182] manager: (tapcd077ee2-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Sep 30 18:17:24 compute-1 kernel: tapcd077ee2-d0: entered promiscuous mode
Sep 30 18:17:24 compute-1 nova_compute[238822]: 2025-09-30 18:17:24.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.224 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd077ee2-d0, col_values=(('external_ids', {'iface-id': '44f8f232-e480-4bd7-ad4d-10a4684c061b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:17:24 compute-1 nova_compute[238822]: 2025-09-30 18:17:24.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:24 compute-1 ovn_controller[135204]: 2025-09-30T18:17:24Z|00093|binding|INFO|Releasing lport 44f8f232-e480-4bd7-ad4d-10a4684c061b from this chassis (sb_readonly=0)
Sep 30 18:17:24 compute-1 nova_compute[238822]: 2025-09-30 18:17:24.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:24 compute-1 nova_compute[238822]: 2025-09-30 18:17:24.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.259 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[50d6bb11-cfe3-47af-b734-8308d0abc526]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.260 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd077ee2-d26f-4989-8ea7-4aecbac7c636.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd077ee2-d26f-4989-8ea7-4aecbac7c636.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.260 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd077ee2-d26f-4989-8ea7-4aecbac7c636.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd077ee2-d26f-4989-8ea7-4aecbac7c636.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.260 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for cd077ee2-d26f-4989-8ea7-4aecbac7c636 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.261 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd077ee2-d26f-4989-8ea7-4aecbac7c636.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd077ee2-d26f-4989-8ea7-4aecbac7c636.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.261 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1847fc-7cd7-4544-9375-79648b0d172f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.262 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd077ee2-d26f-4989-8ea7-4aecbac7c636.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd077ee2-d26f-4989-8ea7-4aecbac7c636.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.262 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6d786f76-5699-468a-913a-622801798c41]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.263 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-cd077ee2-d26f-4989-8ea7-4aecbac7c636
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/cd077ee2-d26f-4989-8ea7-4aecbac7c636.pid.haproxy
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID cd077ee2-d26f-4989-8ea7-4aecbac7c636
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:17:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:24.264 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636', 'env', 'PROCESS_TAG=haproxy-cd077ee2-d26f-4989-8ea7-4aecbac7c636', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cd077ee2-d26f-4989-8ea7-4aecbac7c636.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:17:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:24 compute-1 podman[275576]: 2025-09-30 18:17:24.790126515 +0000 UTC m=+0.084540109 container create 6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:17:24 compute-1 podman[275576]: 2025-09-30 18:17:24.74228515 +0000 UTC m=+0.036698804 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:17:24 compute-1 systemd[1]: Started libpod-conmon-6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c.scope.
Sep 30 18:17:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:24 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:24 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:17:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/257e53896924943172c009234c41b7c62fc8771abba51e56de9cca94c25f99b0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:17:24 compute-1 nova_compute[238822]: 2025-09-30 18:17:24.926 2 DEBUG nova.compute.manager [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:17:24 compute-1 nova_compute[238822]: 2025-09-30 18:17:24.931 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:17:24 compute-1 podman[275576]: 2025-09-30 18:17:24.932105847 +0000 UTC m=+0.226519421 container init 6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 18:17:24 compute-1 nova_compute[238822]: 2025-09-30 18:17:24.935 2 INFO nova.virt.libvirt.driver [-] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Instance spawned successfully.
Sep 30 18:17:24 compute-1 nova_compute[238822]: 2025-09-30 18:17:24.936 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:17:24 compute-1 podman[275576]: 2025-09-30 18:17:24.943393563 +0000 UTC m=+0.237807117 container start 6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:17:24 compute-1 neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636[275590]: [NOTICE]   (275594) : New worker (275596) forked
Sep 30 18:17:24 compute-1 neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636[275590]: [NOTICE]   (275594) : Loading success.
Sep 30 18:17:25 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:25.020 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:17:25 compute-1 ceph-mon[75484]: pgmap v1163: 353 pgs: 353 active+clean; 167 MiB data, 248 MiB used, 40 GiB / 40 GiB avail; 344 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:17:25 compute-1 nova_compute[238822]: 2025-09-30 18:17:25.455 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:17:25 compute-1 nova_compute[238822]: 2025-09-30 18:17:25.457 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:17:25 compute-1 nova_compute[238822]: 2025-09-30 18:17:25.457 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:17:25 compute-1 nova_compute[238822]: 2025-09-30 18:17:25.458 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:17:25 compute-1 nova_compute[238822]: 2025-09-30 18:17:25.460 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:17:25 compute-1 nova_compute[238822]: 2025-09-30 18:17:25.460 2 DEBUG nova.virt.libvirt.driver [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:17:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:25.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:25 compute-1 nova_compute[238822]: 2025-09-30 18:17:25.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:25 compute-1 nova_compute[238822]: 2025-09-30 18:17:25.972 2 INFO nova.compute.manager [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Took 9.52 seconds to spawn the instance on the hypervisor.
Sep 30 18:17:25 compute-1 nova_compute[238822]: 2025-09-30 18:17:25.973 2 DEBUG nova.compute.manager [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:17:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:26 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:26 compute-1 nova_compute[238822]: 2025-09-30 18:17:26.014 2 DEBUG nova.compute.manager [req-256df294-5f53-45d9-9575-18fd00c0aea0 req-828c0a6a-5448-4748-94a8-974a466415dd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:17:26 compute-1 nova_compute[238822]: 2025-09-30 18:17:26.014 2 DEBUG oslo_concurrency.lockutils [req-256df294-5f53-45d9-9575-18fd00c0aea0 req-828c0a6a-5448-4748-94a8-974a466415dd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:17:26 compute-1 nova_compute[238822]: 2025-09-30 18:17:26.015 2 DEBUG oslo_concurrency.lockutils [req-256df294-5f53-45d9-9575-18fd00c0aea0 req-828c0a6a-5448-4748-94a8-974a466415dd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:17:26 compute-1 nova_compute[238822]: 2025-09-30 18:17:26.015 2 DEBUG oslo_concurrency.lockutils [req-256df294-5f53-45d9-9575-18fd00c0aea0 req-828c0a6a-5448-4748-94a8-974a466415dd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:17:26 compute-1 nova_compute[238822]: 2025-09-30 18:17:26.015 2 DEBUG nova.compute.manager [req-256df294-5f53-45d9-9575-18fd00c0aea0 req-828c0a6a-5448-4748-94a8-974a466415dd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] No waiting events found dispatching network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:17:26 compute-1 nova_compute[238822]: 2025-09-30 18:17:26.016 2 WARNING nova.compute.manager [req-256df294-5f53-45d9-9575-18fd00c0aea0 req-828c0a6a-5448-4748-94a8-974a466415dd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received unexpected event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 for instance with vm_state building and task_state spawning.
Sep 30 18:17:26 compute-1 nova_compute[238822]: 2025-09-30 18:17:26.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:17:26 compute-1 nova_compute[238822]: 2025-09-30 18:17:26.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:17:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:26.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:26 compute-1 nova_compute[238822]: 2025-09-30 18:17:26.523 2 INFO nova.compute.manager [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Took 15.30 seconds to build instance.
Sep 30 18:17:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:26 compute-1 ceph-mon[75484]: pgmap v1164: 353 pgs: 353 active+clean; 167 MiB data, 248 MiB used, 40 GiB / 40 GiB avail; 341 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Sep 30 18:17:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:26 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:27.021 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:17:27 compute-1 nova_compute[238822]: 2025-09-30 18:17:27.030 2 DEBUG oslo_concurrency.lockutils [None req-1d6fb9d8-221e-4b84-98e6-d03b788ecdf5 54973270e5a040c8af5ec2225e3caec8 eddde596e2d64cec889cb4c4d3642bc5 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.825s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:17:27 compute-1 nova_compute[238822]: 2025-09-30 18:17:27.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:27 compute-1 nova_compute[238822]: 2025-09-30 18:17:27.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:17:27 compute-1 unix_chkpwd[275610]: password check failed for user (root)
Sep 30 18:17:27 compute-1 sshd-session[275607]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233  user=root
Sep 30 18:17:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:27.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0980010b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:28.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:17:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:28 compute-1 sshd-session[275607]: Failed password for root from 185.156.73.233 port 52890 ssh2
Sep 30 18:17:29 compute-1 nova_compute[238822]: 2025-09-30 18:17:29.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:17:29 compute-1 ceph-mon[75484]: pgmap v1165: 353 pgs: 353 active+clean; 167 MiB data, 248 MiB used, 40 GiB / 40 GiB avail; 341 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Sep 30 18:17:29 compute-1 podman[275613]: 2025-09-30 18:17:29.546540655 +0000 UTC m=+0.083506981 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Sep 30 18:17:29 compute-1 nova_compute[238822]: 2025-09-30 18:17:29.568 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:17:29 compute-1 nova_compute[238822]: 2025-09-30 18:17:29.569 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:17:29 compute-1 nova_compute[238822]: 2025-09-30 18:17:29.569 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:17:29 compute-1 nova_compute[238822]: 2025-09-30 18:17:29.569 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:17:29 compute-1 nova_compute[238822]: 2025-09-30 18:17:29.570 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:17:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:29.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:30 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:30 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:17:30 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1931493676' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:17:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:30.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:30 compute-1 nova_compute[238822]: 2025-09-30 18:17:30.074 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:17:30 compute-1 sshd-session[275607]: Connection closed by authenticating user root 185.156.73.233 port 52890 [preauth]
Sep 30 18:17:30 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3474917705' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:17:30 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1931493676' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:17:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:30 compute-1 nova_compute[238822]: 2025-09-30 18:17:30.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:30 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:31 compute-1 nova_compute[238822]: 2025-09-30 18:17:31.160 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:17:31 compute-1 nova_compute[238822]: 2025-09-30 18:17:31.161 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:17:31 compute-1 sudo[275658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:17:31 compute-1 sudo[275658]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:17:31 compute-1 sudo[275658]: pam_unix(sudo:session): session closed for user root
Sep 30 18:17:31 compute-1 ceph-mon[75484]: pgmap v1166: 353 pgs: 353 active+clean; 167 MiB data, 266 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 164 op/s
Sep 30 18:17:31 compute-1 nova_compute[238822]: 2025-09-30 18:17:31.396 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:17:31 compute-1 nova_compute[238822]: 2025-09-30 18:17:31.397 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:17:31 compute-1 nova_compute[238822]: 2025-09-30 18:17:31.427 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:17:31 compute-1 nova_compute[238822]: 2025-09-30 18:17:31.428 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4534MB free_disk=39.92593765258789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:17:31 compute-1 nova_compute[238822]: 2025-09-30 18:17:31.429 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:17:31 compute-1 nova_compute[238822]: 2025-09-30 18:17:31.429 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:17:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:31.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:32 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:32 compute-1 nova_compute[238822]: 2025-09-30 18:17:32.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:32.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:32 compute-1 nova_compute[238822]: 2025-09-30 18:17:32.492 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 7f660b4a-3177-4f85-985d-90a46be506e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:17:32 compute-1 nova_compute[238822]: 2025-09-30 18:17:32.493 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:17:32 compute-1 nova_compute[238822]: 2025-09-30 18:17:32.493 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:17:31 up  3:54,  0 user,  load average: 0.64, 0.80, 1.08\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_eddde596e2d64cec889cb4c4d3642bc5': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:17:32 compute-1 nova_compute[238822]: 2025-09-30 18:17:32.547 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:17:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:32 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:17:33 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1496519701' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:17:33 compute-1 nova_compute[238822]: 2025-09-30 18:17:33.060 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:17:33 compute-1 nova_compute[238822]: 2025-09-30 18:17:33.067 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:17:33 compute-1 ceph-mon[75484]: pgmap v1167: 353 pgs: 353 active+clean; 167 MiB data, 266 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 28 KiB/s wr, 74 op/s
Sep 30 18:17:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/639538289' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:17:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1496519701' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:17:33 compute-1 nova_compute[238822]: 2025-09-30 18:17:33.578 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:17:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:33.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:17:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:34 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:17:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:34.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:17:34 compute-1 nova_compute[238822]: 2025-09-30 18:17:34.090 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:17:34 compute-1 nova_compute[238822]: 2025-09-30 18:17:34.090 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.661s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:17:34 compute-1 ceph-mon[75484]: pgmap v1168: 353 pgs: 353 active+clean; 167 MiB data, 266 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 28 KiB/s wr, 74 op/s
Sep 30 18:17:34 compute-1 podman[275711]: 2025-09-30 18:17:34.570234447 +0000 UTC m=+0.103386229 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 18:17:34 compute-1 podman[275709]: 2025-09-30 18:17:34.599723515 +0000 UTC m=+0.137395199 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=iscsid)
Sep 30 18:17:34 compute-1 podman[275710]: 2025-09-30 18:17:34.609790947 +0000 UTC m=+0.142772164 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, version=9.6, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm)
Sep 30 18:17:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:34 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:35.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:35 compute-1 podman[249638]: time="2025-09-30T18:17:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:17:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:17:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39666 "" "Go-http-client/1.1"
Sep 30 18:17:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:17:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9292 "" "Go-http-client/1.1"
Sep 30 18:17:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:35 compute-1 nova_compute[238822]: 2025-09-30 18:17:35.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:36 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:36.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:36 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:37 compute-1 nova_compute[238822]: 2025-09-30 18:17:37.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:37 compute-1 ovn_controller[135204]: 2025-09-30T18:17:37Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1d:cc:b5 10.100.0.6
Sep 30 18:17:37 compute-1 ovn_controller[135204]: 2025-09-30T18:17:37Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1d:cc:b5 10.100.0.6
Sep 30 18:17:37 compute-1 ceph-mon[75484]: pgmap v1169: 353 pgs: 353 active+clean; 167 MiB data, 266 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 17 KiB/s wr, 74 op/s
Sep 30 18:17:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3679108769' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:17:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3679108769' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:17:37 compute-1 unix_chkpwd[275769]: password check failed for user (root)
Sep 30 18:17:37 compute-1 sshd-session[275767]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104  user=root
Sep 30 18:17:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:37.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:37 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Sep 30 18:17:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:38 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:38.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:38 compute-1 nova_compute[238822]: 2025-09-30 18:17:38.087 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:17:38 compute-1 nova_compute[238822]: 2025-09-30 18:17:38.087 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:17:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:17:38 compute-1 nova_compute[238822]: 2025-09-30 18:17:38.599 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:17:38 compute-1 nova_compute[238822]: 2025-09-30 18:17:38.600 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:17:38 compute-1 nova_compute[238822]: 2025-09-30 18:17:38.600 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:17:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:17:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:38 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:39 compute-1 ceph-mon[75484]: pgmap v1170: 353 pgs: 353 active+clean; 167 MiB data, 266 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 17 KiB/s wr, 74 op/s
Sep 30 18:17:39 compute-1 sshd-session[275767]: Failed password for root from 107.172.146.104 port 51674 ssh2
Sep 30 18:17:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:39.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:40.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:40 compute-1 sshd-session[275767]: Received disconnect from 107.172.146.104 port 51674:11: Bye Bye [preauth]
Sep 30 18:17:40 compute-1 sshd-session[275767]: Disconnected from authenticating user root 107.172.146.104 port 51674 [preauth]
Sep 30 18:17:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:40 compute-1 nova_compute[238822]: 2025-09-30 18:17:40.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:41 compute-1 sshd-session[275776]: Invalid user habib from 167.172.43.167 port 51990
Sep 30 18:17:41 compute-1 sshd-session[275776]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:17:41 compute-1 sshd-session[275776]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167
Sep 30 18:17:41 compute-1 ceph-mon[75484]: pgmap v1171: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 138 op/s
Sep 30 18:17:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:41.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:42 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:42 compute-1 nova_compute[238822]: 2025-09-30 18:17:42.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:42.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:42 compute-1 unix_chkpwd[275780]: password check failed for user (root)
Sep 30 18:17:42 compute-1 sshd-session[275773]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:17:42 compute-1 ceph-mon[75484]: pgmap v1172: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Sep 30 18:17:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:42 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:43 compute-1 sshd-session[275776]: Failed password for invalid user habib from 167.172.43.167 port 51990 ssh2
Sep 30 18:17:43 compute-1 sshd-session[275776]: Received disconnect from 167.172.43.167 port 51990:11: Bye Bye [preauth]
Sep 30 18:17:43 compute-1 sshd-session[275776]: Disconnected from invalid user habib 167.172.43.167 port 51990 [preauth]
Sep 30 18:17:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:17:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:43.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:43 compute-1 sshd-session[275773]: Failed password for root from 192.210.160.141 port 53224 ssh2
Sep 30 18:17:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:44 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:44.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:44 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:45 compute-1 ceph-mon[75484]: pgmap v1173: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:17:45 compute-1 sshd-session[275773]: Connection closed by authenticating user root 192.210.160.141 port 53224 [preauth]
Sep 30 18:17:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:45.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:45 compute-1 nova_compute[238822]: 2025-09-30 18:17:45.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:46 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:46.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:46 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:47 compute-1 nova_compute[238822]: 2025-09-30 18:17:47.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:47 compute-1 ceph-mon[75484]: pgmap v1174: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Sep 30 18:17:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:17:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:47.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:17:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:48 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:48.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:48 compute-1 sshd-session[275350]: error: kex_exchange_identification: read: Connection reset by peer
Sep 30 18:17:48 compute-1 sshd-session[275350]: Connection reset by 45.140.17.97 port 50907
Sep 30 18:17:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:17:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:48 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:49 compute-1 ceph-mon[75484]: pgmap v1175: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Sep 30 18:17:49 compute-1 openstack_network_exporter[251957]: ERROR   18:17:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:17:49 compute-1 openstack_network_exporter[251957]: ERROR   18:17:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:17:49 compute-1 openstack_network_exporter[251957]: ERROR   18:17:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:17:49 compute-1 openstack_network_exporter[251957]: ERROR   18:17:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:17:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:17:49 compute-1 openstack_network_exporter[251957]: ERROR   18:17:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:17:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:17:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:49.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:50 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:50.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:50 compute-1 nova_compute[238822]: 2025-09-30 18:17:50.255 2 DEBUG nova.compute.manager [None req-715e7917-88d0-4647-8289-b7b6ca3ad93e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Sep 30 18:17:50 compute-1 nova_compute[238822]: 2025-09-30 18:17:50.304 2 DEBUG nova.compute.provider_tree [None req-715e7917-88d0-4647-8289-b7b6ca3ad93e 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Updating resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a generation from 9 to 11 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 18:17:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:50 compute-1 nova_compute[238822]: 2025-09-30 18:17:50.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:50 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:50 compute-1 unix_chkpwd[275794]: password check failed for user (root)
Sep 30 18:17:50 compute-1 sshd-session[275788]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58  user=root
Sep 30 18:17:51 compute-1 ceph-mon[75484]: pgmap v1176: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:17:51 compute-1 sshd-session[275791]: Invalid user dss from 194.107.115.65 port 65070
Sep 30 18:17:51 compute-1 sshd-session[275791]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:17:51 compute-1 sshd-session[275791]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 18:17:51 compute-1 sudo[275795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:17:51 compute-1 sudo[275795]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:17:51 compute-1 sudo[275795]: pam_unix(sudo:session): session closed for user root
Sep 30 18:17:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:51.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:52 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:52 compute-1 nova_compute[238822]: 2025-09-30 18:17:52.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:17:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:52.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:17:52 compute-1 ceph-mon[75484]: pgmap v1177: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:17:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:17:52 compute-1 sshd-session[275788]: Failed password for root from 84.51.43.58 port 40453 ssh2
Sep 30 18:17:52 compute-1 podman[275822]: 2025-09-30 18:17:52.549805616 +0000 UTC m=+0.082983077 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:17:52 compute-1 podman[275821]: 2025-09-30 18:17:52.578251666 +0000 UTC m=+0.123597216 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 18:17:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:52 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8004bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:53 compute-1 sshd-session[275791]: Failed password for invalid user dss from 194.107.115.65 port 65070 ssh2
Sep 30 18:17:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:17:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:53.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:53 compute-1 sshd-session[275788]: Received disconnect from 84.51.43.58 port 40453:11: Bye Bye [preauth]
Sep 30 18:17:53 compute-1 sshd-session[275788]: Disconnected from authenticating user root 84.51.43.58 port 40453 [preauth]
Sep 30 18:17:53 compute-1 sshd-session[275867]: Invalid user grid from 14.225.167.110 port 41484
Sep 30 18:17:53 compute-1 sshd-session[275867]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:17:53 compute-1 sshd-session[275867]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:17:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:54 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:54.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:54 compute-1 ovn_controller[135204]: 2025-09-30T18:17:54Z|00094|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 18:17:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:54.367 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:17:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:54.367 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:17:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:17:54.368 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:17:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:54 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:55 compute-1 ceph-mon[75484]: pgmap v1178: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:17:55 compute-1 sshd-session[275791]: Received disconnect from 194.107.115.65 port 65070:11: Bye Bye [preauth]
Sep 30 18:17:55 compute-1 sshd-session[275791]: Disconnected from invalid user dss 194.107.115.65 port 65070 [preauth]
Sep 30 18:17:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:17:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:55.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:17:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:55 compute-1 nova_compute[238822]: 2025-09-30 18:17:55.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:56 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:56.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:56 compute-1 sshd-session[275867]: Failed password for invalid user grid from 14.225.167.110 port 41484 ssh2
Sep 30 18:17:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:56 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:57 compute-1 nova_compute[238822]: 2025-09-30 18:17:57.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:17:57 compute-1 ceph-mon[75484]: pgmap v1179: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:17:57 compute-1 nova_compute[238822]: 2025-09-30 18:17:57.390 2 DEBUG nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Check if temp file /var/lib/nova/instances/tmpwyoivnak exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 18:17:57 compute-1 nova_compute[238822]: 2025-09-30 18:17:57.399 2 DEBUG nova.compute.manager [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwyoivnak',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='7f660b4a-3177-4f85-985d-90a46be506e6',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 18:17:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:57.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:58 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:17:58.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/407820157' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:17:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/407820157' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:17:58 compute-1 sshd-session[275867]: Received disconnect from 14.225.167.110 port 41484:11: Bye Bye [preauth]
Sep 30 18:17:58 compute-1 sshd-session[275867]: Disconnected from invalid user grid 14.225.167.110 port 41484 [preauth]
Sep 30 18:17:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:17:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:17:58 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:17:59 compute-1 ceph-mon[75484]: pgmap v1180: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:17:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:17:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Cumulative writes: 6754 writes, 34K keys, 6754 commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.03 MB/s
                                           Cumulative WAL: 6754 writes, 6754 syncs, 1.00 writes per sync, written: 0.08 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1541 writes, 7715 keys, 1541 commit groups, 1.0 writes per commit group, ingest: 16.52 MB, 0.03 MB/s
                                           Interval WAL: 1541 writes, 1541 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    139.2      0.33              0.17        18    0.018       0      0       0.0       0.0
                                             L6      1/0   12.02 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.1    173.7    147.4      1.27              0.63        17    0.075     89K   9300       0.0       0.0
                                            Sum      1/0   12.02 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   5.1    138.2    145.7      1.59              0.80        35    0.046     89K   9300       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   5.5    159.3    162.9      0.36              0.20         8    0.045     25K   2576       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    173.7    147.4      1.27              0.63        17    0.075     89K   9300       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    140.1      0.32              0.17        17    0.019       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.044, interval 0.010
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.23 GB write, 0.10 MB/s write, 0.22 GB read, 0.09 MB/s read, 1.6 seconds
                                           Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.10 MB/s read, 0.4 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55f2aa20b350#2 capacity: 304.00 MB usage: 20.93 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.00026 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1194,20.25 MB,6.66278%) FilterBlock(35,251.05 KB,0.0806457%) IndexBlock(35,443.11 KB,0.142343%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Sep 30 18:17:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:17:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:17:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:17:59.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:17:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:17:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:17:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:17:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:00 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:00.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:00 compute-1 ceph-mon[75484]: pgmap v1181: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 16 KiB/s wr, 1 op/s
Sep 30 18:18:00 compute-1 podman[275880]: 2025-09-30 18:18:00.547970457 +0000 UTC m=+0.086770379 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 18:18:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:00 compute-1 nova_compute[238822]: 2025-09-30 18:18:00.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:00 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:01.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:01 compute-1 nova_compute[238822]: 2025-09-30 18:18:01.663 2 DEBUG nova.compute.manager [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Preparing to wait for external event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:18:01 compute-1 nova_compute[238822]: 2025-09-30 18:18:01.664 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:01 compute-1 nova_compute[238822]: 2025-09-30 18:18:01.664 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:01 compute-1 nova_compute[238822]: 2025-09-30 18:18:01.665 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:02 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:02.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:02 compute-1 nova_compute[238822]: 2025-09-30 18:18:02.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:02 compute-1 sshd-session[275901]: Invalid user ubuntu from 110.42.70.108 port 48990
Sep 30 18:18:02 compute-1 sshd-session[275901]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:18:02 compute-1 sshd-session[275901]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=110.42.70.108
Sep 30 18:18:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:02 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:03 compute-1 ceph-mon[75484]: pgmap v1182: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:18:03 compute-1 sshd-session[275904]: Invalid user valera from 175.126.165.170 port 45432
Sep 30 18:18:03 compute-1 sshd-session[275904]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:18:03 compute-1 sshd-session[275904]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 18:18:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:18:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:03.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:03 compute-1 sshd-session[275901]: Failed password for invalid user ubuntu from 110.42.70.108 port 48990 ssh2
Sep 30 18:18:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:04 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:04.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:04 compute-1 ceph-mon[75484]: pgmap v1183: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 9.1 KiB/s wr, 1 op/s
Sep 30 18:18:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:04 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:05 compute-1 sshd-session[275904]: Failed password for invalid user valera from 175.126.165.170 port 45432 ssh2
Sep 30 18:18:05 compute-1 podman[275911]: 2025-09-30 18:18:05.548967145 +0000 UTC m=+0.089878103 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Sep 30 18:18:05 compute-1 podman[275912]: 2025-09-30 18:18:05.548758319 +0000 UTC m=+0.086013369 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 18:18:05 compute-1 podman[275910]: 2025-09-30 18:18:05.589885722 +0000 UTC m=+0.123820842 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 18:18:05 compute-1 podman[249638]: time="2025-09-30T18:18:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:18:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:18:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39666 "" "Go-http-client/1.1"
Sep 30 18:18:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:05.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:18:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9291 "" "Go-http-client/1.1"
Sep 30 18:18:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:05 compute-1 nova_compute[238822]: 2025-09-30 18:18:05.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:06 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:06.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:06 compute-1 nova_compute[238822]: 2025-09-30 18:18:06.393 2 DEBUG nova.compute.manager [req-b81fd4fe-c2e2-4193-9cda-5c3467b8d96a req-72adbcba-cdde-4c62-a3c6-2f9b018647ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-unplugged-57967583-5fed-40cd-bc09-455163ece536 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:18:06 compute-1 nova_compute[238822]: 2025-09-30 18:18:06.394 2 DEBUG oslo_concurrency.lockutils [req-b81fd4fe-c2e2-4193-9cda-5c3467b8d96a req-72adbcba-cdde-4c62-a3c6-2f9b018647ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:06 compute-1 nova_compute[238822]: 2025-09-30 18:18:06.394 2 DEBUG oslo_concurrency.lockutils [req-b81fd4fe-c2e2-4193-9cda-5c3467b8d96a req-72adbcba-cdde-4c62-a3c6-2f9b018647ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:06 compute-1 nova_compute[238822]: 2025-09-30 18:18:06.394 2 DEBUG oslo_concurrency.lockutils [req-b81fd4fe-c2e2-4193-9cda-5c3467b8d96a req-72adbcba-cdde-4c62-a3c6-2f9b018647ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:06 compute-1 nova_compute[238822]: 2025-09-30 18:18:06.395 2 DEBUG nova.compute.manager [req-b81fd4fe-c2e2-4193-9cda-5c3467b8d96a req-72adbcba-cdde-4c62-a3c6-2f9b018647ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] No event matching network-vif-unplugged-57967583-5fed-40cd-bc09-455163ece536 in dict_keys([('network-vif-plugged', '57967583-5fed-40cd-bc09-455163ece536')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 18:18:06 compute-1 nova_compute[238822]: 2025-09-30 18:18:06.395 2 DEBUG nova.compute.manager [req-b81fd4fe-c2e2-4193-9cda-5c3467b8d96a req-72adbcba-cdde-4c62-a3c6-2f9b018647ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-unplugged-57967583-5fed-40cd-bc09-455163ece536 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:18:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:06 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:07 compute-1 sshd-session[275904]: Received disconnect from 175.126.165.170 port 45432:11: Bye Bye [preauth]
Sep 30 18:18:07 compute-1 sshd-session[275904]: Disconnected from invalid user valera 175.126.165.170 port 45432 [preauth]
Sep 30 18:18:07 compute-1 nova_compute[238822]: 2025-09-30 18:18:07.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Sep 30 18:18:07 compute-1 ceph-mon[75484]: pgmap v1184: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 9.1 KiB/s wr, 1 op/s
Sep 30 18:18:07 compute-1 unix_chkpwd[275972]: password check failed for user (root)
Sep 30 18:18:07 compute-1 sshd-session[275909]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:18:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:07.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:08 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:08.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:18:08 compute-1 nova_compute[238822]: 2025-09-30 18:18:08.459 2 DEBUG nova.compute.manager [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:18:08 compute-1 nova_compute[238822]: 2025-09-30 18:18:08.459 2 DEBUG oslo_concurrency.lockutils [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:08 compute-1 nova_compute[238822]: 2025-09-30 18:18:08.460 2 DEBUG oslo_concurrency.lockutils [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:08 compute-1 nova_compute[238822]: 2025-09-30 18:18:08.460 2 DEBUG oslo_concurrency.lockutils [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:08 compute-1 nova_compute[238822]: 2025-09-30 18:18:08.460 2 DEBUG nova.compute.manager [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Processing event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:18:08 compute-1 nova_compute[238822]: 2025-09-30 18:18:08.461 2 DEBUG nova.compute.manager [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-changed-57967583-5fed-40cd-bc09-455163ece536 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:18:08 compute-1 nova_compute[238822]: 2025-09-30 18:18:08.461 2 DEBUG nova.compute.manager [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Refreshing instance network info cache due to event network-changed-57967583-5fed-40cd-bc09-455163ece536. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:18:08 compute-1 nova_compute[238822]: 2025-09-30 18:18:08.461 2 DEBUG oslo_concurrency.lockutils [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-7f660b4a-3177-4f85-985d-90a46be506e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:18:08 compute-1 nova_compute[238822]: 2025-09-30 18:18:08.462 2 DEBUG oslo_concurrency.lockutils [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-7f660b4a-3177-4f85-985d-90a46be506e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:18:08 compute-1 nova_compute[238822]: 2025-09-30 18:18:08.462 2 DEBUG nova.network.neutron [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Refreshing network info cache for port 57967583-5fed-40cd-bc09-455163ece536 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:18:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:18:08 compute-1 nova_compute[238822]: 2025-09-30 18:18:08.696 2 INFO nova.compute.manager [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Took 7.03 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Sep 30 18:18:08 compute-1 nova_compute[238822]: 2025-09-30 18:18:08.697 2 DEBUG nova.compute.manager [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:18:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:08 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:08 compute-1 nova_compute[238822]: 2025-09-30 18:18:08.973 2 WARNING neutronclient.v2_0.client [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:18:09 compute-1 sshd-session[275909]: Failed password for root from 192.210.160.141 port 53210 ssh2
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.206 2 DEBUG nova.compute.manager [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwyoivnak',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='7f660b4a-3177-4f85-985d-90a46be506e6',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(0a195b11-70ab-4f2d-bee5-ef368a35b00b),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.211 2 DEBUG nova.objects.instance [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lazy-loading 'migration_context' on Instance uuid 7f660b4a-3177-4f85-985d-90a46be506e6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.213 2 DEBUG nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.215 2 DEBUG nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.216 2 DEBUG nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 18:18:09 compute-1 ceph-mon[75484]: pgmap v1185: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 9.1 KiB/s wr, 1 op/s
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.370 2 WARNING neutronclient.v2_0.client [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:18:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:09.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.719 2 DEBUG nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.719 2 DEBUG nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.730 2 DEBUG nova.virt.libvirt.vif [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1361986283',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1361986283',id=11,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:17:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='eddde596e2d64cec889cb4c4d3642bc5',ramdisk_id='',reservation_id='r-reelvu4g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-1755756413',owner_user_name='tempest-TestExecuteBasicStrategy-1755756413-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:17:26Z,user_data=None,user_id='54973270e5a040c8af5ec2225e3caec8',uuid=7f660b4a-3177-4f85-985d-90a46be506e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.730 2 DEBUG nova.network.os_vif_util [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.731 2 DEBUG nova.network.os_vif_util [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:cc:b5,bridge_name='br-int',has_traffic_filtering=True,id=57967583-5fed-40cd-bc09-455163ece536,network=Network(cd077ee2-d26f-4989-8ea7-4aecbac7c636),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57967583-5f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.732 2 DEBUG nova.virt.libvirt.migration [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <mac address="fa:16:3e:1d:cc:b5"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <model type="virtio"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <mtu size="1442"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <target dev="tap57967583-5f"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]: </interface>
Sep 30 18:18:09 compute-1 nova_compute[238822]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.733 2 DEBUG nova.virt.libvirt.migration [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <name>instance-0000000b</name>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <uuid>7f660b4a-3177-4f85-985d-90a46be506e6</uuid>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1361986283</nova:name>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:17:19</nova:creationTime>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:18:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:18:09 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:18:09 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:user uuid="54973270e5a040c8af5ec2225e3caec8">tempest-TestExecuteBasicStrategy-1755756413-project-admin</nova:user>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:project uuid="eddde596e2d64cec889cb4c4d3642bc5">tempest-TestExecuteBasicStrategy-1755756413</nova:project>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:port uuid="57967583-5fed-40cd-bc09-455163ece536">
Sep 30 18:18:09 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <memory unit="KiB">131072</memory>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <vcpu placement="static">1</vcpu>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <resource>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <partition>/machine</partition>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </resource>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <system>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="serial">7f660b4a-3177-4f85-985d-90a46be506e6</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="uuid">7f660b4a-3177-4f85-985d-90a46be506e6</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </system>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <os>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </os>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <features>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <vmcoreinfo state="on"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </features>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <cpu mode="host-model" check="partial">
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <on_poweroff>destroy</on_poweroff>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <on_reboot>restart</on_reboot>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <on_crash>destroy</on_crash>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/7f660b4a-3177-4f85-985d-90a46be506e6_disk">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </source>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/7f660b4a-3177-4f85-985d-90a46be506e6_disk.config">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </source>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <readonly/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="1" port="0x10"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="2" port="0x11"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="3" port="0x12"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="4" port="0x13"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="5" port="0x14"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="6" port="0x15"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="7" port="0x16"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="8" port="0x17"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="9" port="0x18"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="10" port="0x19"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="11" port="0x1a"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="12" port="0x1b"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="13" port="0x1c"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="14" port="0x1d"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="15" port="0x1e"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="16" port="0x1f"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="17" port="0x20"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="18" port="0x21"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="19" port="0x22"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="20" port="0x23"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="21" port="0x24"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="22" port="0x25"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="23" port="0x26"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="24" port="0x27"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="25" port="0x28"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-pci-bridge"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="sata" index="0">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:1d:cc:b5"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target dev="tap57967583-5f"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/console.log" append="off"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target type="isa-serial" port="0">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <model name="isa-serial"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </target>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <console type="pty">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/console.log" append="off"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target type="serial" port="0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </console>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="usb" bus="0" port="1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </input>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <input type="mouse" bus="ps2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <listen type="address" address="::"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </graphics>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <video>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </video>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]: </domain>
Sep 30 18:18:09 compute-1 nova_compute[238822]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.735 2 DEBUG nova.virt.libvirt.migration [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <name>instance-0000000b</name>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <uuid>7f660b4a-3177-4f85-985d-90a46be506e6</uuid>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1361986283</nova:name>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:17:19</nova:creationTime>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:18:09 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:18:09 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:user uuid="54973270e5a040c8af5ec2225e3caec8">tempest-TestExecuteBasicStrategy-1755756413-project-admin</nova:user>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:project uuid="eddde596e2d64cec889cb4c4d3642bc5">tempest-TestExecuteBasicStrategy-1755756413</nova:project>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:port uuid="57967583-5fed-40cd-bc09-455163ece536">
Sep 30 18:18:09 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <memory unit="KiB">131072</memory>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <vcpu placement="static">1</vcpu>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <resource>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <partition>/machine</partition>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </resource>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <system>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="serial">7f660b4a-3177-4f85-985d-90a46be506e6</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="uuid">7f660b4a-3177-4f85-985d-90a46be506e6</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </system>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <os>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </os>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <features>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <vmcoreinfo state="on"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </features>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <cpu mode="host-model" check="partial">
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <on_poweroff>destroy</on_poweroff>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <on_reboot>restart</on_reboot>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <on_crash>destroy</on_crash>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/7f660b4a-3177-4f85-985d-90a46be506e6_disk">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </source>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/7f660b4a-3177-4f85-985d-90a46be506e6_disk.config">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </source>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <readonly/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="1" port="0x10"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="2" port="0x11"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="3" port="0x12"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="4" port="0x13"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="5" port="0x14"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="6" port="0x15"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="7" port="0x16"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="8" port="0x17"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="9" port="0x18"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="10" port="0x19"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="11" port="0x1a"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="12" port="0x1b"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="13" port="0x1c"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="14" port="0x1d"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="15" port="0x1e"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="16" port="0x1f"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="17" port="0x20"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="18" port="0x21"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="19" port="0x22"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="20" port="0x23"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="21" port="0x24"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="22" port="0x25"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="23" port="0x26"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="24" port="0x27"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="25" port="0x28"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-pci-bridge"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="sata" index="0">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <interface type="ethernet"><mac address="fa:16:3e:1d:cc:b5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap57967583-5f"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </interface><serial type="pty">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/console.log" append="off"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target type="isa-serial" port="0">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <model name="isa-serial"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </target>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <console type="pty">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/console.log" append="off"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target type="serial" port="0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </console>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="usb" bus="0" port="1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </input>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <input type="mouse" bus="ps2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <listen type="address" address="::"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </graphics>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <video>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </video>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]: </domain>
Sep 30 18:18:09 compute-1 nova_compute[238822]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.736 2 DEBUG nova.virt.libvirt.migration [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <name>instance-0000000b</name>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <uuid>7f660b4a-3177-4f85-985d-90a46be506e6</uuid>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1361986283</nova:name>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:17:19</nova:creationTime>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:18:09 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:18:09 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:user uuid="54973270e5a040c8af5ec2225e3caec8">tempest-TestExecuteBasicStrategy-1755756413-project-admin</nova:user>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:project uuid="eddde596e2d64cec889cb4c4d3642bc5">tempest-TestExecuteBasicStrategy-1755756413</nova:project>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <nova:port uuid="57967583-5fed-40cd-bc09-455163ece536">
Sep 30 18:18:09 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <memory unit="KiB">131072</memory>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <vcpu placement="static">1</vcpu>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <resource>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <partition>/machine</partition>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </resource>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <system>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="serial">7f660b4a-3177-4f85-985d-90a46be506e6</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="uuid">7f660b4a-3177-4f85-985d-90a46be506e6</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </system>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <os>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </os>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <features>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <vmcoreinfo state="on"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </features>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <cpu mode="host-model" check="partial">
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <on_poweroff>destroy</on_poweroff>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <on_reboot>restart</on_reboot>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <on_crash>destroy</on_crash>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/7f660b4a-3177-4f85-985d-90a46be506e6_disk">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </source>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/7f660b4a-3177-4f85-985d-90a46be506e6_disk.config">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </source>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <readonly/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="1" port="0x10"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="2" port="0x11"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="3" port="0x12"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="4" port="0x13"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="5" port="0x14"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="6" port="0x15"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="7" port="0x16"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="8" port="0x17"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="9" port="0x18"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="10" port="0x19"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="11" port="0x1a"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="12" port="0x1b"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="13" port="0x1c"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="14" port="0x1d"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="15" port="0x1e"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="16" port="0x1f"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="17" port="0x20"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="18" port="0x21"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="19" port="0x22"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="20" port="0x23"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="21" port="0x24"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="22" port="0x25"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="23" port="0x26"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="24" port="0x27"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target chassis="25" port="0x28"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model name="pcie-pci-bridge"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <controller type="sata" index="0">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <interface type="ethernet"><mac address="fa:16:3e:1d:cc:b5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap57967583-5f"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </interface><serial type="pty">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/console.log" append="off"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target type="isa-serial" port="0">
Sep 30 18:18:09 compute-1 nova_compute[238822]:         <model name="isa-serial"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       </target>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <console type="pty">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6/console.log" append="off"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <target type="serial" port="0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </console>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="usb" bus="0" port="1"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </input>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <input type="mouse" bus="ps2"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <listen type="address" address="::"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </graphics>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <video>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </video>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:18:09 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:18:09 compute-1 nova_compute[238822]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 18:18:09 compute-1 nova_compute[238822]: </domain>
Sep 30 18:18:09 compute-1 nova_compute[238822]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 18:18:09 compute-1 nova_compute[238822]: 2025-09-30 18:18:09.736 2 DEBUG nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 18:18:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:10 compute-1 nova_compute[238822]: 2025-09-30 18:18:10.022 2 DEBUG nova.network.neutron [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Updated VIF entry in instance network info cache for port 57967583-5fed-40cd-bc09-455163ece536. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 18:18:10 compute-1 nova_compute[238822]: 2025-09-30 18:18:10.023 2 DEBUG nova.network.neutron [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Updating instance_info_cache with network_info: [{"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:18:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:10 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:10.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:10 compute-1 nova_compute[238822]: 2025-09-30 18:18:10.223 2 DEBUG nova.virt.libvirt.migration [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Current None elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 18:18:10 compute-1 nova_compute[238822]: 2025-09-30 18:18:10.224 2 INFO nova.virt.libvirt.migration [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 18:18:10 compute-1 ceph-mon[75484]: pgmap v1186: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 9.1 KiB/s wr, 1 op/s
Sep 30 18:18:10 compute-1 nova_compute[238822]: 2025-09-30 18:18:10.530 2 DEBUG oslo_concurrency.lockutils [req-d8910c92-8429-43c3-900a-16c7520220ce req-c6f1d061-a9cc-422b-a110-0a1fa1e13bcd 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-7f660b4a-3177-4f85-985d-90a46be506e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:18:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:10 compute-1 nova_compute[238822]: 2025-09-30 18:18:10.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:10 compute-1 sshd-session[275909]: Connection closed by authenticating user root 192.210.160.141 port 53210 [preauth]
Sep 30 18:18:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:10 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:11 compute-1 nova_compute[238822]: 2025-09-30 18:18:11.243 2 INFO nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 18:18:11 compute-1 sudo[275977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:18:11 compute-1 sudo[275977]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:18:11 compute-1 sudo[275977]: pam_unix(sudo:session): session closed for user root
Sep 30 18:18:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:11.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:11 compute-1 nova_compute[238822]: 2025-09-30 18:18:11.747 2 DEBUG nova.virt.libvirt.migration [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Current 50 elapsed 2 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 18:18:11 compute-1 nova_compute[238822]: 2025-09-30 18:18:11.748 2 DEBUG nova.virt.libvirt.migration [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 18:18:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:12 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:12.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:12 compute-1 nova_compute[238822]: 2025-09-30 18:18:12.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:12 compute-1 nova_compute[238822]: 2025-09-30 18:18:12.250 2 DEBUG nova.virt.libvirt.migration [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Current 50 elapsed 3 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 18:18:12 compute-1 nova_compute[238822]: 2025-09-30 18:18:12.251 2 DEBUG nova.virt.libvirt.migration [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 18:18:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:12 compute-1 nova_compute[238822]: 2025-09-30 18:18:12.757 2 DEBUG nova.virt.libvirt.migration [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Current 50 elapsed 3 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 18:18:12 compute-1 nova_compute[238822]: 2025-09-30 18:18:12.758 2 DEBUG nova.virt.libvirt.migration [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 18:18:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:12 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:13 compute-1 kernel: tap57967583-5f (unregistering): left promiscuous mode
Sep 30 18:18:13 compute-1 NetworkManager[45549]: <info>  [1759256293.0293] device (tap57967583-5f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:18:13 compute-1 ovn_controller[135204]: 2025-09-30T18:18:13Z|00095|binding|INFO|Releasing lport 57967583-5fed-40cd-bc09-455163ece536 from this chassis (sb_readonly=0)
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 ovn_controller[135204]: 2025-09-30T18:18:13Z|00096|binding|INFO|Setting lport 57967583-5fed-40cd-bc09-455163ece536 down in Southbound
Sep 30 18:18:13 compute-1 ovn_controller[135204]: 2025-09-30T18:18:13Z|00097|binding|INFO|Removing iface tap57967583-5f ovn-installed in OVS
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.053 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:cc:b5 10.100.0.6'], port_security=['fa:16:3e:1d:cc:b5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0398922-aff5-46ba-afa7-58d09e28293c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7f660b4a-3177-4f85-985d-90a46be506e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd077ee2-d26f-4989-8ea7-4aecbac7c636', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eddde596e2d64cec889cb4c4d3642bc5', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'f566abc7-3fe4-4e56-86df-377c1571ec04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=afc38829-13e1-4bde-91a7-790387f17ce5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=57967583-5fed-40cd-bc09-455163ece536) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.055 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 57967583-5fed-40cd-bc09-455163ece536 in datapath cd077ee2-d26f-4989-8ea7-4aecbac7c636 unbound from our chassis
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.056 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cd077ee2-d26f-4989-8ea7-4aecbac7c636, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.058 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[80974987-6b87-4bc8-9a9a-a4e13ab65abd]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.059 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636 namespace which is not needed anymore
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Sep 30 18:18:13 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Consumed 15.646s CPU time.
Sep 30 18:18:13 compute-1 systemd-machined[195911]: Machine qemu-7-instance-0000000b terminated.
Sep 30 18:18:13 compute-1 virtqemud[239124]: Unable to get XATTR trusted.libvirt.security.ref_selinux on 7f660b4a-3177-4f85-985d-90a46be506e6_disk: No such file or directory
Sep 30 18:18:13 compute-1 virtqemud[239124]: Unable to get XATTR trusted.libvirt.security.ref_dac on 7f660b4a-3177-4f85-985d-90a46be506e6_disk: No such file or directory
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.165 2 DEBUG nova.compute.manager [req-110de66d-fde8-4034-8b59-0f3694556259 req-6835c130-145b-42f7-bb06-b4573da443ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-unplugged-57967583-5fed-40cd-bc09-455163ece536 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.166 2 DEBUG oslo_concurrency.lockutils [req-110de66d-fde8-4034-8b59-0f3694556259 req-6835c130-145b-42f7-bb06-b4573da443ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.166 2 DEBUG oslo_concurrency.lockutils [req-110de66d-fde8-4034-8b59-0f3694556259 req-6835c130-145b-42f7-bb06-b4573da443ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.166 2 DEBUG oslo_concurrency.lockutils [req-110de66d-fde8-4034-8b59-0f3694556259 req-6835c130-145b-42f7-bb06-b4573da443ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.167 2 DEBUG nova.compute.manager [req-110de66d-fde8-4034-8b59-0f3694556259 req-6835c130-145b-42f7-bb06-b4573da443ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] No waiting events found dispatching network-vif-unplugged-57967583-5fed-40cd-bc09-455163ece536 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.167 2 DEBUG nova.compute.manager [req-110de66d-fde8-4034-8b59-0f3694556259 req-6835c130-145b-42f7-bb06-b4573da443ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-unplugged-57967583-5fed-40cd-bc09-455163ece536 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.197 2 DEBUG nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.197 2 DEBUG nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.198 2 DEBUG nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 18:18:13 compute-1 neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636[275590]: [NOTICE]   (275594) : haproxy version is 3.0.5-8e879a5
Sep 30 18:18:13 compute-1 neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636[275590]: [NOTICE]   (275594) : path to executable is /usr/sbin/haproxy
Sep 30 18:18:13 compute-1 neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636[275590]: [WARNING]  (275594) : Exiting Master process...
Sep 30 18:18:13 compute-1 neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636[275590]: [ALERT]    (275594) : Current worker (275596) exited with code 143 (Terminated)
Sep 30 18:18:13 compute-1 neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636[275590]: [WARNING]  (275594) : All workers exited. Exiting... (0)
Sep 30 18:18:13 compute-1 podman[276039]: 2025-09-30 18:18:13.2564373 +0000 UTC m=+0.042074130 container kill 6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 18:18:13 compute-1 systemd[1]: libpod-6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c.scope: Deactivated successfully.
Sep 30 18:18:13 compute-1 ceph-mon[75484]: pgmap v1187: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 8.1 KiB/s wr, 1 op/s
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.261 2 DEBUG nova.virt.libvirt.guest [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '7f660b4a-3177-4f85-985d-90a46be506e6' (instance-0000000b) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.262 2 INFO nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Migration operation has completed
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.262 2 INFO nova.compute.manager [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] _post_live_migration() is started..
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.273 2 WARNING neutronclient.v2_0.client [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.273 2 WARNING neutronclient.v2_0.client [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:18:13 compute-1 podman[276057]: 2025-09-30 18:18:13.325859409 +0000 UTC m=+0.042812650 container died 6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 18:18:13 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c-userdata-shm.mount: Deactivated successfully.
Sep 30 18:18:13 compute-1 systemd[1]: var-lib-containers-storage-overlay-257e53896924943172c009234c41b7c62fc8771abba51e56de9cca94c25f99b0-merged.mount: Deactivated successfully.
Sep 30 18:18:13 compute-1 podman[276057]: 2025-09-30 18:18:13.381501104 +0000 UTC m=+0.098454315 container cleanup 6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 18:18:13 compute-1 systemd[1]: libpod-conmon-6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c.scope: Deactivated successfully.
Sep 30 18:18:13 compute-1 podman[276059]: 2025-09-30 18:18:13.407420506 +0000 UTC m=+0.109855884 container remove 6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.414 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6b172dc2-0334-4aef-baa2-a03eb0bd646e]: (4, ("Tue Sep 30 06:18:13 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636 (6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c)\n6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c\nTue Sep 30 06:18:13 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636 (6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c)\n6f9615fa6c6c034be2cfa6f5aa6f2daf0a0d4dcfb070f1a7dc55c0568a10863c\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.416 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d3221aa2-b74f-4aba-a2ea-a12324c0731d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.416 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd077ee2-d26f-4989-8ea7-4aecbac7c636.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd077ee2-d26f-4989-8ea7-4aecbac7c636.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.417 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[734654d4-38ee-4a76-8f90-22cfa01ac7a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.418 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd077ee2-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 kernel: tapcd077ee2-d0: left promiscuous mode
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.456 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[74c58460-cca3-4550-91a9-e006457c5094]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.485 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[48e3c08a-4c54-45ae-a706-026626327a1a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.486 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[1d2a8daa-ab51-4e8d-930f-625a1a242ab3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.511 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[9760e809-521c-4f2e-a9a1-8cfdb7968994]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1408544, 'reachable_time': 37745, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276089, 'error': None, 'target': 'ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.514 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cd077ee2-d26f-4989-8ea7-4aecbac7c636 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:18:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:13.514 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[ffd92e04-8be1-4ed0-8243-0f6dde222cb4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:18:13 compute-1 systemd[1]: run-netns-ovnmeta\x2dcd077ee2\x2dd26f\x2d4989\x2d8ea7\x2d4aecbac7c636.mount: Deactivated successfully.
Sep 30 18:18:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:18:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:13.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.695 2 DEBUG nova.network.neutron [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Activated binding for port 57967583-5fed-40cd-bc09-455163ece536 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.696 2 DEBUG nova.compute.manager [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.697 2 DEBUG nova.virt.libvirt.vif [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1361986283',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1361986283',id=11,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:17:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='eddde596e2d64cec889cb4c4d3642bc5',ramdisk_id='',reservation_id='r-reelvu4g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-1755756413',owner_user_name='tempest-TestExecuteBasicStrategy-1755756413-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:17:52Z,user_data=None,user_id='54973270e5a040c8af5ec2225e3caec8',uuid=7f660b4a-3177-4f85-985d-90a46be506e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.697 2 DEBUG nova.network.os_vif_util [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "57967583-5fed-40cd-bc09-455163ece536", "address": "fa:16:3e:1d:cc:b5", "network": {"id": "cd077ee2-d26f-4989-8ea7-4aecbac7c636", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1165176741-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d66b07a980744cd29ee547eb08500706", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57967583-5f", "ovs_interfaceid": "57967583-5fed-40cd-bc09-455163ece536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.698 2 DEBUG nova.network.os_vif_util [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:cc:b5,bridge_name='br-int',has_traffic_filtering=True,id=57967583-5fed-40cd-bc09-455163ece536,network=Network(cd077ee2-d26f-4989-8ea7-4aecbac7c636),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57967583-5f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.699 2 DEBUG os_vif [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:cc:b5,bridge_name='br-int',has_traffic_filtering=True,id=57967583-5fed-40cd-bc09-455163ece536,network=Network(cd077ee2-d26f-4989-8ea7-4aecbac7c636),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57967583-5f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.701 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57967583-5f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.706 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e34b8e02-2189-48d8-bcda-8092db881483) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.711 2 INFO os_vif [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:cc:b5,bridge_name='br-int',has_traffic_filtering=True,id=57967583-5fed-40cd-bc09-455163ece536,network=Network(cd077ee2-d26f-4989-8ea7-4aecbac7c636),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57967583-5f')
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.712 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.712 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.712 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.713 2 DEBUG nova.compute.manager [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.713 2 INFO nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Deleting instance files /var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6_del
Sep 30 18:18:13 compute-1 nova_compute[238822]: 2025-09-30 18:18:13.714 2 INFO nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Deletion of /var/lib/nova/instances/7f660b4a-3177-4f85-985d-90a46be506e6_del complete
Sep 30 18:18:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:14 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:14.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:14 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.231 2 DEBUG nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.232 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.233 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.233 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.234 2 DEBUG nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] No waiting events found dispatching network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.234 2 WARNING nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received unexpected event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 for instance with vm_state active and task_state migrating.
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.235 2 DEBUG nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-unplugged-57967583-5fed-40cd-bc09-455163ece536 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.235 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.236 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.236 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.237 2 DEBUG nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] No waiting events found dispatching network-vif-unplugged-57967583-5fed-40cd-bc09-455163ece536 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.237 2 DEBUG nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-unplugged-57967583-5fed-40cd-bc09-455163ece536 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.238 2 DEBUG nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-unplugged-57967583-5fed-40cd-bc09-455163ece536 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.238 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.239 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.239 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.240 2 DEBUG nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] No waiting events found dispatching network-vif-unplugged-57967583-5fed-40cd-bc09-455163ece536 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.240 2 DEBUG nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-unplugged-57967583-5fed-40cd-bc09-455163ece536 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.241 2 DEBUG nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.241 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.242 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.242 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.242 2 DEBUG nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] No waiting events found dispatching network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.243 2 WARNING nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received unexpected event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 for instance with vm_state active and task_state migrating.
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.243 2 DEBUG nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.243 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.244 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.244 2 DEBUG oslo_concurrency.lockutils [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.245 2 DEBUG nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] No waiting events found dispatching network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:18:15 compute-1 nova_compute[238822]: 2025-09-30 18:18:15.245 2 WARNING nova.compute.manager [req-f2d8a0dd-9a1c-4a52-800b-00af548f9117 req-bf80a768-867b-4871-b624-a5fea0df3a64 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Received unexpected event network-vif-plugged-57967583-5fed-40cd-bc09-455163ece536 for instance with vm_state active and task_state migrating.
Sep 30 18:18:15 compute-1 ceph-mon[75484]: pgmap v1188: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 10 KiB/s wr, 7 op/s
Sep 30 18:18:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:15.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:16 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c4009090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:16.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:16 compute-1 unix_chkpwd[276096]: password check failed for user (root)
Sep 30 18:18:16 compute-1 sshd-session[276092]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161  user=root
Sep 30 18:18:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:16 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:17 compute-1 nova_compute[238822]: 2025-09-30 18:18:17.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:17 compute-1 ceph-mon[75484]: pgmap v1189: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 4.6 KiB/s rd, 2.3 KiB/s wr, 5 op/s
Sep 30 18:18:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:17.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:18 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:18.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:18 compute-1 ceph-mon[75484]: pgmap v1190: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 4.6 KiB/s rd, 2.3 KiB/s wr, 5 op/s
Sep 30 18:18:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:18:18 compute-1 nova_compute[238822]: 2025-09-30 18:18:18.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:18 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:19 compute-1 sshd-session[276092]: Failed password for root from 216.10.242.161 port 55214 ssh2
Sep 30 18:18:19 compute-1 openstack_network_exporter[251957]: ERROR   18:18:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:18:19 compute-1 openstack_network_exporter[251957]: ERROR   18:18:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:18:19 compute-1 openstack_network_exporter[251957]: ERROR   18:18:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:18:19 compute-1 openstack_network_exporter[251957]: ERROR   18:18:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:18:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:18:19 compute-1 openstack_network_exporter[251957]: ERROR   18:18:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:18:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:18:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:19.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:19 compute-1 sshd-session[276092]: Received disconnect from 216.10.242.161 port 55214:11: Bye Bye [preauth]
Sep 30 18:18:19 compute-1 sshd-session[276092]: Disconnected from authenticating user root 216.10.242.161 port 55214 [preauth]
Sep 30 18:18:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:20 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:20.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:20 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:21 compute-1 ceph-mon[75484]: pgmap v1191: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 2.3 KiB/s wr, 5 op/s
Sep 30 18:18:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:21.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:22 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:22.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:22 compute-1 nova_compute[238822]: 2025-09-30 18:18:22.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:22 compute-1 nova_compute[238822]: 2025-09-30 18:18:22.260 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:22 compute-1 nova_compute[238822]: 2025-09-30 18:18:22.261 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:22 compute-1 nova_compute[238822]: 2025-09-30 18:18:22.261 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "7f660b4a-3177-4f85-985d-90a46be506e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:22 compute-1 sudo[276103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:18:22 compute-1 sudo[276103]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:18:22 compute-1 sudo[276103]: pam_unix(sudo:session): session closed for user root
Sep 30 18:18:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:22 compute-1 sudo[276135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Sep 30 18:18:22 compute-1 sudo[276135]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:18:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:22 compute-1 nova_compute[238822]: 2025-09-30 18:18:22.786 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:22 compute-1 nova_compute[238822]: 2025-09-30 18:18:22.787 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:22 compute-1 nova_compute[238822]: 2025-09-30 18:18:22.788 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:22 compute-1 nova_compute[238822]: 2025-09-30 18:18:22.788 2 DEBUG nova.compute.resource_tracker [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:18:22 compute-1 nova_compute[238822]: 2025-09-30 18:18:22.789 2 DEBUG oslo_concurrency.processutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:18:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:18:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.1 total, 600.0 interval
                                           Cumulative writes: 12K writes, 50K keys, 12K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
                                           Cumulative WAL: 12K writes, 3569 syncs, 3.64 writes per sync, written: 0.04 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4110 writes, 16K keys, 4110 commit groups, 1.0 writes per commit group, ingest: 20.48 MB, 0.03 MB/s
                                           Interval WAL: 4110 writes, 1498 syncs, 2.74 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Sep 30 18:18:22 compute-1 podman[276128]: 2025-09-30 18:18:22.805511729 +0000 UTC m=+0.121371096 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:18:22 compute-1 podman[276127]: 2025-09-30 18:18:22.842356626 +0000 UTC m=+0.162902689 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 18:18:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:22 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:23 compute-1 sudo[276135]: pam_unix(sudo:session): session closed for user root
Sep 30 18:18:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:18:23 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4051419468' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:18:23 compute-1 nova_compute[238822]: 2025-09-30 18:18:23.282 2 DEBUG oslo_concurrency.processutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:18:23 compute-1 ceph-mon[75484]: pgmap v1192: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 4.6 KiB/s rd, 2.3 KiB/s wr, 5 op/s
Sep 30 18:18:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:18:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:18:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:18:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:18:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:18:23 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4051419468' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:18:23 compute-1 sudo[276243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:18:23 compute-1 sudo[276243]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:18:23 compute-1 sudo[276243]: pam_unix(sudo:session): session closed for user root
Sep 30 18:18:23 compute-1 sudo[276271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:18:23 compute-1 sudo[276271]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:18:23 compute-1 nova_compute[238822]: 2025-09-30 18:18:23.483 2 WARNING nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:18:23 compute-1 nova_compute[238822]: 2025-09-30 18:18:23.484 2 DEBUG oslo_concurrency.processutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:18:23 compute-1 nova_compute[238822]: 2025-09-30 18:18:23.519 2 DEBUG oslo_concurrency.processutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:18:23 compute-1 nova_compute[238822]: 2025-09-30 18:18:23.520 2 DEBUG nova.compute.resource_tracker [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4705MB free_disk=39.90111541748047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:18:23 compute-1 nova_compute[238822]: 2025-09-30 18:18:23.520 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:23 compute-1 nova_compute[238822]: 2025-09-30 18:18:23.521 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:18:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:23.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:23 compute-1 nova_compute[238822]: 2025-09-30 18:18:23.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:24 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0c400a760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:24 compute-1 sudo[276271]: pam_unix(sudo:session): session closed for user root
Sep 30 18:18:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:24.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:18:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:18:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:18:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:18:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:18:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:18:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:18:24 compute-1 nova_compute[238822]: 2025-09-30 18:18:24.543 2 DEBUG nova.compute.resource_tracker [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Migration for instance 7f660b4a-3177-4f85-985d-90a46be506e6 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 18:18:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:24 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:25 compute-1 nova_compute[238822]: 2025-09-30 18:18:25.053 2 DEBUG nova.compute.resource_tracker [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 18:18:25 compute-1 nova_compute[238822]: 2025-09-30 18:18:25.094 2 DEBUG nova.compute.resource_tracker [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Migration 0a195b11-70ab-4f2d-bee5-ef368a35b00b is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 18:18:25 compute-1 nova_compute[238822]: 2025-09-30 18:18:25.094 2 DEBUG nova.compute.resource_tracker [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:18:25 compute-1 nova_compute[238822]: 2025-09-30 18:18:25.095 2 DEBUG nova.compute.resource_tracker [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:18:23 up  3:55,  0 user,  load average: 0.43, 0.73, 1.04\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:18:25 compute-1 nova_compute[238822]: 2025-09-30 18:18:25.136 2 DEBUG oslo_concurrency.processutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:18:25 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:25.307 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:18:25 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:25.308 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:18:25 compute-1 nova_compute[238822]: 2025-09-30 18:18:25.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:25 compute-1 ceph-mon[75484]: pgmap v1193: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 2.3 KiB/s wr, 5 op/s
Sep 30 18:18:25 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:18:25 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1562792068' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:18:25 compute-1 nova_compute[238822]: 2025-09-30 18:18:25.653 2 DEBUG oslo_concurrency.processutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:18:25 compute-1 nova_compute[238822]: 2025-09-30 18:18:25.661 2 DEBUG nova.compute.provider_tree [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:18:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:25.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:26 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:26 compute-1 nova_compute[238822]: 2025-09-30 18:18:26.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:18:26 compute-1 nova_compute[238822]: 2025-09-30 18:18:26.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:18:26 compute-1 nova_compute[238822]: 2025-09-30 18:18:26.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:18:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:26.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:26 compute-1 nova_compute[238822]: 2025-09-30 18:18:26.185 2 DEBUG nova.scheduler.client.report [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:18:26 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1562792068' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:18:26 compute-1 ceph-mon[75484]: pgmap v1194: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:18:26 compute-1 nova_compute[238822]: 2025-09-30 18:18:26.698 2 DEBUG nova.compute.resource_tracker [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:18:26 compute-1 nova_compute[238822]: 2025-09-30 18:18:26.700 2 DEBUG oslo_concurrency.lockutils [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.179s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:26 compute-1 nova_compute[238822]: 2025-09-30 18:18:26.723 2 INFO nova.compute.manager [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Sep 30 18:18:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:26 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:27 compute-1 nova_compute[238822]: 2025-09-30 18:18:27.059 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:18:27 compute-1 nova_compute[238822]: 2025-09-30 18:18:27.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:27.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:27 compute-1 nova_compute[238822]: 2025-09-30 18:18:27.880 2 INFO nova.scheduler.client.report [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Deleted allocation for migration 0a195b11-70ab-4f2d-bee5-ef368a35b00b
Sep 30 18:18:27 compute-1 nova_compute[238822]: 2025-09-30 18:18:27.881 2 DEBUG nova.virt.libvirt.driver [None req-812acc71-08bf-4d8e-9c71-e0d0e609aa7d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 7f660b4a-3177-4f85-985d-90a46be506e6] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 18:18:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:28.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:28.315 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:18:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:18:28 compute-1 nova_compute[238822]: 2025-09-30 18:18:28.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:28 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_46] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:29 compute-1 ceph-mon[75484]: pgmap v1195: 353 pgs: 353 active+clean; 200 MiB data, 326 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:18:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:29.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:30 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:30.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:30 compute-1 ceph-mon[75484]: pgmap v1196: 353 pgs: 353 active+clean; 121 MiB data, 286 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 9.2 KiB/s wr, 29 op/s
Sep 30 18:18:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:30 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:31 compute-1 sudo[276361]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:18:31 compute-1 sudo[276361]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:18:31 compute-1 sudo[276361]: pam_unix(sudo:session): session closed for user root
Sep 30 18:18:31 compute-1 nova_compute[238822]: 2025-09-30 18:18:31.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:18:31 compute-1 podman[276385]: 2025-09-30 18:18:31.142359975 +0000 UTC m=+0.090559692 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:18:31 compute-1 unix_chkpwd[276408]: password check failed for user (root)
Sep 30 18:18:31 compute-1 sshd-session[276405]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104  user=root
Sep 30 18:18:31 compute-1 sudo[276409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:18:31 compute-1 sudo[276409]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:18:31 compute-1 sudo[276409]: pam_unix(sudo:session): session closed for user root
Sep 30 18:18:31 compute-1 nova_compute[238822]: 2025-09-30 18:18:31.582 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:31 compute-1 nova_compute[238822]: 2025-09-30 18:18:31.582 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:31 compute-1 nova_compute[238822]: 2025-09-30 18:18:31.583 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:31 compute-1 nova_compute[238822]: 2025-09-30 18:18:31.583 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:18:31 compute-1 nova_compute[238822]: 2025-09-30 18:18:31.583 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:18:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:31.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:18:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:18:31 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3283538771' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:18:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:32 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_46] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:18:32 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1937590608' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:18:32 compute-1 nova_compute[238822]: 2025-09-30 18:18:32.176 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.592s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:18:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:32.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:32 compute-1 nova_compute[238822]: 2025-09-30 18:18:32.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:32 compute-1 nova_compute[238822]: 2025-09-30 18:18:32.422 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:18:32 compute-1 nova_compute[238822]: 2025-09-30 18:18:32.424 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:18:32 compute-1 nova_compute[238822]: 2025-09-30 18:18:32.475 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:18:32 compute-1 nova_compute[238822]: 2025-09-30 18:18:32.476 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4704MB free_disk=39.946632385253906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:18:32 compute-1 nova_compute[238822]: 2025-09-30 18:18:32.476 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:32 compute-1 nova_compute[238822]: 2025-09-30 18:18:32.477 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:32 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1937590608' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:18:32 compute-1 ceph-mon[75484]: pgmap v1197: 353 pgs: 353 active+clean; 121 MiB data, 286 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 9.2 KiB/s wr, 29 op/s
Sep 30 18:18:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:32 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:33 compute-1 sshd-session[276405]: Failed password for root from 107.172.146.104 port 46520 ssh2
Sep 30 18:18:33 compute-1 unix_chkpwd[276460]: password check failed for user (root)
Sep 30 18:18:33 compute-1 sshd-session[276407]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:18:33 compute-1 nova_compute[238822]: 2025-09-30 18:18:33.523 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:18:33 compute-1 nova_compute[238822]: 2025-09-30 18:18:33.523 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:18:32 up  3:55,  0 user,  load average: 0.37, 0.71, 1.03\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:18:33 compute-1 nova_compute[238822]: 2025-09-30 18:18:33.537 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:18:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:18:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:33.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:33 compute-1 nova_compute[238822]: 2025-09-30 18:18:33.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:18:33 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4084878060' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:18:34 compute-1 nova_compute[238822]: 2025-09-30 18:18:34.025 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:18:34 compute-1 nova_compute[238822]: 2025-09-30 18:18:34.035 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:18:34 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4084878060' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:18:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:34 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:34 compute-1 sshd-session[276405]: Received disconnect from 107.172.146.104 port 46520:11: Bye Bye [preauth]
Sep 30 18:18:34 compute-1 sshd-session[276405]: Disconnected from authenticating user root 107.172.146.104 port 46520 [preauth]
Sep 30 18:18:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:34.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:34 compute-1 nova_compute[238822]: 2025-09-30 18:18:34.547 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:18:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:34 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:35 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2856771135' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:18:35 compute-1 ceph-mon[75484]: pgmap v1198: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 10 KiB/s wr, 57 op/s
Sep 30 18:18:35 compute-1 nova_compute[238822]: 2025-09-30 18:18:35.062 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:18:35 compute-1 nova_compute[238822]: 2025-09-30 18:18:35.062 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.586s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:35 compute-1 sshd-session[276407]: Failed password for root from 192.210.160.141 port 42414 ssh2
Sep 30 18:18:35 compute-1 podman[249638]: time="2025-09-30T18:18:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:18:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:18:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38442 "" "Go-http-client/1.1"
Sep 30 18:18:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:18:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8829 "" "Go-http-client/1.1"
Sep 30 18:18:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:35.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:36 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_46] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4172187211' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:18:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:36.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:36 compute-1 podman[276487]: 2025-09-30 18:18:36.581609662 +0000 UTC m=+0.103016369 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4)
Sep 30 18:18:36 compute-1 podman[276488]: 2025-09-30 18:18:36.582582858 +0000 UTC m=+0.097853269 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 18:18:36 compute-1 podman[276489]: 2025-09-30 18:18:36.62034904 +0000 UTC m=+0.133684279 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true)
Sep 30 18:18:36 compute-1 sshd-session[276407]: Connection closed by authenticating user root 192.210.160.141 port 42414 [preauth]
Sep 30 18:18:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:36 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_46] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:37 compute-1 ceph-mon[75484]: pgmap v1199: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 10 KiB/s wr, 57 op/s
Sep 30 18:18:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3601153906' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:18:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3601153906' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:18:37 compute-1 nova_compute[238822]: 2025-09-30 18:18:37.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:37.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:38 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:18:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:38.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:18:38 compute-1 nova_compute[238822]: 2025-09-30 18:18:38.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:38 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:39 compute-1 nova_compute[238822]: 2025-09-30 18:18:39.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:18:39 compute-1 nova_compute[238822]: 2025-09-30 18:18:39.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:18:39 compute-1 nova_compute[238822]: 2025-09-30 18:18:39.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:18:39 compute-1 nova_compute[238822]: 2025-09-30 18:18:39.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:18:39 compute-1 ceph-mon[75484]: pgmap v1200: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 10 KiB/s wr, 57 op/s
Sep 30 18:18:39 compute-1 nova_compute[238822]: 2025-09-30 18:18:39.305 2 DEBUG nova.compute.manager [None req-f7da0ac9-72d7-4879-8e8c-bbe1fc99a823 e33f9dc9fbb84319b00517567fe4b47e 4e2dde567e5c4b1c9802c64cfc281b6d - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Sep 30 18:18:39 compute-1 nova_compute[238822]: 2025-09-30 18:18:39.377 2 DEBUG nova.compute.provider_tree [None req-f7da0ac9-72d7-4879-8e8c-bbe1fc99a823 e33f9dc9fbb84319b00517567fe4b47e 4e2dde567e5c4b1c9802c64cfc281b6d - - default default] Updating resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a generation from 11 to 14 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 18:18:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:39.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:40.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:40 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_46] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:41 compute-1 ceph-mon[75484]: pgmap v1201: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 10 KiB/s wr, 57 op/s
Sep 30 18:18:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:41.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:42 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:42.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:42 compute-1 nova_compute[238822]: 2025-09-30 18:18:42.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:42 compute-1 nova_compute[238822]: 2025-09-30 18:18:42.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:42 compute-1 ceph-mon[75484]: pgmap v1202: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:18:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:42 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:18:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:43.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:43 compute-1 nova_compute[238822]: 2025-09-30 18:18:43.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:44 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:18:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:44.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:18:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:44 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_46] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:45 compute-1 ceph-mon[75484]: pgmap v1203: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:18:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:45.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:46 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:46.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:46 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:47 compute-1 ceph-mon[75484]: pgmap v1204: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:18:47 compute-1 nova_compute[238822]: 2025-09-30 18:18:47.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:47.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:48 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:48.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:18:48 compute-1 nova_compute[238822]: 2025-09-30 18:18:48.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:48 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_46] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:49 compute-1 ceph-mon[75484]: pgmap v1205: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:18:49 compute-1 openstack_network_exporter[251957]: ERROR   18:18:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:18:49 compute-1 openstack_network_exporter[251957]: ERROR   18:18:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:18:49 compute-1 openstack_network_exporter[251957]: ERROR   18:18:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:18:49 compute-1 openstack_network_exporter[251957]: ERROR   18:18:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:18:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:18:49 compute-1 openstack_network_exporter[251957]: ERROR   18:18:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:18:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:18:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:49.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:50 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:50.140 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:2f:79 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-fb58e9fa-a47f-46f8-8fc6-4c39220a3c7c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb58e9fa-a47f-46f8-8fc6-4c39220a3c7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4967e90f79a346799e6308bad2720c19', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24ff55c2-8d35-4c40-8cdb-b69990431aae, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=db72a019-8f7d-4b59-bddf-b93d21d66f5c) old=Port_Binding(mac=['fa:16:3e:ce:2f:79'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-fb58e9fa-a47f-46f8-8fc6-4c39220a3c7c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb58e9fa-a47f-46f8-8fc6-4c39220a3c7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4967e90f79a346799e6308bad2720c19', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:18:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:50.141 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port db72a019-8f7d-4b59-bddf-b93d21d66f5c in datapath fb58e9fa-a47f-46f8-8fc6-4c39220a3c7c updated
Sep 30 18:18:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:50.141 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fb58e9fa-a47f-46f8-8fc6-4c39220a3c7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:18:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:50.142 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4e94139d-221e-4d3d-8f23-e376dfbd1d12]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:18:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:50.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:50 compute-1 ceph-mon[75484]: pgmap v1206: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:18:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:50 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:51 compute-1 sshd-session[276553]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:18:51 compute-1 sshd-session[276553]: banner exchange: Connection from 113.249.93.94 port 54350: Connection timed out
Sep 30 18:18:51 compute-1 sudo[276564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:18:51 compute-1 sudo[276564]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:18:51 compute-1 sudo[276564]: pam_unix(sudo:session): session closed for user root
Sep 30 18:18:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:51.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:52 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:52.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:52 compute-1 nova_compute[238822]: 2025-09-30 18:18:52.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:52 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_46] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:53 compute-1 ceph-mon[75484]: pgmap v1207: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:18:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:18:53 compute-1 podman[276592]: 2025-09-30 18:18:53.548917879 +0000 UTC m=+0.088821644 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:18:53 compute-1 podman[276591]: 2025-09-30 18:18:53.577890553 +0000 UTC m=+0.114505529 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 18:18:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:18:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:53.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:53 compute-1 nova_compute[238822]: 2025-09-30 18:18:53.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:54 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:54.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:54.369 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:18:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:54.370 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:18:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:54.370 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:18:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:54 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:55 compute-1 ceph-mon[75484]: pgmap v1208: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:18:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:55.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:56 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:56.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:56 compute-1 ceph-mon[75484]: pgmap v1209: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:18:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:56 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_46] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:57 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:57.158 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:c3:8a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-813545df-e959-4c8c-a60c-e9381ec1d1af', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-813545df-e959-4c8c-a60c-e9381ec1d1af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f6dd6f492f1f4129bcd0c59ee535a610', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dad94911-5503-46ce-bc71-e5a28f8991cf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d9446fba-5ece-445f-b9a1-52303f9bbf8d) old=Port_Binding(mac=['fa:16:3e:be:c3:8a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-813545df-e959-4c8c-a60c-e9381ec1d1af', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-813545df-e959-4c8c-a60c-e9381ec1d1af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f6dd6f492f1f4129bcd0c59ee535a610', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:18:57 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:57.159 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d9446fba-5ece-445f-b9a1-52303f9bbf8d in datapath 813545df-e959-4c8c-a60c-e9381ec1d1af updated
Sep 30 18:18:57 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:57.160 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 813545df-e959-4c8c-a60c-e9381ec1d1af, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:18:57 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:18:57.161 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[047f8ef2-236f-43d9-9ffe-64ad0b664627]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:18:57 compute-1 unix_chkpwd[276650]: password check failed for user (root)
Sep 30 18:18:57 compute-1 sshd-session[276645]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65  user=root
Sep 30 18:18:57 compute-1 nova_compute[238822]: 2025-09-30 18:18:57.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3555197970' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:18:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3555197970' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:18:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:57.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:58 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:18:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:18:58.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:18:58 compute-1 ceph-mon[75484]: pgmap v1210: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:18:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:18:58 compute-1 nova_compute[238822]: 2025-09-30 18:18:58.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:18:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:58 compute-1 sshd-session[276645]: Failed password for root from 194.107.115.65 port 33040 ssh2
Sep 30 18:18:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:18:58 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:18:59 compute-1 sshd-session[276648]: Invalid user backups from 192.210.160.141 port 57578
Sep 30 18:18:59 compute-1 sshd-session[276648]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:18:59 compute-1 sshd-session[276648]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:18:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:18:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:18:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:18:59.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:18:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:18:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:18:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:18:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:00 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:00 compute-1 sshd-session[276645]: Received disconnect from 194.107.115.65 port 33040:11: Bye Bye [preauth]
Sep 30 18:19:00 compute-1 sshd-session[276645]: Disconnected from authenticating user root 194.107.115.65 port 33040 [preauth]
Sep 30 18:19:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:00.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:00 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_46] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002ae0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:01 compute-1 unix_chkpwd[276657]: password check failed for user (root)
Sep 30 18:19:01 compute-1 sshd-session[276649]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=23.137.255.140  user=root
Sep 30 18:19:01 compute-1 ceph-mon[75484]: pgmap v1211: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:19:01 compute-1 sshd-session[276648]: Failed password for invalid user backups from 192.210.160.141 port 57578 ssh2
Sep 30 18:19:01 compute-1 podman[276658]: 2025-09-30 18:19:01.55777528 +0000 UTC m=+0.096316027 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Sep 30 18:19:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:01.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:02 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:02.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:02 compute-1 nova_compute[238822]: 2025-09-30 18:19:02.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:02 compute-1 sshd-session[276648]: Connection closed by invalid user backups 192.210.160.141 port 57578 [preauth]
Sep 30 18:19:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:02 compute-1 sshd-session[276649]: Failed password for root from 23.137.255.140 port 2350 ssh2
Sep 30 18:19:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:02 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:03 compute-1 ceph-mon[75484]: pgmap v1212: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:19:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:19:03 compute-1 nova_compute[238822]: 2025-09-30 18:19:03.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:03.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:04 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:04.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:04 compute-1 sshd-session[276649]: Received disconnect from 23.137.255.140 port 2350:11: Bye Bye [preauth]
Sep 30 18:19:04 compute-1 sshd-session[276649]: Disconnected from authenticating user root 23.137.255.140 port 2350 [preauth]
Sep 30 18:19:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:04 compute-1 unix_chkpwd[276683]: password check failed for user (root)
Sep 30 18:19:04 compute-1 sshd-session[276679]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110  user=root
Sep 30 18:19:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:04 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_46] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0b8002ae0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:05 compute-1 ceph-mon[75484]: pgmap v1213: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:19:05 compute-1 podman[249638]: time="2025-09-30T18:19:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:19:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:19:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38442 "" "Go-http-client/1.1"
Sep 30 18:19:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:19:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8826 "" "Go-http-client/1.1"
Sep 30 18:19:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:05.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:06 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:06.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:06 compute-1 ceph-mon[75484]: pgmap v1214: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:19:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:06 compute-1 sshd-session[276679]: Failed password for root from 14.225.167.110 port 59180 ssh2
Sep 30 18:19:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:06 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:07 compute-1 nova_compute[238822]: 2025-09-30 18:19:07.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:19:07 compute-1 podman[276688]: 2025-09-30 18:19:07.563866057 +0000 UTC m=+0.101379105 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:19:07 compute-1 podman[276689]: 2025-09-30 18:19:07.573928089 +0000 UTC m=+0.100141571 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, version=9.6, io.openshift.tags=minimal rhel9)
Sep 30 18:19:07 compute-1 podman[276690]: 2025-09-30 18:19:07.582789659 +0000 UTC m=+0.105497616 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 18:19:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:07.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:07 compute-1 sshd-session[276679]: Received disconnect from 14.225.167.110 port 59180:11: Bye Bye [preauth]
Sep 30 18:19:07 compute-1 sshd-session[276679]: Disconnected from authenticating user root 14.225.167.110 port 59180 [preauth]
Sep 30 18:19:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:08 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0bc002900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:08.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:08 compute-1 sshd-session[276686]: Invalid user unidata from 84.51.43.58 port 36286
Sep 30 18:19:08 compute-1 sshd-session[276686]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:19:08 compute-1 sshd-session[276686]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:19:08 compute-1 ceph-mon[75484]: pgmap v1215: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:19:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:19:08 compute-1 nova_compute[238822]: 2025-09-30 18:19:08.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:08 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:09.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:09 compute-1 sshd-session[276686]: Failed password for invalid user unidata from 84.51.43.58 port 36286 ssh2
Sep 30 18:19:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:10 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:10.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:10 compute-1 sshd-session[276686]: Received disconnect from 84.51.43.58 port 36286:11: Bye Bye [preauth]
Sep 30 18:19:10 compute-1 sshd-session[276686]: Disconnected from invalid user unidata 84.51.43.58 port 36286 [preauth]
Sep 30 18:19:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:10 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:11 compute-1 ceph-mon[75484]: pgmap v1216: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:19:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:11.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:11 compute-1 sudo[276756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:19:11 compute-1 sudo[276756]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:19:11 compute-1 sudo[276756]: pam_unix(sudo:session): session closed for user root
Sep 30 18:19:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:12 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:12.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:12 compute-1 nova_compute[238822]: 2025-09-30 18:19:12.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:12 compute-1 unix_chkpwd[276782]: password check failed for user (root)
Sep 30 18:19:12 compute-1 sshd-session[276754]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:19:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:12 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:13 compute-1 ceph-mon[75484]: pgmap v1217: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:19:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:19:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:13.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:13 compute-1 nova_compute[238822]: 2025-09-30 18:19:13.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:14 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:14.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:14 compute-1 ceph-mon[75484]: pgmap v1218: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:19:14 compute-1 sshd-session[276754]: Failed password for root from 175.126.165.170 port 33160 ssh2
Sep 30 18:19:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:14 compute-1 ovn_controller[135204]: 2025-09-30T18:19:14Z|00098|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 18:19:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:14 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe08c0012b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:15 compute-1 sshd-session[276754]: Received disconnect from 175.126.165.170 port 33160:11: Bye Bye [preauth]
Sep 30 18:19:15 compute-1 sshd-session[276754]: Disconnected from authenticating user root 175.126.165.170 port 33160 [preauth]
Sep 30 18:19:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:15.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:16 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe094002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:16.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:16 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe098002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:17 compute-1 ceph-mon[75484]: pgmap v1219: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:19:17 compute-1 nova_compute[238822]: 2025-09-30 18:19:17.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:17.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[267465]: 30/09/2025 18:19:18 : epoch 68dc1cd8 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe0a0002830 fd 38 proxy ignored for local
Sep 30 18:19:18 compute-1 kernel: ganesha.nfsd[276750]: segfault at 50 ip 00007fe16ea9532e sp 00007fe13cff8210 error 4 in libntirpc.so.5.8[7fe16ea7a000+2c000] likely on CPU 7 (core 0, socket 7)
Sep 30 18:19:18 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 18:19:18 compute-1 systemd[1]: Started Process Core Dump (PID 276789/UID 0).
Sep 30 18:19:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:18.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:19:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:18 compute-1 nova_compute[238822]: 2025-09-30 18:19:18.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:19 compute-1 systemd-coredump[276790]: Process 267470 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 95:
                                                    #0  0x00007fe16ea9532e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Sep 30 18:19:19 compute-1 ceph-mon[75484]: pgmap v1220: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:19:19 compute-1 systemd[1]: systemd-coredump@11-276789-0.service: Deactivated successfully.
Sep 30 18:19:19 compute-1 systemd[1]: systemd-coredump@11-276789-0.service: Consumed 1.158s CPU time.
Sep 30 18:19:19 compute-1 openstack_network_exporter[251957]: ERROR   18:19:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:19:19 compute-1 openstack_network_exporter[251957]: ERROR   18:19:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:19:19 compute-1 openstack_network_exporter[251957]: ERROR   18:19:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:19:19 compute-1 podman[276796]: 2025-09-30 18:19:19.43238725 +0000 UTC m=+0.054118146 container died 80d1f8374266f2928b4c61649ec796d24345aff1c2e4693c6168e3504f66733b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 18:19:19 compute-1 openstack_network_exporter[251957]: ERROR   18:19:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:19:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:19:19 compute-1 openstack_network_exporter[251957]: ERROR   18:19:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:19:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:19:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-3c1eb72020226e1400eb74a29d645c88870f63c2b1be10ba3c59ff3cbc2fb979-merged.mount: Deactivated successfully.
Sep 30 18:19:19 compute-1 podman[276796]: 2025-09-30 18:19:19.616688217 +0000 UTC m=+0.238419093 container remove 80d1f8374266f2928b4c61649ec796d24345aff1c2e4693c6168e3504f66733b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 18:19:19 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 18:19:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:19.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:19 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 18:19:19 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 2.669s CPU time.
Sep 30 18:19:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:20.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:20 compute-1 ceph-mon[75484]: pgmap v1221: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:19:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:21.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:22.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:22 compute-1 nova_compute[238822]: 2025-09-30 18:19:22.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:22 compute-1 unix_chkpwd[276843]: password check failed for user (root)
Sep 30 18:19:22 compute-1 sshd-session[276840]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161  user=root
Sep 30 18:19:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:23 compute-1 ceph-mon[75484]: pgmap v1222: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:19:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:19:23 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Sep 30 18:19:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:19:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:23.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:23 compute-1 podman[276848]: 2025-09-30 18:19:23.757256332 +0000 UTC m=+0.086662226 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:19:23 compute-1 nova_compute[238822]: 2025-09-30 18:19:23.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:23 compute-1 podman[276847]: 2025-09-30 18:19:23.785021704 +0000 UTC m=+0.126446993 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:19:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/181924 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 18:19:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:24.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:24 compute-1 sshd-session[276840]: Failed password for root from 216.10.242.161 port 37938 ssh2
Sep 30 18:19:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:25 compute-1 unix_chkpwd[276894]: password check failed for user (root)
Sep 30 18:19:25 compute-1 sshd-session[276845]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:19:25 compute-1 ceph-mon[75484]: pgmap v1223: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:19:25 compute-1 sshd-session[276840]: Received disconnect from 216.10.242.161 port 37938:11: Bye Bye [preauth]
Sep 30 18:19:25 compute-1 sshd-session[276840]: Disconnected from authenticating user root 216.10.242.161 port 37938 [preauth]
Sep 30 18:19:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:25.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:26 compute-1 nova_compute[238822]: 2025-09-30 18:19:26.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:19:26 compute-1 nova_compute[238822]: 2025-09-30 18:19:26.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:19:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:26.264 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:19:26 compute-1 nova_compute[238822]: 2025-09-30 18:19:26.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:26.266 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:19:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:26.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:26 compute-1 sshd-session[276896]: Invalid user foundry from 107.172.146.104 port 39088
Sep 30 18:19:26 compute-1 sshd-session[276896]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:19:26 compute-1 sshd-session[276896]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.146.104
Sep 30 18:19:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:26 compute-1 sshd-session[276845]: Failed password for root from 192.210.160.141 port 46068 ssh2
Sep 30 18:19:27 compute-1 nova_compute[238822]: 2025-09-30 18:19:27.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:19:27 compute-1 ceph-mon[75484]: pgmap v1224: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:19:27 compute-1 nova_compute[238822]: 2025-09-30 18:19:27.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:27.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:28 compute-1 nova_compute[238822]: 2025-09-30 18:19:28.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:19:28 compute-1 sshd-session[276845]: Connection closed by authenticating user root 192.210.160.141 port 46068 [preauth]
Sep 30 18:19:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:28.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:28 compute-1 ceph-mon[75484]: pgmap v1225: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:19:28 compute-1 sshd-session[276896]: Failed password for invalid user foundry from 107.172.146.104 port 39088 ssh2
Sep 30 18:19:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:19:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:28.686 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:7f:4d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-443be7ca-f628-4a45-95b6-620d37172d7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '269f60e72ce1460a98da519466c89da6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=96eb21b8-879c-4e72-963b-37e37ae3d0c5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=031d2cff-b142-4423-ba99-772183b7a667) old=Port_Binding(mac=['fa:16:3e:c4:7f:4d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-443be7ca-f628-4a45-95b6-620d37172d7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '269f60e72ce1460a98da519466c89da6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:19:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:28.687 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 031d2cff-b142-4423-ba99-772183b7a667 in datapath 443be7ca-f628-4a45-95b6-620d37172d7b updated
Sep 30 18:19:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:28.689 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 443be7ca-f628-4a45-95b6-620d37172d7b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:19:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:28.690 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[eb426038-c037-4d5d-9481-e089aeebf89f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:19:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:28 compute-1 nova_compute[238822]: 2025-09-30 18:19:28.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:29.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:29 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 12.
Sep 30 18:19:29 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 18:19:29 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 2.669s CPU time.
Sep 30 18:19:29 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 18:19:30 compute-1 sshd-session[276896]: Received disconnect from 107.172.146.104 port 39088:11: Bye Bye [preauth]
Sep 30 18:19:30 compute-1 sshd-session[276896]: Disconnected from invalid user foundry 107.172.146.104 port 39088 [preauth]
Sep 30 18:19:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:30.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:30 compute-1 podman[276949]: 2025-09-30 18:19:30.316692091 +0000 UTC m=+0.071312651 container create af1980dec28c276e20bfb40094eaf89ce54439d134ca84ea11e889fc0235989a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Sep 30 18:19:30 compute-1 podman[276949]: 2025-09-30 18:19:30.276994747 +0000 UTC m=+0.031615367 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 18:19:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3dbd792853c30aec2f71562e426b9a8ca8c04825371ca80f454c5cd0c700dc09/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 18:19:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3dbd792853c30aec2f71562e426b9a8ca8c04825371ca80f454c5cd0c700dc09/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 18:19:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3dbd792853c30aec2f71562e426b9a8ca8c04825371ca80f454c5cd0c700dc09/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 18:19:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3dbd792853c30aec2f71562e426b9a8ca8c04825371ca80f454c5cd0c700dc09/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 18:19:30 compute-1 podman[276949]: 2025-09-30 18:19:30.422675239 +0000 UTC m=+0.177295859 container init af1980dec28c276e20bfb40094eaf89ce54439d134ca84ea11e889fc0235989a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Sep 30 18:19:30 compute-1 podman[276949]: 2025-09-30 18:19:30.430294325 +0000 UTC m=+0.184914885 container start af1980dec28c276e20bfb40094eaf89ce54439d134ca84ea11e889fc0235989a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Sep 30 18:19:30 compute-1 bash[276949]: af1980dec28c276e20bfb40094eaf89ce54439d134ca84ea11e889fc0235989a
Sep 30 18:19:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:30 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 18:19:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:30 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 18:19:30 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 18:19:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:30 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 18:19:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:30 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 18:19:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:30 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 18:19:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:30 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 18:19:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:30 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 18:19:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:30 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 18:19:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:31 compute-1 nova_compute[238822]: 2025-09-30 18:19:31.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:19:31 compute-1 sudo[277007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:19:31 compute-1 sudo[277007]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:19:31 compute-1 sudo[277007]: pam_unix(sudo:session): session closed for user root
Sep 30 18:19:31 compute-1 ceph-mon[75484]: pgmap v1226: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:19:31 compute-1 sudo[277032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:19:31 compute-1 sudo[277032]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:19:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:31.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:31 compute-1 sudo[277074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:19:31 compute-1 sudo[277074]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:19:31 compute-1 sudo[277074]: pam_unix(sudo:session): session closed for user root
Sep 30 18:19:31 compute-1 podman[277101]: 2025-09-30 18:19:31.982731764 +0000 UTC m=+0.086854141 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 18:19:32 compute-1 sudo[277032]: pam_unix(sudo:session): session closed for user root
Sep 30 18:19:32 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:32.269 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:19:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:32.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:32 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2976751412' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:19:32 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:19:32 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:19:32 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:19:32 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:19:32 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:19:32 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:19:32 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:19:32 compute-1 nova_compute[238822]: 2025-09-30 18:19:32.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:32 compute-1 nova_compute[238822]: 2025-09-30 18:19:32.563 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:19:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:33 compute-1 nova_compute[238822]: 2025-09-30 18:19:33.077 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:19:33 compute-1 nova_compute[238822]: 2025-09-30 18:19:33.077 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:19:33 compute-1 nova_compute[238822]: 2025-09-30 18:19:33.078 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:19:33 compute-1 nova_compute[238822]: 2025-09-30 18:19:33.078 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:19:33 compute-1 nova_compute[238822]: 2025-09-30 18:19:33.079 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:19:33 compute-1 ceph-mon[75484]: pgmap v1227: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:19:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2659287100' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:19:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:19:33 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/299869197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:19:33 compute-1 nova_compute[238822]: 2025-09-30 18:19:33.601 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:19:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:19:33 compute-1 nova_compute[238822]: 2025-09-30 18:19:33.752 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:19:33 compute-1 nova_compute[238822]: 2025-09-30 18:19:33.753 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:19:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:33.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:33 compute-1 nova_compute[238822]: 2025-09-30 18:19:33.774 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:19:33 compute-1 nova_compute[238822]: 2025-09-30 18:19:33.775 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4740MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:19:33 compute-1 nova_compute[238822]: 2025-09-30 18:19:33.775 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:19:33 compute-1 nova_compute[238822]: 2025-09-30 18:19:33.775 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:19:33 compute-1 nova_compute[238822]: 2025-09-30 18:19:33.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:34.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:34 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/299869197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:19:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:34 compute-1 nova_compute[238822]: 2025-09-30 18:19:34.823 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:19:34 compute-1 nova_compute[238822]: 2025-09-30 18:19:34.824 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:19:33 up  3:56,  0 user,  load average: 0.21, 0.59, 0.97\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:19:34 compute-1 nova_compute[238822]: 2025-09-30 18:19:34.838 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:19:35 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:19:35 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/293762595' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:19:35 compute-1 nova_compute[238822]: 2025-09-30 18:19:35.322 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:19:35 compute-1 ceph-mon[75484]: pgmap v1228: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:19:35 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/293762595' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:19:35 compute-1 nova_compute[238822]: 2025-09-30 18:19:35.328 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:19:35 compute-1 unix_chkpwd[277185]: password check failed for user (root)
Sep 30 18:19:35 compute-1 sshd-session[277160]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105  user=root
Sep 30 18:19:35 compute-1 podman[249638]: time="2025-09-30T18:19:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:19:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:19:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38442 "" "Go-http-client/1.1"
Sep 30 18:19:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:19:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8816 "" "Go-http-client/1.1"
Sep 30 18:19:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:35.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:35 compute-1 nova_compute[238822]: 2025-09-30 18:19:35.836 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:19:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:36.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:36 compute-1 nova_compute[238822]: 2025-09-30 18:19:36.345 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:19:36 compute-1 nova_compute[238822]: 2025-09-30 18:19:36.345 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.570s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:19:36 compute-1 ceph-mon[75484]: pgmap v1229: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:19:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:36.369 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:65:37 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e6c80bed-2509-4221-8bbf-987d2791d74d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6c80bed-2509-4221-8bbf-987d2791d74d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af4ef07c582847419a03275af50c6ffc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e767dc28-6aa0-411a-9e2c-d95e473b8f79, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b7aad425-853a-4eec-bc20-e21ec9b6e1a6) old=Port_Binding(mac=['fa:16:3e:30:65:37'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e6c80bed-2509-4221-8bbf-987d2791d74d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6c80bed-2509-4221-8bbf-987d2791d74d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af4ef07c582847419a03275af50c6ffc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:19:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:36.370 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b7aad425-853a-4eec-bc20-e21ec9b6e1a6 in datapath e6c80bed-2509-4221-8bbf-987d2791d74d updated
Sep 30 18:19:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:36.371 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e6c80bed-2509-4221-8bbf-987d2791d74d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:19:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:36.372 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2250dc6c-b471-451a-8a9b-415bde5f44ed]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:19:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:36 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 18:19:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:36 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 18:19:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:36 compute-1 nova_compute[238822]: 2025-09-30 18:19:36.834 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:19:36 compute-1 nova_compute[238822]: 2025-09-30 18:19:36.835 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:19:37 compute-1 sudo[277189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:19:37 compute-1 sudo[277189]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:19:37 compute-1 sudo[277189]: pam_unix(sudo:session): session closed for user root
Sep 30 18:19:37 compute-1 nova_compute[238822]: 2025-09-30 18:19:37.343 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:19:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/84034973' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:19:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/84034973' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:19:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:19:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:19:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:19:37 compute-1 nova_compute[238822]: 2025-09-30 18:19:37.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:37.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:37 compute-1 sshd-session[277160]: Failed password for root from 103.153.190.105 port 49558 ssh2
Sep 30 18:19:38 compute-1 nova_compute[238822]: 2025-09-30 18:19:38.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:19:38 compute-1 nova_compute[238822]: 2025-09-30 18:19:38.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:19:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:38.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:38 compute-1 nova_compute[238822]: 2025-09-30 18:19:38.343 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:19:38 compute-1 ceph-mon[75484]: pgmap v1230: 353 pgs: 353 active+clean; 41 MiB data, 234 MiB used, 40 GiB / 40 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:19:38 compute-1 podman[277216]: 2025-09-30 18:19:38.566826281 +0000 UTC m=+0.096214394 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, version=9.6, release=1755695350, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7)
Sep 30 18:19:38 compute-1 podman[277217]: 2025-09-30 18:19:38.571021795 +0000 UTC m=+0.091777335 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:19:38 compute-1 sshd-session[277160]: Received disconnect from 103.153.190.105 port 49558:11: Bye Bye [preauth]
Sep 30 18:19:38 compute-1 sshd-session[277160]: Disconnected from authenticating user root 103.153.190.105 port 49558 [preauth]
Sep 30 18:19:38 compute-1 podman[277215]: 2025-09-30 18:19:38.600096112 +0000 UTC m=+0.132924538 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 18:19:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:19:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:38 compute-1 nova_compute[238822]: 2025-09-30 18:19:38.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:39.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:40 compute-1 nova_compute[238822]: 2025-09-30 18:19:40.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:19:40 compute-1 nova_compute[238822]: 2025-09-30 18:19:40.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 18:19:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:40.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:40 compute-1 nova_compute[238822]: 2025-09-30 18:19:40.565 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 18:19:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:41 compute-1 ceph-mon[75484]: pgmap v1231: 353 pgs: 353 active+clean; 41 MiB data, 233 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:19:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:41.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:42.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:42 compute-1 nova_compute[238822]: 2025-09-30 18:19:42.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:43 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5abc000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:43 compute-1 ceph-mon[75484]: pgmap v1232: 353 pgs: 353 active+clean; 41 MiB data, 233 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:19:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:19:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:43.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:43 compute-1 nova_compute[238822]: 2025-09-30 18:19:43.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:44 compute-1 nova_compute[238822]: 2025-09-30 18:19:44.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:19:44 compute-1 nova_compute[238822]: 2025-09-30 18:19:44.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 18:19:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:44 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5aac001970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:44.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:44 compute-1 ceph-mon[75484]: pgmap v1233: 353 pgs: 353 active+clean; 41 MiB data, 233 MiB used, 40 GiB / 40 GiB avail; 1.7 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 18:19:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:45 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:45.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/181946 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 18:19:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:46 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:46.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:47 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:47 compute-1 ceph-mon[75484]: pgmap v1234: 353 pgs: 353 active+clean; 41 MiB data, 233 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:19:47 compute-1 nova_compute[238822]: 2025-09-30 18:19:47.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:47.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:48 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5aac002290 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:48.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1814289124' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:19:48 compute-1 ceph-mon[75484]: pgmap v1235: 353 pgs: 353 active+clean; 41 MiB data, 233 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:19:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:19:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:48 compute-1 nova_compute[238822]: 2025-09-30 18:19:48.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:49 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:49 compute-1 openstack_network_exporter[251957]: ERROR   18:19:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:19:49 compute-1 openstack_network_exporter[251957]: ERROR   18:19:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:19:49 compute-1 openstack_network_exporter[251957]: ERROR   18:19:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:19:49 compute-1 openstack_network_exporter[251957]: ERROR   18:19:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:19:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:19:49 compute-1 openstack_network_exporter[251957]: ERROR   18:19:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:19:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:19:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:49.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:50 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:50.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.765248) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256390765293, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2361, "num_deletes": 505, "total_data_size": 5322141, "memory_usage": 5402296, "flush_reason": "Manual Compaction"}
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256390782024, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 2098906, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33679, "largest_seqno": 36035, "table_properties": {"data_size": 2092011, "index_size": 3200, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 20774, "raw_average_key_size": 19, "raw_value_size": 2074751, "raw_average_value_size": 1977, "num_data_blocks": 144, "num_entries": 1049, "num_filter_entries": 1049, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759256204, "oldest_key_time": 1759256204, "file_creation_time": 1759256390, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 16837 microseconds, and 9916 cpu microseconds.
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.782083) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 2098906 bytes OK
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.782109) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.784208) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.784231) EVENT_LOG_v1 {"time_micros": 1759256390784224, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.784253) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 5310653, prev total WAL file size 5310653, number of live WAL files 2.
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.786481) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303032' seq:72057594037927935, type:22 .. '6D6772737461740031323533' seq:0, type:0; will stop at (end)
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(2049KB)], [63(12MB)]
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256390786540, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 14704875, "oldest_snapshot_seqno": -1}
Sep 30 18:19:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6106 keys, 12088244 bytes, temperature: kUnknown
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256390860597, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 12088244, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12049358, "index_size": 22520, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15301, "raw_key_size": 156871, "raw_average_key_size": 25, "raw_value_size": 11941148, "raw_average_value_size": 1955, "num_data_blocks": 909, "num_entries": 6106, "num_filter_entries": 6106, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759256390, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.860943) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 12088244 bytes
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.862479) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 198.2 rd, 162.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 12.0 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(12.8) write-amplify(5.8) OK, records in: 7045, records dropped: 939 output_compression: NoCompression
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.862507) EVENT_LOG_v1 {"time_micros": 1759256390862493, "job": 38, "event": "compaction_finished", "compaction_time_micros": 74184, "compaction_time_cpu_micros": 49453, "output_level": 6, "num_output_files": 1, "total_output_size": 12088244, "num_input_records": 7045, "num_output_records": 6106, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256390863276, "job": 38, "event": "table_file_deletion", "file_number": 65}
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256390867434, "job": 38, "event": "table_file_deletion", "file_number": 63}
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.786389) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.867573) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.867582) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.867587) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.867591) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:19:50 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:19:50.867595) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:19:50 compute-1 unix_chkpwd[277309]: password check failed for user (root)
Sep 30 18:19:50 compute-1 sshd-session[277305]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:19:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:51 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab80025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:51 compute-1 ceph-mon[75484]: pgmap v1236: 353 pgs: 353 active+clean; 41 MiB data, 233 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:19:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:51.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:52 compute-1 sudo[277310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:19:52 compute-1 sudo[277310]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:19:52 compute-1 sudo[277310]: pam_unix(sudo:session): session closed for user root
Sep 30 18:19:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:52 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5aac002290 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:52.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:19:52 compute-1 nova_compute[238822]: 2025-09-30 18:19:52.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:52 compute-1 sshd-session[277305]: Failed password for root from 192.210.160.141 port 48680 ssh2
Sep 30 18:19:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:52 compute-1 sshd-session[277337]: Invalid user jenkins from 167.71.248.239 port 32778
Sep 30 18:19:52 compute-1 sshd-session[277337]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:19:52 compute-1 sshd-session[277337]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.71.248.239
Sep 30 18:19:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:53 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:53 compute-1 ceph-mon[75484]: pgmap v1237: 353 pgs: 353 active+clean; 41 MiB data, 233 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:19:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:19:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:53 compute-1 nova_compute[238822]: 2025-09-30 18:19:53.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:53.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:54 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:54 compute-1 sshd-session[277305]: Connection closed by authenticating user root 192.210.160.141 port 48680 [preauth]
Sep 30 18:19:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:54.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:54 compute-1 ceph-mon[75484]: pgmap v1238: 353 pgs: 353 active+clean; 88 MiB data, 254 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:19:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:54.371 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:19:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:54.371 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:19:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:19:54.372 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:19:54 compute-1 podman[277342]: 2025-09-30 18:19:54.555945028 +0000 UTC m=+0.092107523 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:19:54 compute-1 podman[277341]: 2025-09-30 18:19:54.651107203 +0000 UTC m=+0.191607586 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest)
Sep 30 18:19:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:55 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab80025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:55 compute-1 sshd-session[277337]: Failed password for invalid user jenkins from 167.71.248.239 port 32778 ssh2
Sep 30 18:19:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:55.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:56 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab80025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:56 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2871397625' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:19:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:56.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:57 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a940019e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:57 compute-1 sshd-session[277301]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:19:57 compute-1 sshd-session[277301]: banner exchange: Connection from 113.249.93.94 port 4246: Connection timed out
Sep 30 18:19:57 compute-1 ceph-mon[75484]: pgmap v1239: 353 pgs: 353 active+clean; 88 MiB data, 254 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:19:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3463357897' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:19:57 compute-1 sshd-session[277337]: Connection closed by invalid user jenkins 167.71.248.239 port 32778 [preauth]
Sep 30 18:19:57 compute-1 nova_compute[238822]: 2025-09-30 18:19:57.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:57.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:58 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3492562645' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:19:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3492562645' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:19:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:19:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:19:58.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:19:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:19:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:58 compute-1 nova_compute[238822]: 2025-09-30 18:19:58.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:19:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:19:59 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab80025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:19:59 compute-1 ceph-mon[75484]: pgmap v1240: 353 pgs: 353 active+clean; 88 MiB data, 254 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:19:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:19:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:19:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:19:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:19:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:19:59.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:19:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:19:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:00 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab80025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:00 compute-1 ceph-mon[75484]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Sep 30 18:20:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:00.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:01 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a940019e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:01 compute-1 ceph-mon[75484]: pgmap v1241: 353 pgs: 353 active+clean; 88 MiB data, 254 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:20:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:01.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:02 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:02.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:02 compute-1 ceph-mon[75484]: pgmap v1242: 353 pgs: 353 active+clean; 88 MiB data, 254 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:20:02 compute-1 nova_compute[238822]: 2025-09-30 18:20:02.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:02 compute-1 podman[277398]: 2025-09-30 18:20:02.52327937 +0000 UTC m=+0.065688235 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 18:20:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:03 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab80025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:03 compute-1 sshd[170789]: Timeout before authentication for connection from 110.42.70.108 to 38.102.83.102, pid = 275901
Sep 30 18:20:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:20:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:03 compute-1 nova_compute[238822]: 2025-09-30 18:20:03.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:03.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:04 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5aac002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:04.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:04 compute-1 unix_chkpwd[277421]: password check failed for user (root)
Sep 30 18:20:04 compute-1 sshd-session[277418]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65  user=root
Sep 30 18:20:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:05 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:05 compute-1 ceph-mon[75484]: pgmap v1243: 353 pgs: 353 active+clean; 88 MiB data, 254 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Sep 30 18:20:05 compute-1 podman[249638]: time="2025-09-30T18:20:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:20:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:20:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38442 "" "Go-http-client/1.1"
Sep 30 18:20:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:20:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8831 "" "Go-http-client/1.1"
Sep 30 18:20:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:05.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:06 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:06 compute-1 nova_compute[238822]: 2025-09-30 18:20:06.204 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "cbd42725-3e20-4602-8e30-926ab9bb7865" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:20:06 compute-1 nova_compute[238822]: 2025-09-30 18:20:06.205 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:20:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:06.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:06 compute-1 ceph-mon[75484]: pgmap v1244: 353 pgs: 353 active+clean; 88 MiB data, 254 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:20:06 compute-1 nova_compute[238822]: 2025-09-30 18:20:06.711 2 DEBUG nova.compute.manager [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:20:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:07 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:07 compute-1 sshd-session[277418]: Failed password for root from 194.107.115.65 port 57510 ssh2
Sep 30 18:20:07 compute-1 nova_compute[238822]: 2025-09-30 18:20:07.273 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:20:07 compute-1 nova_compute[238822]: 2025-09-30 18:20:07.274 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:20:07 compute-1 nova_compute[238822]: 2025-09-30 18:20:07.283 2 DEBUG nova.virt.hardware [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:20:07 compute-1 nova_compute[238822]: 2025-09-30 18:20:07.284 2 INFO nova.compute.claims [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:20:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:20:07 compute-1 nova_compute[238822]: 2025-09-30 18:20:07.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:07 compute-1 sshd-session[277418]: Received disconnect from 194.107.115.65 port 57510:11: Bye Bye [preauth]
Sep 30 18:20:07 compute-1 sshd-session[277418]: Disconnected from authenticating user root 194.107.115.65 port 57510 [preauth]
Sep 30 18:20:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:07.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:08 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5aac002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:08.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:08 compute-1 nova_compute[238822]: 2025-09-30 18:20:08.370 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:20:08 compute-1 ceph-mon[75484]: pgmap v1245: 353 pgs: 353 active+clean; 88 MiB data, 254 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:20:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:20:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:08 compute-1 nova_compute[238822]: 2025-09-30 18:20:08.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:20:08 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4170402657' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:20:08 compute-1 nova_compute[238822]: 2025-09-30 18:20:08.921 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.551s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:20:08 compute-1 nova_compute[238822]: 2025-09-30 18:20:08.927 2 DEBUG nova.compute.provider_tree [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:20:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:09 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:09 compute-1 nova_compute[238822]: 2025-09-30 18:20:09.437 2 DEBUG nova.scheduler.client.report [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:20:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4170402657' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:20:09 compute-1 podman[277451]: 2025-09-30 18:20:09.5654679 +0000 UTC m=+0.089579880 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 18:20:09 compute-1 podman[277450]: 2025-09-30 18:20:09.582911091 +0000 UTC m=+0.108153942 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Sep 30 18:20:09 compute-1 podman[277449]: 2025-09-30 18:20:09.596464327 +0000 UTC m=+0.129200150 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 18:20:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:09.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:09 compute-1 nova_compute[238822]: 2025-09-30 18:20:09.947 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.674s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:20:09 compute-1 nova_compute[238822]: 2025-09-30 18:20:09.948 2 DEBUG nova.compute.manager [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:20:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:10 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:10.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:10 compute-1 ceph-mon[75484]: pgmap v1246: 353 pgs: 353 active+clean; 88 MiB data, 254 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:20:10 compute-1 nova_compute[238822]: 2025-09-30 18:20:10.463 2 DEBUG nova.compute.manager [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:20:10 compute-1 nova_compute[238822]: 2025-09-30 18:20:10.464 2 DEBUG nova.network.neutron [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:20:10 compute-1 nova_compute[238822]: 2025-09-30 18:20:10.465 2 WARNING neutronclient.v2_0.client [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:10 compute-1 nova_compute[238822]: 2025-09-30 18:20:10.465 2 WARNING neutronclient.v2_0.client [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:10 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Sep 30 18:20:10 compute-1 nova_compute[238822]: 2025-09-30 18:20:10.976 2 INFO nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:20:10 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Sep 30 18:20:11 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Sep 30 18:20:11 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Sep 30 18:20:11 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Sep 30 18:20:11 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Sep 30 18:20:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:11 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:11 compute-1 nova_compute[238822]: 2025-09-30 18:20:11.205 2 DEBUG nova.network.neutron [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Successfully created port: 90c2d4fe-d570-4e9f-b84c-fde6838367d8 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:20:11 compute-1 nova_compute[238822]: 2025-09-30 18:20:11.486 2 DEBUG nova.compute.manager [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:20:11 compute-1 nova_compute[238822]: 2025-09-30 18:20:11.766 2 DEBUG nova.network.neutron [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Successfully updated port: 90c2d4fe-d570-4e9f-b84c-fde6838367d8 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:20:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:11.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:11 compute-1 nova_compute[238822]: 2025-09-30 18:20:11.818 2 DEBUG nova.compute.manager [req-ede660e6-31a3-4a73-8fa2-9ca02ff9556d req-d2bd046f-33c3-488b-afb5-7fa7e4ca8da4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Received event network-changed-90c2d4fe-d570-4e9f-b84c-fde6838367d8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:20:11 compute-1 nova_compute[238822]: 2025-09-30 18:20:11.819 2 DEBUG nova.compute.manager [req-ede660e6-31a3-4a73-8fa2-9ca02ff9556d req-d2bd046f-33c3-488b-afb5-7fa7e4ca8da4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Refreshing instance network info cache due to event network-changed-90c2d4fe-d570-4e9f-b84c-fde6838367d8. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:20:11 compute-1 nova_compute[238822]: 2025-09-30 18:20:11.819 2 DEBUG oslo_concurrency.lockutils [req-ede660e6-31a3-4a73-8fa2-9ca02ff9556d req-d2bd046f-33c3-488b-afb5-7fa7e4ca8da4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-cbd42725-3e20-4602-8e30-926ab9bb7865" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:20:11 compute-1 nova_compute[238822]: 2025-09-30 18:20:11.819 2 DEBUG oslo_concurrency.lockutils [req-ede660e6-31a3-4a73-8fa2-9ca02ff9556d req-d2bd046f-33c3-488b-afb5-7fa7e4ca8da4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-cbd42725-3e20-4602-8e30-926ab9bb7865" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:20:11 compute-1 nova_compute[238822]: 2025-09-30 18:20:11.820 2 DEBUG nova.network.neutron [req-ede660e6-31a3-4a73-8fa2-9ca02ff9556d req-d2bd046f-33c3-488b-afb5-7fa7e4ca8da4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Refreshing network info cache for port 90c2d4fe-d570-4e9f-b84c-fde6838367d8 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:20:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:12 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5aac002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:12 compute-1 sudo[277509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:20:12 compute-1 sudo[277509]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:20:12 compute-1 sudo[277509]: pam_unix(sudo:session): session closed for user root
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.278 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "refresh_cache-cbd42725-3e20-4602-8e30-926ab9bb7865" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.321414) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256412321585, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 486, "num_deletes": 251, "total_data_size": 572991, "memory_usage": 582696, "flush_reason": "Manual Compaction"}
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256412328369, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 375774, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36040, "largest_seqno": 36521, "table_properties": {"data_size": 373204, "index_size": 606, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6304, "raw_average_key_size": 18, "raw_value_size": 368112, "raw_average_value_size": 1098, "num_data_blocks": 27, "num_entries": 335, "num_filter_entries": 335, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759256391, "oldest_key_time": 1759256391, "file_creation_time": 1759256412, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 7002 microseconds, and 3925 cpu microseconds.
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.328 2 WARNING neutronclient.v2_0.client [req-ede660e6-31a3-4a73-8fa2-9ca02ff9556d req-d2bd046f-33c3-488b-afb5-7fa7e4ca8da4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.328443) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 375774 bytes OK
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.328476) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.330767) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.330792) EVENT_LOG_v1 {"time_micros": 1759256412330784, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.330823) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 570070, prev total WAL file size 570070, number of live WAL files 2.
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.331641) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(366KB)], [66(11MB)]
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256412331718, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 12464018, "oldest_snapshot_seqno": -1}
Sep 30 18:20:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:12.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 5931 keys, 10451907 bytes, temperature: kUnknown
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256412382003, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 10451907, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10415549, "index_size": 20440, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14853, "raw_key_size": 153921, "raw_average_key_size": 25, "raw_value_size": 10311734, "raw_average_value_size": 1738, "num_data_blocks": 815, "num_entries": 5931, "num_filter_entries": 5931, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759256412, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.382430) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 10451907 bytes
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.383918) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 247.2 rd, 207.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 11.5 +0.0 blob) out(10.0 +0.0 blob), read-write-amplify(61.0) write-amplify(27.8) OK, records in: 6441, records dropped: 510 output_compression: NoCompression
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.383966) EVENT_LOG_v1 {"time_micros": 1759256412383936, "job": 40, "event": "compaction_finished", "compaction_time_micros": 50411, "compaction_time_cpu_micros": 24432, "output_level": 6, "num_output_files": 1, "total_output_size": 10451907, "num_input_records": 6441, "num_output_records": 5931, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256412384276, "job": 40, "event": "table_file_deletion", "file_number": 68}
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256412388492, "job": 40, "event": "table_file_deletion", "file_number": 66}
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.331474) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.388588) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.388597) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.388600) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.388604) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:20:12 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:20:12.388607) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.509 2 DEBUG nova.compute.manager [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.511 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.511 2 INFO nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Creating image(s)
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.552 2 DEBUG nova.storage.rbd_utils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image cbd42725-3e20-4602-8e30-926ab9bb7865_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.598 2 DEBUG nova.storage.rbd_utils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image cbd42725-3e20-4602-8e30-926ab9bb7865_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.638 2 DEBUG nova.storage.rbd_utils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image cbd42725-3e20-4602-8e30-926ab9bb7865_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.644 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.715 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.716 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.717 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.718 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.758 2 DEBUG nova.storage.rbd_utils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image cbd42725-3e20-4602-8e30-926ab9bb7865_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:20:12 compute-1 nova_compute[238822]: 2025-09-30 18:20:12.766 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 cbd42725-3e20-4602-8e30-926ab9bb7865_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:20:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:13 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:13 compute-1 nova_compute[238822]: 2025-09-30 18:20:13.084 2 DEBUG nova.network.neutron [req-ede660e6-31a3-4a73-8fa2-9ca02ff9556d req-d2bd046f-33c3-488b-afb5-7fa7e4ca8da4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:20:13 compute-1 nova_compute[238822]: 2025-09-30 18:20:13.099 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 cbd42725-3e20-4602-8e30-926ab9bb7865_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.333s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:20:13 compute-1 nova_compute[238822]: 2025-09-30 18:20:13.193 2 DEBUG nova.storage.rbd_utils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] resizing rbd image cbd42725-3e20-4602-8e30-926ab9bb7865_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:20:13 compute-1 nova_compute[238822]: 2025-09-30 18:20:13.249 2 DEBUG nova.network.neutron [req-ede660e6-31a3-4a73-8fa2-9ca02ff9556d req-d2bd046f-33c3-488b-afb5-7fa7e4ca8da4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:20:13 compute-1 nova_compute[238822]: 2025-09-30 18:20:13.332 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:20:13 compute-1 nova_compute[238822]: 2025-09-30 18:20:13.333 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Ensure instance console log exists: /var/lib/nova/instances/cbd42725-3e20-4602-8e30-926ab9bb7865/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:20:13 compute-1 nova_compute[238822]: 2025-09-30 18:20:13.335 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:20:13 compute-1 nova_compute[238822]: 2025-09-30 18:20:13.336 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:20:13 compute-1 nova_compute[238822]: 2025-09-30 18:20:13.337 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:20:13 compute-1 ceph-mon[75484]: pgmap v1247: 353 pgs: 353 active+clean; 88 MiB data, 254 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:20:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:20:13 compute-1 nova_compute[238822]: 2025-09-30 18:20:13.757 2 DEBUG oslo_concurrency.lockutils [req-ede660e6-31a3-4a73-8fa2-9ca02ff9556d req-d2bd046f-33c3-488b-afb5-7fa7e4ca8da4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-cbd42725-3e20-4602-8e30-926ab9bb7865" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:20:13 compute-1 nova_compute[238822]: 2025-09-30 18:20:13.758 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquired lock "refresh_cache-cbd42725-3e20-4602-8e30-926ab9bb7865" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:20:13 compute-1 nova_compute[238822]: 2025-09-30 18:20:13.759 2 DEBUG nova.network.neutron [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:20:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:13 compute-1 nova_compute[238822]: 2025-09-30 18:20:13.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:13.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:14 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:14.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:14 compute-1 ceph-mon[75484]: pgmap v1248: 353 pgs: 353 active+clean; 159 MiB data, 304 MiB used, 40 GiB / 40 GiB avail; 2.3 MiB/s rd, 3.8 MiB/s wr, 243 op/s
Sep 30 18:20:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:15 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:15 compute-1 nova_compute[238822]: 2025-09-30 18:20:15.102 2 DEBUG nova.network.neutron [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:20:15 compute-1 nova_compute[238822]: 2025-09-30 18:20:15.313 2 WARNING neutronclient.v2_0.client [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:15.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:16 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab0000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.173 2 DEBUG nova.network.neutron [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Updating instance_info_cache with network_info: [{"id": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "address": "fa:16:3e:08:03:b0", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c2d4fe-d5", "ovs_interfaceid": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:20:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:16.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:16 compute-1 sshd-session[277706]: Invalid user prueba from 192.210.160.141 port 60790
Sep 30 18:20:16 compute-1 sshd-session[277706]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:20:16 compute-1 sshd-session[277706]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.681 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Releasing lock "refresh_cache-cbd42725-3e20-4602-8e30-926ab9bb7865" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.681 2 DEBUG nova.compute.manager [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Instance network_info: |[{"id": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "address": "fa:16:3e:08:03:b0", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c2d4fe-d5", "ovs_interfaceid": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.685 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Start _get_guest_xml network_info=[{"id": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "address": "fa:16:3e:08:03:b0", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c2d4fe-d5", "ovs_interfaceid": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.691 2 WARNING nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.693 2 DEBUG nova.virt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-326780236', uuid='cbd42725-3e20-4602-8e30-926ab9bb7865'), owner=OwnerMeta(userid='57be6c3d2e0d431dae0127ac659de1e0', username='tempest-TestExecuteHostMaintenanceStrategy-1597156537-project-admin', projectid='af4ef07c582847419a03275af50c6ffc', projectname='tempest-TestExecuteHostMaintenanceStrategy-1597156537'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "address": "fa:16:3e:08:03:b0", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c2d4fe-d5", "ovs_interfaceid": 
"90c2d4fe-d570-4e9f-b84c-fde6838367d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759256416.6935852) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.704 2 DEBUG nova.virt.libvirt.host [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.705 2 DEBUG nova.virt.libvirt.host [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.711 2 DEBUG nova.virt.libvirt.host [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.712 2 DEBUG nova.virt.libvirt.host [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.712 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.713 2 DEBUG nova.virt.hardware [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.714 2 DEBUG nova.virt.hardware [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.714 2 DEBUG nova.virt.hardware [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.714 2 DEBUG nova.virt.hardware [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.714 2 DEBUG nova.virt.hardware [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.715 2 DEBUG nova.virt.hardware [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.715 2 DEBUG nova.virt.hardware [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.715 2 DEBUG nova.virt.hardware [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.716 2 DEBUG nova.virt.hardware [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.716 2 DEBUG nova.virt.hardware [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.716 2 DEBUG nova.virt.hardware [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:20:16 compute-1 nova_compute[238822]: 2025-09-30 18:20:16.722 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:20:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:17 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:17 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:20:17 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3613281835' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:20:17 compute-1 nova_compute[238822]: 2025-09-30 18:20:17.196 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:20:17 compute-1 nova_compute[238822]: 2025-09-30 18:20:17.241 2 DEBUG nova.storage.rbd_utils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image cbd42725-3e20-4602-8e30-926ab9bb7865_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:20:17 compute-1 nova_compute[238822]: 2025-09-30 18:20:17.248 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:20:17 compute-1 ceph-mon[75484]: pgmap v1249: 353 pgs: 353 active+clean; 159 MiB data, 304 MiB used, 40 GiB / 40 GiB avail; 364 KiB/s rd, 3.8 MiB/s wr, 169 op/s
Sep 30 18:20:17 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3613281835' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:20:17 compute-1 nova_compute[238822]: 2025-09-30 18:20:17.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:17 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:20:17 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1623189252' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:20:17 compute-1 nova_compute[238822]: 2025-09-30 18:20:17.717 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:20:17 compute-1 nova_compute[238822]: 2025-09-30 18:20:17.721 2 DEBUG nova.virt.libvirt.vif [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:20:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-326780236',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-326780236',id=13,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af4ef07c582847419a03275af50c6ffc',ramdisk_id='',reservation_id='r-gp2cac14',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537',owner_user_name='tempest-TestExec
uteHostMaintenanceStrategy-1597156537-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:20:11Z,user_data=None,user_id='57be6c3d2e0d431dae0127ac659de1e0',uuid=cbd42725-3e20-4602-8e30-926ab9bb7865,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "address": "fa:16:3e:08:03:b0", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c2d4fe-d5", "ovs_interfaceid": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:20:17 compute-1 nova_compute[238822]: 2025-09-30 18:20:17.722 2 DEBUG nova.network.os_vif_util [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converting VIF {"id": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "address": "fa:16:3e:08:03:b0", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c2d4fe-d5", "ovs_interfaceid": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:20:17 compute-1 nova_compute[238822]: 2025-09-30 18:20:17.724 2 DEBUG nova.network.os_vif_util [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=90c2d4fe-d570-4e9f-b84c-fde6838367d8,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c2d4fe-d5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:20:17 compute-1 nova_compute[238822]: 2025-09-30 18:20:17.726 2 DEBUG nova.objects.instance [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lazy-loading 'pci_devices' on Instance uuid cbd42725-3e20-4602-8e30-926ab9bb7865 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:20:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:17 compute-1 sshd-session[277709]: Invalid user scpuser from 14.225.167.110 port 48616
Sep 30 18:20:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:20:17 compute-1 sshd-session[277709]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:20:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:17.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:20:17 compute-1 sshd-session[277709]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:20:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:18 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.237 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:20:18 compute-1 nova_compute[238822]:   <uuid>cbd42725-3e20-4602-8e30-926ab9bb7865</uuid>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   <name>instance-0000000d</name>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-326780236</nova:name>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:20:16</nova:creationTime>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:20:18 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:20:18 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:user uuid="57be6c3d2e0d431dae0127ac659de1e0">tempest-TestExecuteHostMaintenanceStrategy-1597156537-project-admin</nova:user>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:project uuid="af4ef07c582847419a03275af50c6ffc">tempest-TestExecuteHostMaintenanceStrategy-1597156537</nova:project>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <nova:port uuid="90c2d4fe-d570-4e9f-b84c-fde6838367d8">
Sep 30 18:20:18 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <system>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <entry name="serial">cbd42725-3e20-4602-8e30-926ab9bb7865</entry>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <entry name="uuid">cbd42725-3e20-4602-8e30-926ab9bb7865</entry>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     </system>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   <os>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   </os>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   <features>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   </features>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/cbd42725-3e20-4602-8e30-926ab9bb7865_disk">
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       </source>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/cbd42725-3e20-4602-8e30-926ab9bb7865_disk.config">
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       </source>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:20:18 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:08:03:b0"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <target dev="tap90c2d4fe-d5"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/cbd42725-3e20-4602-8e30-926ab9bb7865/console.log" append="off"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <video>
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     </video>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:20:18 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:20:18 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:20:18 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:20:18 compute-1 nova_compute[238822]: </domain>
Sep 30 18:20:18 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.239 2 DEBUG nova.compute.manager [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Preparing to wait for external event network-vif-plugged-90c2d4fe-d570-4e9f-b84c-fde6838367d8 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.239 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.240 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.240 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.243 2 DEBUG nova.virt.libvirt.vif [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:20:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-326780236',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-326780236',id=13,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af4ef07c582847419a03275af50c6ffc',ramdisk_id='',reservation_id='r-gp2cac14',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:20:11Z,user_data=None,user_id='57be6c3d2e0d431dae0127ac659de1e0',uuid=cbd42725-3e20-4602-8e30-926ab9bb7865,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "address": "fa:16:3e:08:03:b0", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c2d4fe-d5", "ovs_interfaceid": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.244 2 DEBUG nova.network.os_vif_util [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converting VIF {"id": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "address": "fa:16:3e:08:03:b0", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c2d4fe-d5", "ovs_interfaceid": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.245 2 DEBUG nova.network.os_vif_util [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=90c2d4fe-d570-4e9f-b84c-fde6838367d8,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c2d4fe-d5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.245 2 DEBUG os_vif [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=90c2d4fe-d570-4e9f-b84c-fde6838367d8,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c2d4fe-d5') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.247 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.248 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.249 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '14cc6f8b-1683-5767-b8e6-7f92da1a83a7', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.280 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90c2d4fe-d5, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.281 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap90c2d4fe-d5, col_values=(('qos', UUID('58e9d7cb-88c8-40c6-a74f-cbc3f3179086')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.281 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap90c2d4fe-d5, col_values=(('external_ids', {'iface-id': '90c2d4fe-d570-4e9f-b84c-fde6838367d8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:08:03:b0', 'vm-uuid': 'cbd42725-3e20-4602-8e30-926ab9bb7865'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:18 compute-1 NetworkManager[45549]: <info>  [1759256418.2845] manager: (tap90c2d4fe-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:18 compute-1 nova_compute[238822]: 2025-09-30 18:20:18.295 2 INFO os_vif [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=90c2d4fe-d570-4e9f-b84c-fde6838367d8,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c2d4fe-d5')
Sep 30 18:20:18 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1623189252' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:20:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:18.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:20:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:18 compute-1 sshd-session[277706]: Failed password for invalid user prueba from 192.210.160.141 port 60790 ssh2
Sep 30 18:20:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:19 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:19 compute-1 ceph-mon[75484]: pgmap v1250: 353 pgs: 353 active+clean; 159 MiB data, 304 MiB used, 40 GiB / 40 GiB avail; 364 KiB/s rd, 3.8 MiB/s wr, 170 op/s
Sep 30 18:20:19 compute-1 openstack_network_exporter[251957]: ERROR   18:20:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:20:19 compute-1 openstack_network_exporter[251957]: ERROR   18:20:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:20:19 compute-1 openstack_network_exporter[251957]: ERROR   18:20:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:20:19 compute-1 openstack_network_exporter[251957]: ERROR   18:20:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:20:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:20:19 compute-1 openstack_network_exporter[251957]: ERROR   18:20:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:20:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:20:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:20:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:19.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:20:19 compute-1 sshd-session[277706]: Connection closed by invalid user prueba 192.210.160.141 port 60790 [preauth]
Sep 30 18:20:19 compute-1 nova_compute[238822]: 2025-09-30 18:20:19.848 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:20:19 compute-1 nova_compute[238822]: 2025-09-30 18:20:19.849 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:20:19 compute-1 nova_compute[238822]: 2025-09-30 18:20:19.849 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] No VIF found with MAC fa:16:3e:08:03:b0, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:20:19 compute-1 nova_compute[238822]: 2025-09-30 18:20:19.850 2 INFO nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Using config drive
Sep 30 18:20:19 compute-1 nova_compute[238822]: 2025-09-30 18:20:19.885 2 DEBUG nova.storage.rbd_utils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image cbd42725-3e20-4602-8e30-926ab9bb7865_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:20:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:20 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab0001930 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:20 compute-1 sshd-session[277709]: Failed password for invalid user scpuser from 14.225.167.110 port 48616 ssh2
Sep 30 18:20:20 compute-1 ceph-mon[75484]: pgmap v1251: 353 pgs: 353 active+clean; 167 MiB data, 305 MiB used, 40 GiB / 40 GiB avail; 407 KiB/s rd, 3.9 MiB/s wr, 199 op/s
Sep 30 18:20:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:20.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:20 compute-1 nova_compute[238822]: 2025-09-30 18:20:20.402 2 WARNING neutronclient.v2_0.client [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:20 compute-1 nova_compute[238822]: 2025-09-30 18:20:20.683 2 INFO nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Creating config drive at /var/lib/nova/instances/cbd42725-3e20-4602-8e30-926ab9bb7865/disk.config
Sep 30 18:20:20 compute-1 nova_compute[238822]: 2025-09-30 18:20:20.693 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cbd42725-3e20-4602-8e30-926ab9bb7865/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp998idrxh execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:20:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:20 compute-1 nova_compute[238822]: 2025-09-30 18:20:20.831 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cbd42725-3e20-4602-8e30-926ab9bb7865/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp998idrxh" returned: 0 in 0.138s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:20:20 compute-1 nova_compute[238822]: 2025-09-30 18:20:20.878 2 DEBUG nova.storage.rbd_utils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image cbd42725-3e20-4602-8e30-926ab9bb7865_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:20:20 compute-1 nova_compute[238822]: 2025-09-30 18:20:20.883 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cbd42725-3e20-4602-8e30-926ab9bb7865/disk.config cbd42725-3e20-4602-8e30-926ab9bb7865_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:20:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:21 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.089 2 DEBUG oslo_concurrency.processutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cbd42725-3e20-4602-8e30-926ab9bb7865/disk.config cbd42725-3e20-4602-8e30-926ab9bb7865_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.206s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.091 2 INFO nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Deleting local config drive /var/lib/nova/instances/cbd42725-3e20-4602-8e30-926ab9bb7865/disk.config because it was imported into RBD.
Sep 30 18:20:21 compute-1 systemd[1]: Starting libvirt secret daemon...
Sep 30 18:20:21 compute-1 systemd[1]: Started libvirt secret daemon.
Sep 30 18:20:21 compute-1 kernel: tap90c2d4fe-d5: entered promiscuous mode
Sep 30 18:20:21 compute-1 NetworkManager[45549]: <info>  [1759256421.2235] manager: (tap90c2d4fe-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Sep 30 18:20:21 compute-1 ovn_controller[135204]: 2025-09-30T18:20:21Z|00099|binding|INFO|Claiming lport 90c2d4fe-d570-4e9f-b84c-fde6838367d8 for this chassis.
Sep 30 18:20:21 compute-1 ovn_controller[135204]: 2025-09-30T18:20:21Z|00100|binding|INFO|90c2d4fe-d570-4e9f-b84c-fde6838367d8: Claiming fa:16:3e:08:03:b0 10.100.0.9
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.247 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:03:b0 10.100.0.9'], port_security=['fa:16:3e:08:03:b0 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbd42725-3e20-4602-8e30-926ab9bb7865', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-443be7ca-f628-4a45-95b6-620d37172d7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af4ef07c582847419a03275af50c6ffc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '518a9c00-28f9-47ab-a122-e672192eedea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=96eb21b8-879c-4e72-963b-37e37ae3d0c5, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=90c2d4fe-d570-4e9f-b84c-fde6838367d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.248 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 90c2d4fe-d570-4e9f-b84c-fde6838367d8 in datapath 443be7ca-f628-4a45-95b6-620d37172d7b bound to our chassis
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.250 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 443be7ca-f628-4a45-95b6-620d37172d7b
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.266 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b94f2bca-2fe1-455b-a4c2-b73996516cf3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.267 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap443be7ca-f1 in ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.269 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap443be7ca-f0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.269 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b6c5f7-305b-48ec-9f03-2896c6494cbb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.271 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b20ae0a7-9573-4afd-b940-51e3100810fb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 systemd-machined[195911]: New machine qemu-8-instance-0000000d.
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.288 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[8ef47e03-90f3-43f0-956b-ad0077e07566]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 systemd[1]: Started Virtual Machine qemu-8-instance-0000000d.
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.308 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[bce64ec2-7bbb-444f-aba9-61db7fa8824e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:21 compute-1 ovn_controller[135204]: 2025-09-30T18:20:21Z|00101|binding|INFO|Setting lport 90c2d4fe-d570-4e9f-b84c-fde6838367d8 ovn-installed in OVS
Sep 30 18:20:21 compute-1 ovn_controller[135204]: 2025-09-30T18:20:21Z|00102|binding|INFO|Setting lport 90c2d4fe-d570-4e9f-b84c-fde6838367d8 up in Southbound
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:21 compute-1 systemd-udevd[277873]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:20:21 compute-1 NetworkManager[45549]: <info>  [1759256421.3530] device (tap90c2d4fe-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:20:21 compute-1 NetworkManager[45549]: <info>  [1759256421.3555] device (tap90c2d4fe-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.355 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[f61fd808-1ca5-44d7-b671-40a5e82c0e2b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.361 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[efafdcc8-bb88-48f2-9575-c861eb2497bf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 NetworkManager[45549]: <info>  [1759256421.3634] manager: (tap443be7ca-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.404 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[fc618735-3e3b-4af2-a1d1-6daabac1e1eb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.408 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[44443f95-b1c1-440c-99a2-e5da18a3d85c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 NetworkManager[45549]: <info>  [1759256421.4405] device (tap443be7ca-f0): carrier: link connected
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.451 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[3b05f6d9-31d3-49ad-8a02-52252fcca895]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.477 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[580a5f17-2dd6-474c-a50b-4dc1f4e4c781]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap443be7ca-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:7f:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1426302, 'reachable_time': 20408, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277901, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.504 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[740802c6-3f56-4dd9-be68-6f2d58ae0401]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:7f4d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1426302, 'tstamp': 1426302}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277902, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.532 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[23ded6eb-bae8-43d3-aafa-d0fae35746cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap443be7ca-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:7f:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1426302, 'reachable_time': 20408, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 277903, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.581 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[155c7b3e-269e-4560-9b3c-40f50147de09]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 sshd-session[277709]: Received disconnect from 14.225.167.110 port 48616:11: Bye Bye [preauth]
Sep 30 18:20:21 compute-1 sshd-session[277709]: Disconnected from invalid user scpuser 14.225.167.110 port 48616 [preauth]
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.653 2 DEBUG nova.compute.manager [req-e910d74e-c7a7-4154-9982-722a6d663cea req-aec9b35a-365f-4c6b-a400-93dddc7d3469 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Received event network-vif-plugged-90c2d4fe-d570-4e9f-b84c-fde6838367d8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.653 2 DEBUG oslo_concurrency.lockutils [req-e910d74e-c7a7-4154-9982-722a6d663cea req-aec9b35a-365f-4c6b-a400-93dddc7d3469 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.654 2 DEBUG oslo_concurrency.lockutils [req-e910d74e-c7a7-4154-9982-722a6d663cea req-aec9b35a-365f-4c6b-a400-93dddc7d3469 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.654 2 DEBUG oslo_concurrency.lockutils [req-e910d74e-c7a7-4154-9982-722a6d663cea req-aec9b35a-365f-4c6b-a400-93dddc7d3469 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.654 2 DEBUG nova.compute.manager [req-e910d74e-c7a7-4154-9982-722a6d663cea req-aec9b35a-365f-4c6b-a400-93dddc7d3469 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Processing event network-vif-plugged-90c2d4fe-d570-4e9f-b84c-fde6838367d8 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.670 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[9a67e48f-83fd-441f-a58f-af993b1a9232]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.672 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap443be7ca-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.672 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.673 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap443be7ca-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:21 compute-1 NetworkManager[45549]: <info>  [1759256421.6765] manager: (tap443be7ca-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Sep 30 18:20:21 compute-1 kernel: tap443be7ca-f0: entered promiscuous mode
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.681 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap443be7ca-f0, col_values=(('external_ids', {'iface-id': '031d2cff-b142-4423-ba99-772183b7a667'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:21 compute-1 ovn_controller[135204]: 2025-09-30T18:20:21Z|00103|binding|INFO|Releasing lport 031d2cff-b142-4423-ba99-772183b7a667 from this chassis (sb_readonly=0)
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.712 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:21 compute-1 nova_compute[238822]: 2025-09-30 18:20:21.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.723 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8771fb-b25e-41f5-af83-aceecd79d12b]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.724 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.725 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.725 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 443be7ca-f628-4a45-95b6-620d37172d7b disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.725 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.726 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae844e4-d109-4161-956b-4d1fd3f46a75]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.727 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.727 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[18d261ae-95e9-4256-b14b-2c0cf8e4a2a8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.728 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-443be7ca-f628-4a45-95b6-620d37172d7b
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID 443be7ca-f628-4a45-95b6-620d37172d7b
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:20:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:21.729 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'env', 'PROCESS_TAG=haproxy-443be7ca-f628-4a45-95b6-620d37172d7b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/443be7ca-f628-4a45-95b6-620d37172d7b.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:20:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:21.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:22 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:22 compute-1 podman[277980]: 2025-09-30 18:20:22.200572507 +0000 UTC m=+0.046155837 container create d6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Sep 30 18:20:22 compute-1 systemd[1]: Started libpod-conmon-d6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6.scope.
Sep 30 18:20:22 compute-1 podman[277980]: 2025-09-30 18:20:22.175241403 +0000 UTC m=+0.020824743 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:20:22 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:20:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f68484b35ca4d985713451cb30c5168888a5cb45ddb6d6ec9894fbb717056a77/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:20:22 compute-1 podman[277980]: 2025-09-30 18:20:22.33247902 +0000 UTC m=+0.178062430 container init d6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Sep 30 18:20:22 compute-1 podman[277980]: 2025-09-30 18:20:22.343903568 +0000 UTC m=+0.189486898 container start d6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Sep 30 18:20:22 compute-1 ceph-mon[75484]: pgmap v1252: 353 pgs: 353 active+clean; 167 MiB data, 305 MiB used, 40 GiB / 40 GiB avail; 407 KiB/s rd, 3.9 MiB/s wr, 199 op/s
Sep 30 18:20:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:20:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:20:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:22.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:20:22 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[277996]: [NOTICE]   (278000) : New worker (278002) forked
Sep 30 18:20:22 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[277996]: [NOTICE]   (278000) : Loading success.
Sep 30 18:20:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:22.426 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:20:22 compute-1 nova_compute[238822]: 2025-09-30 18:20:22.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:22 compute-1 nova_compute[238822]: 2025-09-30 18:20:22.611 2 DEBUG nova.compute.manager [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:20:22 compute-1 nova_compute[238822]: 2025-09-30 18:20:22.616 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:20:22 compute-1 nova_compute[238822]: 2025-09-30 18:20:22.619 2 INFO nova.virt.libvirt.driver [-] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Instance spawned successfully.
Sep 30 18:20:22 compute-1 nova_compute[238822]: 2025-09-30 18:20:22.620 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:20:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:22 compute-1 sshd-session[277910]: Invalid user work from 175.126.165.170 port 46384
Sep 30 18:20:22 compute-1 sshd-session[277910]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:20:22 compute-1 sshd-session[277910]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 18:20:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:23 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.137 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.138 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.139 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.140 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.141 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.142 2 DEBUG nova.virt.libvirt.driver [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.656 2 INFO nova.compute.manager [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Took 11.15 seconds to spawn the instance on the hypervisor.
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.657 2 DEBUG nova.compute.manager [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:20:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.723 2 DEBUG nova.compute.manager [req-0b8166a0-5c84-459e-835c-c7dcc7eafee5 req-8aaf36e8-fa70-408c-8046-a493b9121db0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Received event network-vif-plugged-90c2d4fe-d570-4e9f-b84c-fde6838367d8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.724 2 DEBUG oslo_concurrency.lockutils [req-0b8166a0-5c84-459e-835c-c7dcc7eafee5 req-8aaf36e8-fa70-408c-8046-a493b9121db0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.725 2 DEBUG oslo_concurrency.lockutils [req-0b8166a0-5c84-459e-835c-c7dcc7eafee5 req-8aaf36e8-fa70-408c-8046-a493b9121db0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.725 2 DEBUG oslo_concurrency.lockutils [req-0b8166a0-5c84-459e-835c-c7dcc7eafee5 req-8aaf36e8-fa70-408c-8046-a493b9121db0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.726 2 DEBUG nova.compute.manager [req-0b8166a0-5c84-459e-835c-c7dcc7eafee5 req-8aaf36e8-fa70-408c-8046-a493b9121db0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] No waiting events found dispatching network-vif-plugged-90c2d4fe-d570-4e9f-b84c-fde6838367d8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:20:23 compute-1 nova_compute[238822]: 2025-09-30 18:20:23.727 2 WARNING nova.compute.manager [req-0b8166a0-5c84-459e-835c-c7dcc7eafee5 req-8aaf36e8-fa70-408c-8046-a493b9121db0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Received unexpected event network-vif-plugged-90c2d4fe-d570-4e9f-b84c-fde6838367d8 for instance with vm_state building and task_state spawning.
Sep 30 18:20:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:20:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:23.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:20:23 compute-1 sshd-session[278012]: Invalid user test from 84.51.43.58 port 57550
Sep 30 18:20:23 compute-1 sshd-session[278012]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:20:23 compute-1 sshd-session[278012]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:20:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:24 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab0001930 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:24 compute-1 nova_compute[238822]: 2025-09-30 18:20:24.203 2 INFO nova.compute.manager [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Took 16.98 seconds to build instance.
Sep 30 18:20:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:24.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:24 compute-1 nova_compute[238822]: 2025-09-30 18:20:24.709 2 DEBUG oslo_concurrency.lockutils [None req-22194c89-8ff7-4ca9-b5ec-cbf079bd49d4 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.505s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:20:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:25 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:25 compute-1 sshd-session[277910]: Failed password for invalid user work from 175.126.165.170 port 46384 ssh2
Sep 30 18:20:25 compute-1 ceph-mon[75484]: pgmap v1253: 353 pgs: 353 active+clean; 167 MiB data, 309 MiB used, 40 GiB / 40 GiB avail; 684 KiB/s rd, 3.9 MiB/s wr, 219 op/s
Sep 30 18:20:25 compute-1 podman[278018]: 2025-09-30 18:20:25.543090253 +0000 UTC m=+0.082218872 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:20:25 compute-1 podman[278017]: 2025-09-30 18:20:25.59627778 +0000 UTC m=+0.138424140 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:20:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:25.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:26 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:26.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:26 compute-1 ceph-mon[75484]: pgmap v1254: 353 pgs: 353 active+clean; 167 MiB data, 309 MiB used, 40 GiB / 40 GiB avail; 320 KiB/s rd, 112 KiB/s wr, 49 op/s
Sep 30 18:20:26 compute-1 sshd-session[278012]: Failed password for invalid user test from 84.51.43.58 port 57550 ssh2
Sep 30 18:20:26 compute-1 nova_compute[238822]: 2025-09-30 18:20:26.570 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:20:26 compute-1 nova_compute[238822]: 2025-09-30 18:20:26.571 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:20:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:27 compute-1 nova_compute[238822]: 2025-09-30 18:20:27.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:20:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:27 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:27 compute-1 sshd-session[277910]: Received disconnect from 175.126.165.170 port 46384:11: Bye Bye [preauth]
Sep 30 18:20:27 compute-1 sshd-session[277910]: Disconnected from invalid user work 175.126.165.170 port 46384 [preauth]
Sep 30 18:20:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:27.428 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:27 compute-1 nova_compute[238822]: 2025-09-30 18:20:27.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:27.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:28 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab0001930 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:28 compute-1 nova_compute[238822]: 2025-09-30 18:20:28.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:28.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:28 compute-1 sshd-session[278067]: Invalid user bpm from 216.10.242.161 port 48816
Sep 30 18:20:28 compute-1 sshd-session[278067]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:20:28 compute-1 sshd-session[278067]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:20:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:20:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:28 compute-1 sshd-session[278012]: Received disconnect from 84.51.43.58 port 57550:11: Bye Bye [preauth]
Sep 30 18:20:28 compute-1 sshd-session[278012]: Disconnected from invalid user test 84.51.43.58 port 57550 [preauth]
Sep 30 18:20:29 compute-1 nova_compute[238822]: 2025-09-30 18:20:29.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:20:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:29 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:29 compute-1 ceph-mon[75484]: pgmap v1255: 353 pgs: 353 active+clean; 167 MiB data, 309 MiB used, 40 GiB / 40 GiB avail; 320 KiB/s rd, 112 KiB/s wr, 49 op/s
Sep 30 18:20:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:29.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:30 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:30.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:30 compute-1 ceph-mon[75484]: pgmap v1256: 353 pgs: 353 active+clean; 167 MiB data, 309 MiB used, 40 GiB / 40 GiB avail; 2.0 MiB/s rd, 112 KiB/s wr, 103 op/s
Sep 30 18:20:30 compute-1 sshd-session[278067]: Failed password for invalid user bpm from 216.10.242.161 port 48816 ssh2
Sep 30 18:20:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:31 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:31 compute-1 sshd-session[278067]: Received disconnect from 216.10.242.161 port 48816:11: Bye Bye [preauth]
Sep 30 18:20:31 compute-1 sshd-session[278067]: Disconnected from invalid user bpm 216.10.242.161 port 48816 [preauth]
Sep 30 18:20:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:31.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:32 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab0002da0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:32 compute-1 sudo[278074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:20:32 compute-1 sudo[278074]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:20:32 compute-1 sudo[278074]: pam_unix(sudo:session): session closed for user root
Sep 30 18:20:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:32.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:32 compute-1 nova_compute[238822]: 2025-09-30 18:20:32.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:33 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:33 compute-1 ceph-mon[75484]: pgmap v1257: 353 pgs: 353 active+clean; 167 MiB data, 309 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 25 KiB/s wr, 74 op/s
Sep 30 18:20:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1890354514' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:20:33 compute-1 nova_compute[238822]: 2025-09-30 18:20:33.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:33 compute-1 podman[278100]: 2025-09-30 18:20:33.552467554 +0000 UTC m=+0.085730786 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 18:20:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:20:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:33.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:34 compute-1 nova_compute[238822]: 2025-09-30 18:20:34.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:20:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:34 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:34 compute-1 ceph-mon[75484]: pgmap v1258: 353 pgs: 353 active+clean; 167 MiB data, 309 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 26 KiB/s wr, 75 op/s
Sep 30 18:20:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:34.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:34 compute-1 nova_compute[238822]: 2025-09-30 18:20:34.576 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:20:34 compute-1 nova_compute[238822]: 2025-09-30 18:20:34.576 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:20:34 compute-1 nova_compute[238822]: 2025-09-30 18:20:34.576 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:20:34 compute-1 nova_compute[238822]: 2025-09-30 18:20:34.577 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:20:34 compute-1 nova_compute[238822]: 2025-09-30 18:20:34.577 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:20:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:34 compute-1 ovn_controller[135204]: 2025-09-30T18:20:34Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:08:03:b0 10.100.0.9
Sep 30 18:20:34 compute-1 ovn_controller[135204]: 2025-09-30T18:20:34Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:08:03:b0 10.100.0.9
Sep 30 18:20:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:35 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:20:35 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4144592548' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:20:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:35 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:35 compute-1 nova_compute[238822]: 2025-09-30 18:20:35.104 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:20:35 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4144592548' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:20:35 compute-1 podman[249638]: time="2025-09-30T18:20:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:20:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:20:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39666 "" "Go-http-client/1.1"
Sep 30 18:20:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:20:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9294 "" "Go-http-client/1.1"
Sep 30 18:20:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:35.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:36 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab0002da0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:36 compute-1 nova_compute[238822]: 2025-09-30 18:20:36.155 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000000d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:20:36 compute-1 nova_compute[238822]: 2025-09-30 18:20:36.155 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000000d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:20:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/477551565' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:20:36 compute-1 ceph-mon[75484]: pgmap v1259: 353 pgs: 353 active+clean; 167 MiB data, 309 MiB used, 40 GiB / 40 GiB avail; 1.6 MiB/s rd, 1023 B/s wr, 54 op/s
Sep 30 18:20:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:36.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:36 compute-1 nova_compute[238822]: 2025-09-30 18:20:36.397 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:20:36 compute-1 nova_compute[238822]: 2025-09-30 18:20:36.399 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:20:36 compute-1 nova_compute[238822]: 2025-09-30 18:20:36.430 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:20:36 compute-1 nova_compute[238822]: 2025-09-30 18:20:36.431 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4535MB free_disk=39.92576599121094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:20:36 compute-1 nova_compute[238822]: 2025-09-30 18:20:36.432 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:20:36 compute-1 nova_compute[238822]: 2025-09-30 18:20:36.433 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:20:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:37 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:20:37 compute-1 sudo[278148]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:20:37 compute-1 sudo[278148]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:20:37 compute-1 sudo[278148]: pam_unix(sudo:session): session closed for user root
Sep 30 18:20:37 compute-1 nova_compute[238822]: 2025-09-30 18:20:37.512 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance cbd42725-3e20-4602-8e30-926ab9bb7865 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:20:37 compute-1 sudo[278173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:20:37 compute-1 sudo[278173]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:20:37 compute-1 nova_compute[238822]: 2025-09-30 18:20:37.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:37.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.025 2 WARNING nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance cbbd84c1-d174-40d7-be54-3123704f0e0b has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.026 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.026 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:20:36 up  3:57,  0 user,  load average: 0.56, 0.60, 0.95\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_af4ef07c582847419a03275af50c6ffc': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.068 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing inventories for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.109 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating ProviderTree inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.110 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 18:20:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:38 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.123 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing aggregate associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.140 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing trait associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE41,HW_CPU_X86_ABM,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_TIS,HW_ARCH_X86_64
,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.202 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:20:38 compute-1 sudo[278173]: pam_unix(sudo:session): session closed for user root
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:38.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:20:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:20:38 compute-1 ceph-mon[75484]: pgmap v1260: 353 pgs: 353 active+clean; 167 MiB data, 309 MiB used, 40 GiB / 40 GiB avail; 1.6 MiB/s rd, 1023 B/s wr, 54 op/s
Sep 30 18:20:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:20:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:20:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:20:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:20:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:20:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:20:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:20:38 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2844711224' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.705 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.714 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.772 2 DEBUG nova.virt.libvirt.driver [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Creating tmpfile /var/lib/nova/instances/tmpczj9za1h to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.774 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:38 compute-1 nova_compute[238822]: 2025-09-30 18:20:38.779 2 DEBUG nova.compute.manager [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpczj9za1h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:20:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:39 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:39 compute-1 nova_compute[238822]: 2025-09-30 18:20:39.223 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:20:39 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2844711224' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:20:39 compute-1 nova_compute[238822]: 2025-09-30 18:20:39.734 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:20:39 compute-1 nova_compute[238822]: 2025-09-30 18:20:39.735 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.302s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:20:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:39.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:40 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab0002da0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:40.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:40 compute-1 ceph-mon[75484]: pgmap v1261: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 2.0 MiB/s rd, 2.1 MiB/s wr, 115 op/s
Sep 30 18:20:40 compute-1 podman[278259]: 2025-09-30 18:20:40.587441397 +0000 UTC m=+0.109448617 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4)
Sep 30 18:20:40 compute-1 podman[278261]: 2025-09-30 18:20:40.59238249 +0000 UTC m=+0.105045988 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930)
Sep 30 18:20:40 compute-1 podman[278260]: 2025-09-30 18:20:40.604174369 +0000 UTC m=+0.124680209 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, vcs-type=git)
Sep 30 18:20:40 compute-1 nova_compute[238822]: 2025-09-30 18:20:40.730 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:20:40 compute-1 nova_compute[238822]: 2025-09-30 18:20:40.731 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:20:40 compute-1 nova_compute[238822]: 2025-09-30 18:20:40.731 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:20:40 compute-1 nova_compute[238822]: 2025-09-30 18:20:40.731 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:20:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:40 compute-1 nova_compute[238822]: 2025-09-30 18:20:40.806 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:41 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:41 compute-1 unix_chkpwd[278320]: password check failed for user (root)
Sep 30 18:20:41 compute-1 sshd-session[278256]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:20:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:41.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:42 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a8c003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:42.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:42 compute-1 nova_compute[238822]: 2025-09-30 18:20:42.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:43 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:43 compute-1 ceph-mon[75484]: pgmap v1262: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 324 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Sep 30 18:20:43 compute-1 nova_compute[238822]: 2025-09-30 18:20:43.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:20:43 compute-1 sudo[278323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:20:43 compute-1 sudo[278323]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:20:43 compute-1 sudo[278323]: pam_unix(sudo:session): session closed for user root
Sep 30 18:20:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:43.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:43 compute-1 sshd-session[278256]: Failed password for root from 192.210.160.141 port 53466 ssh2
Sep 30 18:20:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:44 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab0003ea0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:44 compute-1 sshd-session[278256]: Connection closed by authenticating user root 192.210.160.141 port 53466 [preauth]
Sep 30 18:20:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:44.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:20:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:20:44 compute-1 ceph-mon[75484]: pgmap v1263: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 324 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Sep 30 18:20:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:44 compute-1 nova_compute[238822]: 2025-09-30 18:20:44.807 2 DEBUG nova.compute.manager [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpczj9za1h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='cbbd84c1-d174-40d7-be54-3123704f0e0b',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:20:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:45 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:45 compute-1 nova_compute[238822]: 2025-09-30 18:20:45.823 2 DEBUG oslo_concurrency.lockutils [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-cbbd84c1-d174-40d7-be54-3123704f0e0b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:20:45 compute-1 nova_compute[238822]: 2025-09-30 18:20:45.824 2 DEBUG oslo_concurrency.lockutils [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-cbbd84c1-d174-40d7-be54-3123704f0e0b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:20:45 compute-1 nova_compute[238822]: 2025-09-30 18:20:45.824 2 DEBUG nova.network.neutron [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:20:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:45.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:46 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5aac001d90 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:46 compute-1 nova_compute[238822]: 2025-09-30 18:20:46.330 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:46.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:47 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:47 compute-1 ceph-mon[75484]: pgmap v1264: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 321 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Sep 30 18:20:47 compute-1 nova_compute[238822]: 2025-09-30 18:20:47.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:47.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.082 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:48 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5abc002010 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.245 2 DEBUG nova.network.neutron [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Updating instance_info_cache with network_info: [{"id": "ca4e45b7-a42b-4e47-80d8-749194caf98a", "address": "fa:16:3e:7e:6b:bd", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca4e45b7-a4", "ovs_interfaceid": "ca4e45b7-a42b-4e47-80d8-749194caf98a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:48 compute-1 ceph-mon[75484]: pgmap v1265: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 321 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Sep 30 18:20:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:48.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.752 2 DEBUG oslo_concurrency.lockutils [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-cbbd84c1-d174-40d7-be54-3123704f0e0b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.767 2 DEBUG nova.virt.libvirt.driver [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpczj9za1h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='cbbd84c1-d174-40d7-be54-3123704f0e0b',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.767 2 DEBUG nova.virt.libvirt.driver [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Creating instance directory: /var/lib/nova/instances/cbbd84c1-d174-40d7-be54-3123704f0e0b pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.767 2 DEBUG nova.virt.libvirt.driver [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Ensure instance console log exists: /var/lib/nova/instances/cbbd84c1-d174-40d7-be54-3123704f0e0b/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.768 2 DEBUG nova.virt.libvirt.driver [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.768 2 DEBUG nova.virt.libvirt.vif [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1407640112',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1407640112',id=12,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:20:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='af4ef07c582847419a03275af50c6ffc',ramdisk_id='',reservation_id='r-mj5oq7ov',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:20:02Z,user_data=None,user_id='57be6c3d2e0d431dae0127ac659de1e0',uuid=cbbd84c1-d174-40d7-be54-3123704f0e0b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ca4e45b7-a42b-4e47-80d8-749194caf98a", "address": "fa:16:3e:7e:6b:bd", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapca4e45b7-a4", "ovs_interfaceid": "ca4e45b7-a42b-4e47-80d8-749194caf98a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.769 2 DEBUG nova.network.os_vif_util [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "ca4e45b7-a42b-4e47-80d8-749194caf98a", "address": "fa:16:3e:7e:6b:bd", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapca4e45b7-a4", "ovs_interfaceid": "ca4e45b7-a42b-4e47-80d8-749194caf98a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.769 2 DEBUG nova.network.os_vif_util [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:6b:bd,bridge_name='br-int',has_traffic_filtering=True,id=ca4e45b7-a42b-4e47-80d8-749194caf98a,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca4e45b7-a4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.770 2 DEBUG os_vif [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:6b:bd,bridge_name='br-int',has_traffic_filtering=True,id=ca4e45b7-a42b-4e47-80d8-749194caf98a,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca4e45b7-a4') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.770 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.771 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.771 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '05e3604c-43d4-556c-880d-c45ed32da082', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.777 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca4e45b7-a4, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.778 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapca4e45b7-a4, col_values=(('qos', UUID('29bc35bd-25eb-4f6b-b8af-b5521e8eb12b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.778 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapca4e45b7-a4, col_values=(('external_ids', {'iface-id': 'ca4e45b7-a42b-4e47-80d8-749194caf98a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7e:6b:bd', 'vm-uuid': 'cbbd84c1-d174-40d7-be54-3123704f0e0b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:20:48 compute-1 NetworkManager[45549]: <info>  [1759256448.7811] manager: (tapca4e45b7-a4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.792 2 INFO os_vif [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:6b:bd,bridge_name='br-int',has_traffic_filtering=True,id=ca4e45b7-a42b-4e47-80d8-749194caf98a,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca4e45b7-a4')
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.792 2 DEBUG nova.virt.libvirt.driver [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.792 2 DEBUG nova.compute.manager [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpczj9za1h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='cbbd84c1-d174-40d7-be54-3123704f0e0b',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.793 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:48 compute-1 nova_compute[238822]: 2025-09-30 18:20:48.881 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:49 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a88000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:49 compute-1 openstack_network_exporter[251957]: ERROR   18:20:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:20:49 compute-1 openstack_network_exporter[251957]: ERROR   18:20:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:20:49 compute-1 openstack_network_exporter[251957]: ERROR   18:20:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:20:49 compute-1 openstack_network_exporter[251957]: ERROR   18:20:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:20:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:20:49 compute-1 openstack_network_exporter[251957]: ERROR   18:20:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:20:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:20:49 compute-1 nova_compute[238822]: 2025-09-30 18:20:49.439 2 DEBUG nova.network.neutron [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Port ca4e45b7-a42b-4e47-80d8-749194caf98a updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:20:49 compute-1 nova_compute[238822]: 2025-09-30 18:20:49.454 2 DEBUG nova.compute.manager [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpczj9za1h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='cbbd84c1-d174-40d7-be54-3123704f0e0b',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:20:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:49.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:50 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:20:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:50.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:20:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:51 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94004530 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:51 compute-1 ceph-mon[75484]: pgmap v1266: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 321 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Sep 30 18:20:51 compute-1 ovn_controller[135204]: 2025-09-30T18:20:51Z|00104|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Sep 30 18:20:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:20:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:51.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:20:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:52 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5abc002010 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:52 compute-1 sshd-session[278361]: Invalid user sanjay from 167.172.43.167 port 55246
Sep 30 18:20:52 compute-1 sshd-session[278361]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:20:52 compute-1 sshd-session[278361]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167
Sep 30 18:20:52 compute-1 systemd[1]: Starting libvirt proxy daemon...
Sep 30 18:20:52 compute-1 systemd[1]: Started libvirt proxy daemon.
Sep 30 18:20:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:20:52 compute-1 sudo[278365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:20:52 compute-1 sudo[278365]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:20:52 compute-1 sudo[278365]: pam_unix(sudo:session): session closed for user root
Sep 30 18:20:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:52.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:52 compute-1 kernel: tapca4e45b7-a4: entered promiscuous mode
Sep 30 18:20:52 compute-1 NetworkManager[45549]: <info>  [1759256452.4618] manager: (tapca4e45b7-a4): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Sep 30 18:20:52 compute-1 systemd-udevd[278417]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:20:52 compute-1 ovn_controller[135204]: 2025-09-30T18:20:52Z|00105|binding|INFO|Claiming lport ca4e45b7-a42b-4e47-80d8-749194caf98a for this additional chassis.
Sep 30 18:20:52 compute-1 ovn_controller[135204]: 2025-09-30T18:20:52Z|00106|binding|INFO|ca4e45b7-a42b-4e47-80d8-749194caf98a: Claiming fa:16:3e:7e:6b:bd 10.100.0.7
Sep 30 18:20:52 compute-1 nova_compute[238822]: 2025-09-30 18:20:52.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.544 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:6b:bd 10.100.0.7'], port_security=['fa:16:3e:7e:6b:bd 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'cbbd84c1-d174-40d7-be54-3123704f0e0b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-443be7ca-f628-4a45-95b6-620d37172d7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af4ef07c582847419a03275af50c6ffc', 'neutron:revision_number': '10', 'neutron:security_group_ids': '518a9c00-28f9-47ab-a122-e672192eedea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=96eb21b8-879c-4e72-963b-37e37ae3d0c5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ca4e45b7-a42b-4e47-80d8-749194caf98a) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.545 144543 INFO neutron.agent.ovn.metadata.agent [-] Port ca4e45b7-a42b-4e47-80d8-749194caf98a in datapath 443be7ca-f628-4a45-95b6-620d37172d7b unbound from our chassis
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.547 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 443be7ca-f628-4a45-95b6-620d37172d7b
Sep 30 18:20:52 compute-1 NetworkManager[45549]: <info>  [1759256452.5500] device (tapca4e45b7-a4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:20:52 compute-1 NetworkManager[45549]: <info>  [1759256452.5521] device (tapca4e45b7-a4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:20:52 compute-1 ovn_controller[135204]: 2025-09-30T18:20:52Z|00107|binding|INFO|Setting lport ca4e45b7-a42b-4e47-80d8-749194caf98a ovn-installed in OVS
Sep 30 18:20:52 compute-1 nova_compute[238822]: 2025-09-30 18:20:52.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:52 compute-1 systemd-machined[195911]: New machine qemu-9-instance-0000000c.
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.568 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ffcee377-390d-4d67-b29e-00a188046a84]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:52 compute-1 systemd[1]: Started Virtual Machine qemu-9-instance-0000000c.
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.607 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[92e1f00c-ed37-4574-8c2a-b947007c0d33]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.611 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[5d30e115-fe25-4da3-833f-a75abeb942ce]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:52 compute-1 nova_compute[238822]: 2025-09-30 18:20:52.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.659 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[cd57638f-b409-4531-ab4c-e3e927b82e0b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.685 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[60fbea16-4b0d-4d29-9588-4d7142543352]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap443be7ca-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:7f:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1426302, 'reachable_time': 20408, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278433, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.702 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[32a961ec-2921-4b2d-904b-acdbf6cd5085]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap443be7ca-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1426319, 'tstamp': 1426319}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278435, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap443be7ca-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1426324, 'tstamp': 1426324}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278435, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.704 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap443be7ca-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:52 compute-1 nova_compute[238822]: 2025-09-30 18:20:52.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.709 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap443be7ca-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:52 compute-1 nova_compute[238822]: 2025-09-30 18:20:52.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.709 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.710 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap443be7ca-f0, col_values=(('external_ids', {'iface-id': '031d2cff-b142-4423-ba99-772183b7a667'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.710 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:20:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:52.711 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0208a4ef-7069-4497-aa0f-f695b9a01878]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-443be7ca-f628-4a45-95b6-620d37172d7b\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 443be7ca-f628-4a45-95b6-620d37172d7b\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:20:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:53 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a880016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:53 compute-1 ceph-mon[75484]: pgmap v1267: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 14 KiB/s wr, 1 op/s
Sep 30 18:20:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:20:53 compute-1 nova_compute[238822]: 2025-09-30 18:20:53.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:53.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:54 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:54 compute-1 sshd-session[278361]: Failed password for invalid user sanjay from 167.172.43.167 port 55246 ssh2
Sep 30 18:20:54 compute-1 ceph-mon[75484]: pgmap v1268: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.6 KiB/s rd, 14 KiB/s wr, 6 op/s
Sep 30 18:20:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:54.373 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:20:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:54.373 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:20:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:20:54.374 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:20:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:20:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:54.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:20:54 compute-1 sshd-session[278361]: Received disconnect from 167.172.43.167 port 55246:11: Bye Bye [preauth]
Sep 30 18:20:54 compute-1 sshd-session[278361]: Disconnected from invalid user sanjay 167.172.43.167 port 55246 [preauth]
Sep 30 18:20:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:55 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94004530 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:55 compute-1 ovn_controller[135204]: 2025-09-30T18:20:55Z|00108|binding|INFO|Claiming lport ca4e45b7-a42b-4e47-80d8-749194caf98a for this chassis.
Sep 30 18:20:55 compute-1 ovn_controller[135204]: 2025-09-30T18:20:55Z|00109|binding|INFO|ca4e45b7-a42b-4e47-80d8-749194caf98a: Claiming fa:16:3e:7e:6b:bd 10.100.0.7
Sep 30 18:20:55 compute-1 ovn_controller[135204]: 2025-09-30T18:20:55Z|00110|binding|INFO|Setting lport ca4e45b7-a42b-4e47-80d8-749194caf98a up in Southbound
Sep 30 18:20:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:55.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:56 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5abc002010 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:56.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:56 compute-1 nova_compute[238822]: 2025-09-30 18:20:56.594 2 INFO nova.compute.manager [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Post operation of migration started
Sep 30 18:20:56 compute-1 nova_compute[238822]: 2025-09-30 18:20:56.595 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:56 compute-1 podman[278483]: 2025-09-30 18:20:56.59668352 +0000 UTC m=+0.139727455 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller)
Sep 30 18:20:56 compute-1 podman[278484]: 2025-09-30 18:20:56.597773819 +0000 UTC m=+0.127037882 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:20:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:57 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a880016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:57 compute-1 nova_compute[238822]: 2025-09-30 18:20:57.126 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:57 compute-1 nova_compute[238822]: 2025-09-30 18:20:57.127 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:57 compute-1 nova_compute[238822]: 2025-09-30 18:20:57.223 2 DEBUG oslo_concurrency.lockutils [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-cbbd84c1-d174-40d7-be54-3123704f0e0b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:20:57 compute-1 nova_compute[238822]: 2025-09-30 18:20:57.223 2 DEBUG oslo_concurrency.lockutils [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-cbbd84c1-d174-40d7-be54-3123704f0e0b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:20:57 compute-1 nova_compute[238822]: 2025-09-30 18:20:57.223 2 DEBUG nova.network.neutron [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:20:57 compute-1 ceph-mon[75484]: pgmap v1269: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.4 KiB/s rd, 2.1 KiB/s wr, 5 op/s
Sep 30 18:20:57 compute-1 nova_compute[238822]: 2025-09-30 18:20:57.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:57 compute-1 nova_compute[238822]: 2025-09-30 18:20:57.732 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:57.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:58 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2873177461' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:20:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2873177461' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:20:58 compute-1 ceph-mon[75484]: pgmap v1270: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.4 KiB/s rd, 2.1 KiB/s wr, 5 op/s
Sep 30 18:20:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:20:58.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:20:58 compute-1 nova_compute[238822]: 2025-09-30 18:20:58.448 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:20:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:20:58 compute-1 nova_compute[238822]: 2025-09-30 18:20:58.679 2 DEBUG nova.network.neutron [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Updating instance_info_cache with network_info: [{"id": "ca4e45b7-a42b-4e47-80d8-749194caf98a", "address": "fa:16:3e:7e:6b:bd", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca4e45b7-a4", "ovs_interfaceid": "ca4e45b7-a42b-4e47-80d8-749194caf98a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:20:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:58 compute-1 nova_compute[238822]: 2025-09-30 18:20:58.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:20:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:20:59 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:20:59 compute-1 nova_compute[238822]: 2025-09-30 18:20:59.193 2 DEBUG oslo_concurrency.lockutils [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-cbbd84c1-d174-40d7-be54-3123704f0e0b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:20:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:20:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:20:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:20:59 compute-1 nova_compute[238822]: 2025-09-30 18:20:59.832 2 DEBUG oslo_concurrency.lockutils [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:20:59 compute-1 nova_compute[238822]: 2025-09-30 18:20:59.833 2 DEBUG oslo_concurrency.lockutils [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:20:59 compute-1 nova_compute[238822]: 2025-09-30 18:20:59.833 2 DEBUG oslo_concurrency.lockutils [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:20:59 compute-1 nova_compute[238822]: 2025-09-30 18:20:59.840 2 INFO nova.virt.libvirt.driver [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:20:59 compute-1 virtqemud[239124]: Domain id=9 name='instance-0000000c' uuid=cbbd84c1-d174-40d7-be54-3123704f0e0b is tainted: custom-monitor
Sep 30 18:20:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:20:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:20:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:20:59.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:00 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5abc002010 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:00.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:00 compute-1 nova_compute[238822]: 2025-09-30 18:21:00.852 2 INFO nova.virt.libvirt.driver [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:21:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:01 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a880016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:01 compute-1 ceph-mon[75484]: pgmap v1271: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.6 KiB/s rd, 3.1 KiB/s wr, 6 op/s
Sep 30 18:21:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:01.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:01 compute-1 nova_compute[238822]: 2025-09-30 18:21:01.861 2 INFO nova.virt.libvirt.driver [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:21:01 compute-1 nova_compute[238822]: 2025-09-30 18:21:01.867 2 DEBUG nova.compute.manager [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:21:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:02 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94004530 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:02 compute-1 ceph-mon[75484]: pgmap v1272: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.6 KiB/s rd, 1.1 KiB/s wr, 5 op/s
Sep 30 18:21:02 compute-1 nova_compute[238822]: 2025-09-30 18:21:02.379 2 DEBUG nova.objects.instance [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:21:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:02.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:02 compute-1 nova_compute[238822]: 2025-09-30 18:21:02.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:02 compute-1 sshd-session[278414]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:21:02 compute-1 sshd-session[278414]: banner exchange: Connection from 113.249.93.94 port 18654: Connection timed out
Sep 30 18:21:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:03 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5abc002010 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:03 compute-1 nova_compute[238822]: 2025-09-30 18:21:03.406 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:21:03 compute-1 nova_compute[238822]: 2025-09-30 18:21:03.516 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:21:03 compute-1 nova_compute[238822]: 2025-09-30 18:21:03.516 2 WARNING neutronclient.v2_0.client [None req-5ff3dc87-0e66-4db3-a7ee-76ad4b78a511 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:21:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:21:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:03 compute-1 nova_compute[238822]: 2025-09-30 18:21:03.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:03.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:04 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:04 compute-1 ceph-mon[75484]: pgmap v1273: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 5.1 KiB/s rd, 8.8 KiB/s wr, 7 op/s
Sep 30 18:21:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:04.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:04 compute-1 podman[278539]: 2025-09-30 18:21:04.535315489 +0000 UTC m=+0.074482403 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 18:21:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:05 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:05 compute-1 podman[249638]: time="2025-09-30T18:21:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:21:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/565731300' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:21:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:21:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39666 "" "Go-http-client/1.1"
Sep 30 18:21:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:21:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9298 "" "Go-http-client/1.1"
Sep 30 18:21:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:05.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:06 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94004530 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:06.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:06 compute-1 ceph-mon[75484]: pgmap v1274: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 8.7 KiB/s wr, 2 op/s
Sep 30 18:21:06 compute-1 unix_chkpwd[278563]: password check failed for user (root)
Sep 30 18:21:06 compute-1 sshd-session[278560]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:21:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:07 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5abc0095a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:07 compute-1 nova_compute[238822]: 2025-09-30 18:21:07.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:21:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:07.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:07 compute-1 nova_compute[238822]: 2025-09-30 18:21:07.933 2 DEBUG oslo_concurrency.lockutils [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "cbd42725-3e20-4602-8e30-926ab9bb7865" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:21:07 compute-1 nova_compute[238822]: 2025-09-30 18:21:07.934 2 DEBUG oslo_concurrency.lockutils [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:21:07 compute-1 nova_compute[238822]: 2025-09-30 18:21:07.934 2 DEBUG oslo_concurrency.lockutils [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:21:07 compute-1 nova_compute[238822]: 2025-09-30 18:21:07.934 2 DEBUG oslo_concurrency.lockutils [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:21:07 compute-1 nova_compute[238822]: 2025-09-30 18:21:07.934 2 DEBUG oslo_concurrency.lockutils [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:21:07 compute-1 nova_compute[238822]: 2025-09-30 18:21:07.946 2 INFO nova.compute.manager [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Terminating instance
Sep 30 18:21:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:08 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a88002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:08.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:08 compute-1 sshd-session[278560]: Failed password for root from 192.210.160.141 port 47630 ssh2
Sep 30 18:21:08 compute-1 nova_compute[238822]: 2025-09-30 18:21:08.463 2 DEBUG nova.compute.manager [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:21:08 compute-1 kernel: tap90c2d4fe-d5 (unregistering): left promiscuous mode
Sep 30 18:21:08 compute-1 NetworkManager[45549]: <info>  [1759256468.5275] device (tap90c2d4fe-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:21:08 compute-1 nova_compute[238822]: 2025-09-30 18:21:08.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:08 compute-1 ovn_controller[135204]: 2025-09-30T18:21:08Z|00111|binding|INFO|Releasing lport 90c2d4fe-d570-4e9f-b84c-fde6838367d8 from this chassis (sb_readonly=0)
Sep 30 18:21:08 compute-1 ovn_controller[135204]: 2025-09-30T18:21:08Z|00112|binding|INFO|Setting lport 90c2d4fe-d570-4e9f-b84c-fde6838367d8 down in Southbound
Sep 30 18:21:08 compute-1 ovn_controller[135204]: 2025-09-30T18:21:08Z|00113|binding|INFO|Removing iface tap90c2d4fe-d5 ovn-installed in OVS
Sep 30 18:21:08 compute-1 nova_compute[238822]: 2025-09-30 18:21:08.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.551 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:03:b0 10.100.0.9'], port_security=['fa:16:3e:08:03:b0 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbd42725-3e20-4602-8e30-926ab9bb7865', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-443be7ca-f628-4a45-95b6-620d37172d7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af4ef07c582847419a03275af50c6ffc', 'neutron:revision_number': '5', 'neutron:security_group_ids': '518a9c00-28f9-47ab-a122-e672192eedea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=96eb21b8-879c-4e72-963b-37e37ae3d0c5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=90c2d4fe-d570-4e9f-b84c-fde6838367d8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.553 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 90c2d4fe-d570-4e9f-b84c-fde6838367d8 in datapath 443be7ca-f628-4a45-95b6-620d37172d7b unbound from our chassis
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.557 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 443be7ca-f628-4a45-95b6-620d37172d7b
Sep 30 18:21:08 compute-1 nova_compute[238822]: 2025-09-30 18:21:08.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.587 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[067b6b42-c7f8-4bff-95fa-6a9ebcbd0491]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:08 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Sep 30 18:21:08 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000d.scope: Consumed 15.594s CPU time.
Sep 30 18:21:08 compute-1 systemd-machined[195911]: Machine qemu-8-instance-0000000d terminated.
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.635 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[ef284c7f-97cb-4385-8600-8f168ce85c73]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.641 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[d80bbf08-33dd-4808-b065-81acaa4fc678]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:21:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2676924550' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:21:08 compute-1 ceph-mon[75484]: pgmap v1275: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 8.7 KiB/s wr, 2 op/s
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.687 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[7abffcdf-87e8-43aa-b1b4-df7f23a26552]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.708 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f0e6e694-d91b-4f4c-a747-38202b29c15d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap443be7ca-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:7f:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1426302, 'reachable_time': 20408, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278581, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:08 compute-1 nova_compute[238822]: 2025-09-30 18:21:08.711 2 INFO nova.virt.libvirt.driver [-] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Instance destroyed successfully.
Sep 30 18:21:08 compute-1 nova_compute[238822]: 2025-09-30 18:21:08.712 2 DEBUG nova.objects.instance [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lazy-loading 'resources' on Instance uuid cbd42725-3e20-4602-8e30-926ab9bb7865 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.735 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[dd2e38d1-4b0d-4a4a-abba-c87542d498da]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap443be7ca-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1426319, 'tstamp': 1426319}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278590, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap443be7ca-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1426324, 'tstamp': 1426324}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278590, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.738 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap443be7ca-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:21:08 compute-1 nova_compute[238822]: 2025-09-30 18:21:08.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:08 compute-1 nova_compute[238822]: 2025-09-30 18:21:08.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.747 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap443be7ca-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.747 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.748 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap443be7ca-f0, col_values=(('external_ids', {'iface-id': '031d2cff-b142-4423-ba99-772183b7a667'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.748 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:21:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:08.750 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[09c608d6-e216-4317-983f-ea4f9f950079]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-443be7ca-f628-4a45-95b6-620d37172d7b\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 443be7ca-f628-4a45-95b6-620d37172d7b\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:08 compute-1 nova_compute[238822]: 2025-09-30 18:21:08.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:09 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004340 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.220 2 DEBUG nova.virt.libvirt.vif [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:20:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-326780236',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-326780236',id=13,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:20:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='af4ef07c582847419a03275af50c6ffc',ramdisk_id='',reservation_id='r-gp2cac14',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:20:23Z,user_data=None,user_id='57be6c3d2e0d431dae0127ac659de1e0',uuid=cbd42725-3e20-4602-8e30-926ab9bb7865,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "address": "fa:16:3e:08:03:b0", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c2d4fe-d5", "ovs_interfaceid": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.220 2 DEBUG nova.network.os_vif_util [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converting VIF {"id": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "address": "fa:16:3e:08:03:b0", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90c2d4fe-d5", "ovs_interfaceid": "90c2d4fe-d570-4e9f-b84c-fde6838367d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.221 2 DEBUG nova.network.os_vif_util [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=90c2d4fe-d570-4e9f-b84c-fde6838367d8,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c2d4fe-d5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.222 2 DEBUG os_vif [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=90c2d4fe-d570-4e9f-b84c-fde6838367d8,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c2d4fe-d5') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.225 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90c2d4fe-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.230 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=58e9d7cb-88c8-40c6-a74f-cbc3f3179086) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.238 2 DEBUG nova.compute.manager [req-ee2c2fbc-e6fa-43ab-bdc4-1d21beaf8ff4 req-f4d9f334-81d0-4a27-bbca-08392c1a10ac 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Received event network-vif-unplugged-90c2d4fe-d570-4e9f-b84c-fde6838367d8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.238 2 DEBUG oslo_concurrency.lockutils [req-ee2c2fbc-e6fa-43ab-bdc4-1d21beaf8ff4 req-f4d9f334-81d0-4a27-bbca-08392c1a10ac 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.239 2 DEBUG oslo_concurrency.lockutils [req-ee2c2fbc-e6fa-43ab-bdc4-1d21beaf8ff4 req-f4d9f334-81d0-4a27-bbca-08392c1a10ac 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.239 2 DEBUG oslo_concurrency.lockutils [req-ee2c2fbc-e6fa-43ab-bdc4-1d21beaf8ff4 req-f4d9f334-81d0-4a27-bbca-08392c1a10ac 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.239 2 DEBUG nova.compute.manager [req-ee2c2fbc-e6fa-43ab-bdc4-1d21beaf8ff4 req-f4d9f334-81d0-4a27-bbca-08392c1a10ac 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] No waiting events found dispatching network-vif-unplugged-90c2d4fe-d570-4e9f-b84c-fde6838367d8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.240 2 DEBUG nova.compute.manager [req-ee2c2fbc-e6fa-43ab-bdc4-1d21beaf8ff4 req-f4d9f334-81d0-4a27-bbca-08392c1a10ac 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Received event network-vif-unplugged-90c2d4fe-d570-4e9f-b84c-fde6838367d8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.241 2 INFO os_vif [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=90c2d4fe-d570-4e9f-b84c-fde6838367d8,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90c2d4fe-d5')
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.717 2 INFO nova.virt.libvirt.driver [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Deleting instance files /var/lib/nova/instances/cbd42725-3e20-4602-8e30-926ab9bb7865_del
Sep 30 18:21:09 compute-1 nova_compute[238822]: 2025-09-30 18:21:09.718 2 INFO nova.virt.libvirt.driver [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Deletion of /var/lib/nova/instances/cbd42725-3e20-4602-8e30-926ab9bb7865_del complete
Sep 30 18:21:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:09 compute-1 sshd-session[278560]: Connection closed by authenticating user root 192.210.160.141 port 47630 [preauth]
Sep 30 18:21:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:09.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:10 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94004530 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:10 compute-1 nova_compute[238822]: 2025-09-30 18:21:10.235 2 INFO nova.compute.manager [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Took 1.77 seconds to destroy the instance on the hypervisor.
Sep 30 18:21:10 compute-1 nova_compute[238822]: 2025-09-30 18:21:10.237 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:21:10 compute-1 nova_compute[238822]: 2025-09-30 18:21:10.237 2 DEBUG nova.compute.manager [-] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:21:10 compute-1 nova_compute[238822]: 2025-09-30 18:21:10.237 2 DEBUG nova.network.neutron [-] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:21:10 compute-1 nova_compute[238822]: 2025-09-30 18:21:10.238 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:21:10 compute-1 ceph-mon[75484]: pgmap v1276: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 11 KiB/s wr, 2 op/s
Sep 30 18:21:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:10.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:11 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5abc0095a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:11 compute-1 nova_compute[238822]: 2025-09-30 18:21:11.132 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:21:11 compute-1 nova_compute[238822]: 2025-09-30 18:21:11.282 2 DEBUG nova.compute.manager [req-df057eb1-2703-4e55-a414-fcc64a76a0fe req-db17c892-cb37-453a-bee2-180eb129655f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Received event network-vif-unplugged-90c2d4fe-d570-4e9f-b84c-fde6838367d8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:21:11 compute-1 nova_compute[238822]: 2025-09-30 18:21:11.282 2 DEBUG oslo_concurrency.lockutils [req-df057eb1-2703-4e55-a414-fcc64a76a0fe req-db17c892-cb37-453a-bee2-180eb129655f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:21:11 compute-1 nova_compute[238822]: 2025-09-30 18:21:11.283 2 DEBUG oslo_concurrency.lockutils [req-df057eb1-2703-4e55-a414-fcc64a76a0fe req-db17c892-cb37-453a-bee2-180eb129655f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:21:11 compute-1 nova_compute[238822]: 2025-09-30 18:21:11.283 2 DEBUG oslo_concurrency.lockutils [req-df057eb1-2703-4e55-a414-fcc64a76a0fe req-db17c892-cb37-453a-bee2-180eb129655f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:21:11 compute-1 nova_compute[238822]: 2025-09-30 18:21:11.283 2 DEBUG nova.compute.manager [req-df057eb1-2703-4e55-a414-fcc64a76a0fe req-db17c892-cb37-453a-bee2-180eb129655f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] No waiting events found dispatching network-vif-unplugged-90c2d4fe-d570-4e9f-b84c-fde6838367d8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:21:11 compute-1 nova_compute[238822]: 2025-09-30 18:21:11.283 2 DEBUG nova.compute.manager [req-df057eb1-2703-4e55-a414-fcc64a76a0fe req-db17c892-cb37-453a-bee2-180eb129655f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Received event network-vif-unplugged-90c2d4fe-d570-4e9f-b84c-fde6838367d8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:21:11 compute-1 podman[278617]: 2025-09-30 18:21:11.572603244 +0000 UTC m=+0.099464557 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, tcib_build_tag=watcher_latest)
Sep 30 18:21:11 compute-1 podman[278615]: 2025-09-30 18:21:11.573084277 +0000 UTC m=+0.108813400 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930)
Sep 30 18:21:11 compute-1 podman[278616]: 2025-09-30 18:21:11.586237632 +0000 UTC m=+0.112423097 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 18:21:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:11.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:12 compute-1 nova_compute[238822]: 2025-09-30 18:21:12.041 2 DEBUG nova.network.neutron [-] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:21:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:12 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a88002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:12.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:12 compute-1 sudo[278677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:21:12 compute-1 sudo[278677]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:21:12 compute-1 sudo[278677]: pam_unix(sudo:session): session closed for user root
Sep 30 18:21:12 compute-1 nova_compute[238822]: 2025-09-30 18:21:12.548 2 INFO nova.compute.manager [-] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Took 2.31 seconds to deallocate network for instance.
Sep 30 18:21:12 compute-1 nova_compute[238822]: 2025-09-30 18:21:12.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:13 compute-1 nova_compute[238822]: 2025-09-30 18:21:13.074 2 DEBUG oslo_concurrency.lockutils [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:21:13 compute-1 nova_compute[238822]: 2025-09-30 18:21:13.075 2 DEBUG oslo_concurrency.lockutils [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:21:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:13 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004360 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:13 compute-1 nova_compute[238822]: 2025-09-30 18:21:13.148 2 DEBUG oslo_concurrency.processutils [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:21:13 compute-1 ceph-mon[75484]: pgmap v1277: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 10 KiB/s wr, 2 op/s
Sep 30 18:21:13 compute-1 nova_compute[238822]: 2025-09-30 18:21:13.360 2 DEBUG nova.compute.manager [req-ca3816ea-3714-42c9-a918-ba53e1a53d7f req-7e63ed47-34f7-4d95-b93b-bac1edb2ad0f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbd42725-3e20-4602-8e30-926ab9bb7865] Received event network-vif-deleted-90c2d4fe-d570-4e9f-b84c-fde6838367d8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:21:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:21:13 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2323369020' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:21:13 compute-1 nova_compute[238822]: 2025-09-30 18:21:13.658 2 DEBUG oslo_concurrency.processutils [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:21:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:21:13 compute-1 nova_compute[238822]: 2025-09-30 18:21:13.668 2 DEBUG nova.compute.provider_tree [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:21:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:13 compute-1 sshd-session[278703]: Invalid user trial from 194.107.115.65 port 25484
Sep 30 18:21:13 compute-1 sshd-session[278703]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:21:13 compute-1 sshd-session[278703]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 18:21:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:13.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:14 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5ab8004360 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:14 compute-1 nova_compute[238822]: 2025-09-30 18:21:14.189 2 DEBUG nova.scheduler.client.report [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:21:14 compute-1 nova_compute[238822]: 2025-09-30 18:21:14.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:14 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2323369020' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:21:14 compute-1 ceph-mon[75484]: pgmap v1278: 353 pgs: 353 active+clean; 121 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 11 KiB/s wr, 30 op/s
Sep 30 18:21:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:14.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:14 compute-1 nova_compute[238822]: 2025-09-30 18:21:14.950 2 DEBUG oslo_concurrency.lockutils [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.875s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:21:14 compute-1 nova_compute[238822]: 2025-09-30 18:21:14.984 2 INFO nova.scheduler.client.report [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Deleted allocations for instance cbd42725-3e20-4602-8e30-926ab9bb7865
Sep 30 18:21:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:15 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5abc0095a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:15 compute-1 sshd-session[278703]: Failed password for invalid user trial from 194.107.115.65 port 25484 ssh2
Sep 30 18:21:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:15.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:16 compute-1 nova_compute[238822]: 2025-09-30 18:21:16.021 2 DEBUG oslo_concurrency.lockutils [None req-60761c1d-d8b3-4653-beed-898ddd2a3f38 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cbd42725-3e20-4602-8e30-926ab9bb7865" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.087s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:21:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:16 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a88003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:16.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:16 compute-1 nova_compute[238822]: 2025-09-30 18:21:16.828 2 DEBUG oslo_concurrency.lockutils [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "cbbd84c1-d174-40d7-be54-3123704f0e0b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:21:16 compute-1 nova_compute[238822]: 2025-09-30 18:21:16.829 2 DEBUG oslo_concurrency.lockutils [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cbbd84c1-d174-40d7-be54-3123704f0e0b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:21:16 compute-1 nova_compute[238822]: 2025-09-30 18:21:16.830 2 DEBUG oslo_concurrency.lockutils [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "cbbd84c1-d174-40d7-be54-3123704f0e0b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:21:16 compute-1 nova_compute[238822]: 2025-09-30 18:21:16.830 2 DEBUG oslo_concurrency.lockutils [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cbbd84c1-d174-40d7-be54-3123704f0e0b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:21:16 compute-1 nova_compute[238822]: 2025-09-30 18:21:16.831 2 DEBUG oslo_concurrency.lockutils [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cbbd84c1-d174-40d7-be54-3123704f0e0b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:21:16 compute-1 nova_compute[238822]: 2025-09-30 18:21:16.849 2 INFO nova.compute.manager [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Terminating instance
Sep 30 18:21:17 compute-1 kernel: ganesha.nfsd[277297]: segfault at 50 ip 00007f5b6a67c32e sp 00007f5b1effc210 error 4 in libntirpc.so.5.8[7f5b6a661000+2c000] likely on CPU 4 (core 0, socket 4)
Sep 30 18:21:17 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 18:21:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[276964]: 30/09/2025 18:21:17 : epoch 68dc1f32 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5a94004530 fd 37 proxy ignored for local
Sep 30 18:21:17 compute-1 systemd[1]: Started Process Core Dump (PID 278732/UID 0).
Sep 30 18:21:17 compute-1 ceph-mon[75484]: pgmap v1279: 353 pgs: 353 active+clean; 121 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 3.5 KiB/s wr, 28 op/s
Sep 30 18:21:17 compute-1 sshd-session[278703]: Received disconnect from 194.107.115.65 port 25484:11: Bye Bye [preauth]
Sep 30 18:21:17 compute-1 sshd-session[278703]: Disconnected from invalid user trial 194.107.115.65 port 25484 [preauth]
Sep 30 18:21:17 compute-1 nova_compute[238822]: 2025-09-30 18:21:17.369 2 DEBUG nova.compute.manager [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:21:17 compute-1 kernel: tapca4e45b7-a4 (unregistering): left promiscuous mode
Sep 30 18:21:17 compute-1 NetworkManager[45549]: <info>  [1759256477.4352] device (tapca4e45b7-a4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:21:17 compute-1 ovn_controller[135204]: 2025-09-30T18:21:17Z|00114|binding|INFO|Releasing lport ca4e45b7-a42b-4e47-80d8-749194caf98a from this chassis (sb_readonly=0)
Sep 30 18:21:17 compute-1 ovn_controller[135204]: 2025-09-30T18:21:17Z|00115|binding|INFO|Setting lport ca4e45b7-a42b-4e47-80d8-749194caf98a down in Southbound
Sep 30 18:21:17 compute-1 ovn_controller[135204]: 2025-09-30T18:21:17Z|00116|binding|INFO|Removing iface tapca4e45b7-a4 ovn-installed in OVS
Sep 30 18:21:17 compute-1 nova_compute[238822]: 2025-09-30 18:21:17.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:17 compute-1 nova_compute[238822]: 2025-09-30 18:21:17.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:17.457 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:6b:bd 10.100.0.7'], port_security=['fa:16:3e:7e:6b:bd 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'cbbd84c1-d174-40d7-be54-3123704f0e0b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-443be7ca-f628-4a45-95b6-620d37172d7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af4ef07c582847419a03275af50c6ffc', 'neutron:revision_number': '14', 'neutron:security_group_ids': '518a9c00-28f9-47ab-a122-e672192eedea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=96eb21b8-879c-4e72-963b-37e37ae3d0c5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=ca4e45b7-a42b-4e47-80d8-749194caf98a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:21:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:17.458 144543 INFO neutron.agent.ovn.metadata.agent [-] Port ca4e45b7-a42b-4e47-80d8-749194caf98a in datapath 443be7ca-f628-4a45-95b6-620d37172d7b unbound from our chassis
Sep 30 18:21:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:17.460 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 443be7ca-f628-4a45-95b6-620d37172d7b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:21:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:17.462 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[aac3c9ce-a8dd-4fe9-b4b2-af40af8437b5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:17.462 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b namespace which is not needed anymore
Sep 30 18:21:17 compute-1 nova_compute[238822]: 2025-09-30 18:21:17.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:17 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Sep 30 18:21:17 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Consumed 2.945s CPU time.
Sep 30 18:21:17 compute-1 systemd-machined[195911]: Machine qemu-9-instance-0000000c terminated.
Sep 30 18:21:17 compute-1 nova_compute[238822]: 2025-09-30 18:21:17.582 2 DEBUG nova.compute.manager [req-380c227f-e994-4719-ba90-95815b87031e req-5c499196-4069-4ebf-8d52-a1f8ec447dd2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Received event network-vif-unplugged-ca4e45b7-a42b-4e47-80d8-749194caf98a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:21:17 compute-1 nova_compute[238822]: 2025-09-30 18:21:17.585 2 DEBUG oslo_concurrency.lockutils [req-380c227f-e994-4719-ba90-95815b87031e req-5c499196-4069-4ebf-8d52-a1f8ec447dd2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "cbbd84c1-d174-40d7-be54-3123704f0e0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:21:17 compute-1 nova_compute[238822]: 2025-09-30 18:21:17.585 2 DEBUG oslo_concurrency.lockutils [req-380c227f-e994-4719-ba90-95815b87031e req-5c499196-4069-4ebf-8d52-a1f8ec447dd2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cbbd84c1-d174-40d7-be54-3123704f0e0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:21:17 compute-1 nova_compute[238822]: 2025-09-30 18:21:17.586 2 DEBUG oslo_concurrency.lockutils [req-380c227f-e994-4719-ba90-95815b87031e req-5c499196-4069-4ebf-8d52-a1f8ec447dd2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cbbd84c1-d174-40d7-be54-3123704f0e0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:21:17 compute-1 nova_compute[238822]: 2025-09-30 18:21:17.586 2 DEBUG nova.compute.manager [req-380c227f-e994-4719-ba90-95815b87031e req-5c499196-4069-4ebf-8d52-a1f8ec447dd2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] No waiting events found dispatching network-vif-unplugged-ca4e45b7-a42b-4e47-80d8-749194caf98a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:21:17 compute-1 nova_compute[238822]: 2025-09-30 18:21:17.586 2 DEBUG nova.compute.manager [req-380c227f-e994-4719-ba90-95815b87031e req-5c499196-4069-4ebf-8d52-a1f8ec447dd2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Received event network-vif-unplugged-ca4e45b7-a42b-4e47-80d8-749194caf98a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:21:17 compute-1 nova_compute[238822]: 2025-09-30 18:21:17.621 2 INFO nova.virt.libvirt.driver [-] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Instance destroyed successfully.
Sep 30 18:21:17 compute-1 nova_compute[238822]: 2025-09-30 18:21:17.622 2 DEBUG nova.objects.instance [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lazy-loading 'resources' on Instance uuid cbbd84c1-d174-40d7-be54-3123704f0e0b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:21:17 compute-1 nova_compute[238822]: 2025-09-30 18:21:17.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:17 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[277996]: [NOTICE]   (278000) : haproxy version is 3.0.5-8e879a5
Sep 30 18:21:17 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[277996]: [NOTICE]   (278000) : path to executable is /usr/sbin/haproxy
Sep 30 18:21:17 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[277996]: [WARNING]  (278000) : Exiting Master process...
Sep 30 18:21:17 compute-1 podman[278758]: 2025-09-30 18:21:17.66133425 +0000 UTC m=+0.051994855 container kill d6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:21:17 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[277996]: [ALERT]    (278000) : Current worker (278002) exited with code 143 (Terminated)
Sep 30 18:21:17 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[277996]: [WARNING]  (278000) : All workers exited. Exiting... (0)
Sep 30 18:21:17 compute-1 systemd[1]: libpod-d6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6.scope: Deactivated successfully.
Sep 30 18:21:17 compute-1 podman[278784]: 2025-09-30 18:21:17.714753413 +0000 UTC m=+0.031659826 container died d6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 18:21:17 compute-1 systemd[1]: var-lib-containers-storage-overlay-f68484b35ca4d985713451cb30c5168888a5cb45ddb6d6ec9894fbb717056a77-merged.mount: Deactivated successfully.
Sep 30 18:21:17 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6-userdata-shm.mount: Deactivated successfully.
Sep 30 18:21:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:17.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.131 2 DEBUG nova.virt.libvirt.vif [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1407640112',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1407640112',id=12,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:20:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='af4ef07c582847419a03275af50c6ffc',ramdisk_id='',reservation_id='r-mj5oq7ov',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:21:02Z,user_data=None,user_id='57be6c3d2e0d431dae0127ac659de1e0',uuid=cbbd84c1-d174-40d7-be54-3123704f0e0b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ca4e45b7-a42b-4e47-80d8-749194caf98a", "address": "fa:16:3e:7e:6b:bd", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca4e45b7-a4", "ovs_interfaceid": "ca4e45b7-a42b-4e47-80d8-749194caf98a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.131 2 DEBUG nova.network.os_vif_util [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converting VIF {"id": "ca4e45b7-a42b-4e47-80d8-749194caf98a", "address": "fa:16:3e:7e:6b:bd", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca4e45b7-a4", "ovs_interfaceid": "ca4e45b7-a42b-4e47-80d8-749194caf98a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.132 2 DEBUG nova.network.os_vif_util [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7e:6b:bd,bridge_name='br-int',has_traffic_filtering=True,id=ca4e45b7-a42b-4e47-80d8-749194caf98a,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca4e45b7-a4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.132 2 DEBUG os_vif [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7e:6b:bd,bridge_name='br-int',has_traffic_filtering=True,id=ca4e45b7-a42b-4e47-80d8-749194caf98a,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca4e45b7-a4') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.135 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca4e45b7-a4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.140 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=29bc35bd-25eb-4f6b-b8af-b5521e8eb12b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.144 2 INFO os_vif [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7e:6b:bd,bridge_name='br-int',has_traffic_filtering=True,id=ca4e45b7-a42b-4e47-80d8-749194caf98a,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca4e45b7-a4')
Sep 30 18:21:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:18.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:18 compute-1 ceph-mon[75484]: pgmap v1280: 353 pgs: 353 active+clean; 121 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 3.5 KiB/s wr, 28 op/s
Sep 30 18:21:18 compute-1 podman[278784]: 2025-09-30 18:21:18.665753368 +0000 UTC m=+0.982659801 container cleanup d6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 18:21:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:21:18 compute-1 systemd[1]: libpod-conmon-d6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6.scope: Deactivated successfully.
Sep 30 18:21:18 compute-1 systemd-coredump[278733]: Process 276968 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 55:
                                                    #0  0x00007f5b6a67c32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Sep 30 18:21:18 compute-1 podman[278786]: 2025-09-30 18:21:18.693453966 +0000 UTC m=+0.997262165 container remove d6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:21:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:18.703 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6703a00a-fbde-4cf6-83d1-38b43167f125]: (4, ("Tue Sep 30 06:21:17 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b (d6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6)\nd6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6\nTue Sep 30 06:21:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b (d6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6)\nd6bc33d6c1689187fae45c18122b279675d4a29a7ffa4c6f50be7558285287b6\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:18.705 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[255ea511-141f-4ccf-889d-0ef12f92e183]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:18.706 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:21:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:18.706 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[52d95d3a-603e-4d5d-98a4-77583a381fbe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:18.707 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap443be7ca-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:18 compute-1 kernel: tap443be7ca-f0: left promiscuous mode
Sep 30 18:21:18 compute-1 nova_compute[238822]: 2025-09-30 18:21:18.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:18.774 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[1e34a100-e7ab-43c7-b559-ef76bd4a5445]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:18.815 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[60faa4f6-d735-4d1d-9b91-86d004152955]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:18.816 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[85621c27-5a0a-4a13-b02b-6f14629e7bc7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:18.831 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[626ce670-c1f2-4dda-a795-cfb5129ec6a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1426292, 'reachable_time': 20513, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278835, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:18 compute-1 systemd[1]: systemd-coredump@12-278732-0.service: Deactivated successfully.
Sep 30 18:21:18 compute-1 systemd[1]: systemd-coredump@12-278732-0.service: Consumed 1.336s CPU time.
Sep 30 18:21:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:18.838 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:21:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:18.839 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[bbce9aed-e934-4e4f-b6c3-f4aa5d6e9d3d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:21:18 compute-1 systemd[1]: run-netns-ovnmeta\x2d443be7ca\x2df628\x2d4a45\x2d95b6\x2d620d37172d7b.mount: Deactivated successfully.
Sep 30 18:21:18 compute-1 podman[278839]: 2025-09-30 18:21:18.910964211 +0000 UTC m=+0.035220192 container died af1980dec28c276e20bfb40094eaf89ce54439d134ca84ea11e889fc0235989a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Sep 30 18:21:18 compute-1 systemd[1]: var-lib-containers-storage-overlay-3dbd792853c30aec2f71562e426b9a8ca8c04825371ca80f454c5cd0c700dc09-merged.mount: Deactivated successfully.
Sep 30 18:21:18 compute-1 podman[278839]: 2025-09-30 18:21:18.957788836 +0000 UTC m=+0.082044727 container remove af1980dec28c276e20bfb40094eaf89ce54439d134ca84ea11e889fc0235989a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 18:21:18 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 18:21:19 compute-1 nova_compute[238822]: 2025-09-30 18:21:19.097 2 INFO nova.virt.libvirt.driver [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Deleting instance files /var/lib/nova/instances/cbbd84c1-d174-40d7-be54-3123704f0e0b_del
Sep 30 18:21:19 compute-1 nova_compute[238822]: 2025-09-30 18:21:19.098 2 INFO nova.virt.libvirt.driver [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Deletion of /var/lib/nova/instances/cbbd84c1-d174-40d7-be54-3123704f0e0b_del complete
Sep 30 18:21:19 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 18:21:19 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.964s CPU time.
Sep 30 18:21:19 compute-1 openstack_network_exporter[251957]: ERROR   18:21:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:21:19 compute-1 openstack_network_exporter[251957]: ERROR   18:21:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:21:19 compute-1 openstack_network_exporter[251957]: ERROR   18:21:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:21:19 compute-1 openstack_network_exporter[251957]: ERROR   18:21:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:21:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:21:19 compute-1 openstack_network_exporter[251957]: ERROR   18:21:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:21:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:21:19 compute-1 nova_compute[238822]: 2025-09-30 18:21:19.613 2 INFO nova.compute.manager [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Took 2.24 seconds to destroy the instance on the hypervisor.
Sep 30 18:21:19 compute-1 nova_compute[238822]: 2025-09-30 18:21:19.614 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:21:19 compute-1 nova_compute[238822]: 2025-09-30 18:21:19.615 2 DEBUG nova.compute.manager [-] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:21:19 compute-1 nova_compute[238822]: 2025-09-30 18:21:19.615 2 DEBUG nova.network.neutron [-] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:21:19 compute-1 nova_compute[238822]: 2025-09-30 18:21:19.615 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:21:19 compute-1 nova_compute[238822]: 2025-09-30 18:21:19.674 2 DEBUG nova.compute.manager [req-e9550029-e721-4ed6-8b56-5cf43f4c2c9b req-432d534c-ae2c-4220-8f68-e528c566cae2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Received event network-vif-unplugged-ca4e45b7-a42b-4e47-80d8-749194caf98a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:21:19 compute-1 nova_compute[238822]: 2025-09-30 18:21:19.674 2 DEBUG oslo_concurrency.lockutils [req-e9550029-e721-4ed6-8b56-5cf43f4c2c9b req-432d534c-ae2c-4220-8f68-e528c566cae2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "cbbd84c1-d174-40d7-be54-3123704f0e0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:21:19 compute-1 nova_compute[238822]: 2025-09-30 18:21:19.675 2 DEBUG oslo_concurrency.lockutils [req-e9550029-e721-4ed6-8b56-5cf43f4c2c9b req-432d534c-ae2c-4220-8f68-e528c566cae2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cbbd84c1-d174-40d7-be54-3123704f0e0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:21:19 compute-1 nova_compute[238822]: 2025-09-30 18:21:19.675 2 DEBUG oslo_concurrency.lockutils [req-e9550029-e721-4ed6-8b56-5cf43f4c2c9b req-432d534c-ae2c-4220-8f68-e528c566cae2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cbbd84c1-d174-40d7-be54-3123704f0e0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:21:19 compute-1 nova_compute[238822]: 2025-09-30 18:21:19.675 2 DEBUG nova.compute.manager [req-e9550029-e721-4ed6-8b56-5cf43f4c2c9b req-432d534c-ae2c-4220-8f68-e528c566cae2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] No waiting events found dispatching network-vif-unplugged-ca4e45b7-a42b-4e47-80d8-749194caf98a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:21:19 compute-1 nova_compute[238822]: 2025-09-30 18:21:19.676 2 DEBUG nova.compute.manager [req-e9550029-e721-4ed6-8b56-5cf43f4c2c9b req-432d534c-ae2c-4220-8f68-e528c566cae2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Received event network-vif-unplugged-ca4e45b7-a42b-4e47-80d8-749194caf98a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:21:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:19.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:20 compute-1 nova_compute[238822]: 2025-09-30 18:21:20.124 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:21:20 compute-1 ceph-mon[75484]: pgmap v1281: 353 pgs: 353 active+clean; 121 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 3.5 KiB/s wr, 31 op/s
Sep 30 18:21:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:20.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:20 compute-1 nova_compute[238822]: 2025-09-30 18:21:20.938 2 DEBUG nova.network.neutron [-] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:21:21 compute-1 nova_compute[238822]: 2025-09-30 18:21:21.450 2 INFO nova.compute.manager [-] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Took 1.84 seconds to deallocate network for instance.
Sep 30 18:21:21 compute-1 nova_compute[238822]: 2025-09-30 18:21:21.734 2 DEBUG nova.compute.manager [req-f69f5559-964e-400e-af6a-53ef28661a6f req-1b318a51-dce4-4bac-9fbe-c268e3935e9d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cbbd84c1-d174-40d7-be54-3123704f0e0b] Received event network-vif-deleted-ca4e45b7-a42b-4e47-80d8-749194caf98a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:21:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:21.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:21 compute-1 nova_compute[238822]: 2025-09-30 18:21:21.975 2 DEBUG oslo_concurrency.lockutils [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:21:21 compute-1 nova_compute[238822]: 2025-09-30 18:21:21.976 2 DEBUG oslo_concurrency.lockutils [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:21:21 compute-1 nova_compute[238822]: 2025-09-30 18:21:21.982 2 DEBUG oslo_concurrency.lockutils [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:21:22 compute-1 nova_compute[238822]: 2025-09-30 18:21:22.030 2 INFO nova.scheduler.client.report [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Deleted allocations for instance cbbd84c1-d174-40d7-be54-3123704f0e0b
Sep 30 18:21:22 compute-1 ceph-mon[75484]: pgmap v1282: 353 pgs: 353 active+clean; 121 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 1.2 KiB/s wr, 30 op/s
Sep 30 18:21:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:21:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:22.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:22 compute-1 nova_compute[238822]: 2025-09-30 18:21:22.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:23 compute-1 nova_compute[238822]: 2025-09-30 18:21:23.062 2 DEBUG oslo_concurrency.lockutils [None req-50274ab5-fb2a-4cb7-a0a5-d8c57a5c9856 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cbbd84c1-d174-40d7-be54-3123704f0e0b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.233s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:21:23 compute-1 nova_compute[238822]: 2025-09-30 18:21:23.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:21:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:23.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/182124 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 18:21:24 compute-1 ceph-mon[75484]: pgmap v1283: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:21:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:24.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:25.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:26 compute-1 nova_compute[238822]: 2025-09-30 18:21:26.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:21:26 compute-1 nova_compute[238822]: 2025-09-30 18:21:26.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:21:26 compute-1 ceph-mon[75484]: pgmap v1284: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:21:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:26.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:27 compute-1 podman[278893]: 2025-09-30 18:21:27.569793782 +0000 UTC m=+0.099753895 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:21:27 compute-1 podman[278892]: 2025-09-30 18:21:27.619153305 +0000 UTC m=+0.150949468 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 18:21:27 compute-1 nova_compute[238822]: 2025-09-30 18:21:27.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:27.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:28 compute-1 nova_compute[238822]: 2025-09-30 18:21:28.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:28 compute-1 ceph-mon[75484]: pgmap v1285: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:21:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:28.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:21:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:29 compute-1 nova_compute[238822]: 2025-09-30 18:21:29.059 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:21:29 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 13.
Sep 30 18:21:29 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 18:21:29 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.964s CPU time.
Sep 30 18:21:29 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b...
Sep 30 18:21:29 compute-1 podman[278993]: 2025-09-30 18:21:29.722517714 +0000 UTC m=+0.043019363 container create c4878726e70bfde0a701d3cab4ed3c428775ea55102c78e2208dd6046355746d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 18:21:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24343fc14e1d66f24f030c759105dc2bf3925fa5050c074fb2434d26d0cfbbfd/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Sep 30 18:21:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24343fc14e1d66f24f030c759105dc2bf3925fa5050c074fb2434d26d0cfbbfd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 18:21:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24343fc14e1d66f24f030c759105dc2bf3925fa5050c074fb2434d26d0cfbbfd/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 18:21:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24343fc14e1d66f24f030c759105dc2bf3925fa5050c074fb2434d26d0cfbbfd/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bsnzkg-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Sep 30 18:21:29 compute-1 podman[278993]: 2025-09-30 18:21:29.786991045 +0000 UTC m=+0.107492774 container init c4878726e70bfde0a701d3cab4ed3c428775ea55102c78e2208dd6046355746d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 18:21:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:29 compute-1 podman[278993]: 2025-09-30 18:21:29.798330682 +0000 UTC m=+0.118832361 container start c4878726e70bfde0a701d3cab4ed3c428775ea55102c78e2208dd6046355746d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True)
Sep 30 18:21:29 compute-1 podman[278993]: 2025-09-30 18:21:29.704040285 +0000 UTC m=+0.024541954 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 18:21:29 compute-1 bash[278993]: c4878726e70bfde0a701d3cab4ed3c428775ea55102c78e2208dd6046355746d
Sep 30 18:21:29 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 18:21:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:29 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Sep 30 18:21:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:29 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Sep 30 18:21:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:29 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Sep 30 18:21:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:29 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Sep 30 18:21:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:29 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Sep 30 18:21:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:29 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Sep 30 18:21:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:29 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Sep 30 18:21:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:21:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:29.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:21:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:29 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Sep 30 18:21:30 compute-1 ceph-mon[75484]: pgmap v1286: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:21:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:30.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:30 compute-1 sshd[170789]: drop connection #1 from [110.42.70.108]:33896 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 18:21:31 compute-1 nova_compute[238822]: 2025-09-30 18:21:31.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:21:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:31.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:32 compute-1 unix_chkpwd[279055]: password check failed for user (root)
Sep 30 18:21:32 compute-1 sshd-session[279051]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:21:32 compute-1 ceph-mon[75484]: pgmap v1287: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 16 KiB/s rd, 1.2 KiB/s wr, 24 op/s
Sep 30 18:21:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:32.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:32 compute-1 sudo[279056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:21:32 compute-1 sudo[279056]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:21:32 compute-1 sudo[279056]: pam_unix(sudo:session): session closed for user root
Sep 30 18:21:32 compute-1 nova_compute[238822]: 2025-09-30 18:21:32.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:33 compute-1 nova_compute[238822]: 2025-09-30 18:21:33.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2861977476' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:21:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:21:33 compute-1 sshd-session[279081]: Invalid user reelforge from 175.126.165.170 port 50580
Sep 30 18:21:33 compute-1 sshd-session[279081]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:21:33 compute-1 sshd-session[279081]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 18:21:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:33.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:34 compute-1 sshd-session[279051]: Failed password for root from 192.210.160.141 port 44988 ssh2
Sep 30 18:21:34 compute-1 nova_compute[238822]: 2025-09-30 18:21:34.054 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:21:34 compute-1 ceph-mon[75484]: pgmap v1288: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.6 KiB/s wr, 26 op/s
Sep 30 18:21:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:34.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:35 compute-1 unix_chkpwd[279088]: password check failed for user (root)
Sep 30 18:21:35 compute-1 sshd-session[279084]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110  user=root
Sep 30 18:21:35 compute-1 nova_compute[238822]: 2025-09-30 18:21:35.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:21:35 compute-1 sshd-session[279051]: Connection closed by authenticating user root 192.210.160.141 port 44988 [preauth]
Sep 30 18:21:35 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2315415786' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:21:35 compute-1 podman[279089]: 2025-09-30 18:21:35.572438507 +0000 UTC m=+0.098052039 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:21:35 compute-1 nova_compute[238822]: 2025-09-30 18:21:35.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:21:35 compute-1 nova_compute[238822]: 2025-09-30 18:21:35.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:21:35 compute-1 nova_compute[238822]: 2025-09-30 18:21:35.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:21:35 compute-1 nova_compute[238822]: 2025-09-30 18:21:35.574 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:21:35 compute-1 nova_compute[238822]: 2025-09-30 18:21:35.574 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:21:35 compute-1 podman[249638]: time="2025-09-30T18:21:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:21:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:21:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:21:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:21:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8829 "" "Go-http-client/1.1"
Sep 30 18:21:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:35.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:35 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Sep 30 18:21:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:35 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Sep 30 18:21:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:21:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/904585573' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:21:36 compute-1 nova_compute[238822]: 2025-09-30 18:21:36.075 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:21:36 compute-1 sshd-session[279081]: Failed password for invalid user reelforge from 175.126.165.170 port 50580 ssh2
Sep 30 18:21:36 compute-1 nova_compute[238822]: 2025-09-30 18:21:36.292 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:21:36 compute-1 nova_compute[238822]: 2025-09-30 18:21:36.294 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:21:36 compute-1 nova_compute[238822]: 2025-09-30 18:21:36.329 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:21:36 compute-1 nova_compute[238822]: 2025-09-30 18:21:36.330 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4720MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:21:36 compute-1 nova_compute[238822]: 2025-09-30 18:21:36.331 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:21:36 compute-1 nova_compute[238822]: 2025-09-30 18:21:36.332 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:21:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/904585573' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:21:36 compute-1 ceph-mon[75484]: pgmap v1289: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:21:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:36.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:37 compute-1 sshd-session[279081]: Received disconnect from 175.126.165.170 port 50580:11: Bye Bye [preauth]
Sep 30 18:21:37 compute-1 sshd-session[279081]: Disconnected from invalid user reelforge 175.126.165.170 port 50580 [preauth]
Sep 30 18:21:37 compute-1 nova_compute[238822]: 2025-09-30 18:21:37.399 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:21:37 compute-1 nova_compute[238822]: 2025-09-30 18:21:37.399 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:21:36 up  3:58,  0 user,  load average: 0.41, 0.54, 0.90\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:21:37 compute-1 nova_compute[238822]: 2025-09-30 18:21:37.419 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:21:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2895475536' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:21:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2895475536' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:21:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:21:37 compute-1 nova_compute[238822]: 2025-09-30 18:21:37.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:37 compute-1 sshd-session[279084]: Failed password for root from 14.225.167.110 port 37738 ssh2
Sep 30 18:21:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:37.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:37 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:21:37 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/447436546' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:21:37 compute-1 nova_compute[238822]: 2025-09-30 18:21:37.944 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.524s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:21:37 compute-1 nova_compute[238822]: 2025-09-30 18:21:37.952 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:21:38 compute-1 sshd-session[279084]: Received disconnect from 14.225.167.110 port 37738:11: Bye Bye [preauth]
Sep 30 18:21:38 compute-1 sshd-session[279084]: Disconnected from authenticating user root 14.225.167.110 port 37738 [preauth]
Sep 30 18:21:38 compute-1 nova_compute[238822]: 2025-09-30 18:21:38.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:38 compute-1 nova_compute[238822]: 2025-09-30 18:21:38.462 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:21:38 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/447436546' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:21:38 compute-1 ceph-mon[75484]: pgmap v1290: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Sep 30 18:21:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:38.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:21:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:38 compute-1 nova_compute[238822]: 2025-09-30 18:21:38.975 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:21:38 compute-1 nova_compute[238822]: 2025-09-30 18:21:38.975 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.643s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:21:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:39.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:40 compute-1 unix_chkpwd[279162]: password check failed for user (root)
Sep 30 18:21:40 compute-1 sshd-session[279159]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161  user=root
Sep 30 18:21:40 compute-1 ceph-mon[75484]: pgmap v1291: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:21:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:40.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:40 compute-1 nova_compute[238822]: 2025-09-30 18:21:40.972 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:21:40 compute-1 nova_compute[238822]: 2025-09-30 18:21:40.973 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:21:40 compute-1 nova_compute[238822]: 2025-09-30 18:21:40.973 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:21:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:41.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:41 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Sep 30 18:21:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:41 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Sep 30 18:21:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:41 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Sep 30 18:21:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:41 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Sep 30 18:21:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:41 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Sep 30 18:21:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:41 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Sep 30 18:21:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:41 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Sep 30 18:21:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:41 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:21:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:41 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:21:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:41 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Sep 30 18:21:42 compute-1 nova_compute[238822]: 2025-09-30 18:21:42.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:42 compute-1 ceph-mon[75484]: pgmap v1292: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s
Sep 30 18:21:42 compute-1 sshd-session[279159]: Failed password for root from 216.10.242.161 port 44318 ssh2
Sep 30 18:21:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:42.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:42 compute-1 podman[279181]: 2025-09-30 18:21:42.570047162 +0000 UTC m=+0.093265340 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible)
Sep 30 18:21:42 compute-1 podman[279180]: 2025-09-30 18:21:42.581549533 +0000 UTC m=+0.106422095 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Sep 30 18:21:42 compute-1 podman[279179]: 2025-09-30 18:21:42.591649516 +0000 UTC m=+0.119632442 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 18:21:42 compute-1 nova_compute[238822]: 2025-09-30 18:21:42.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:43 compute-1 sshd-session[279159]: Received disconnect from 216.10.242.161 port 44318:11: Bye Bye [preauth]
Sep 30 18:21:43 compute-1 sshd-session[279159]: Disconnected from authenticating user root 216.10.242.161 port 44318 [preauth]
Sep 30 18:21:43 compute-1 nova_compute[238822]: 2025-09-30 18:21:43.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:43 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0000da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:21:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:43.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:43 compute-1 sudo[279237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:21:43 compute-1 sudo[279237]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:21:43 compute-1 sudo[279237]: pam_unix(sudo:session): session closed for user root
Sep 30 18:21:44 compute-1 sudo[279262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 18:21:44 compute-1 sudo[279262]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:21:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/182144 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Sep 30 18:21:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:44 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:44 compute-1 ceph-mon[75484]: pgmap v1293: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 1.7 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 18:21:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:44.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:44 compute-1 podman[279364]: 2025-09-30 18:21:44.932971962 +0000 UTC m=+0.097086963 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Sep 30 18:21:45 compute-1 podman[279364]: 2025-09-30 18:21:45.068474802 +0000 UTC m=+0.232589753 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Sep 30 18:21:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:45 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Sep 30 18:21:45 compute-1 unix_chkpwd[279457]: password check failed for user (root)
Sep 30 18:21:45 compute-1 sshd-session[279315]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58  user=root
Sep 30 18:21:45 compute-1 podman[279484]: 2025-09-30 18:21:45.783797691 +0000 UTC m=+0.145016248 container exec 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 18:21:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:45 compute-1 podman[279484]: 2025-09-30 18:21:45.822380473 +0000 UTC m=+0.183599060 container exec_died 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 18:21:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:45.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:46 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:46 compute-1 podman[279576]: 2025-09-30 18:21:46.313508188 +0000 UTC m=+0.079195300 container exec c4878726e70bfde0a701d3cab4ed3c428775ea55102c78e2208dd6046355746d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Sep 30 18:21:46 compute-1 podman[279576]: 2025-09-30 18:21:46.354935856 +0000 UTC m=+0.120622968 container exec_died c4878726e70bfde0a701d3cab4ed3c428775ea55102c78e2208dd6046355746d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Sep 30 18:21:46 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3815262515' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:21:46 compute-1 ceph-mon[75484]: pgmap v1294: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:21:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:46.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:46 compute-1 podman[279645]: 2025-09-30 18:21:46.72678139 +0000 UTC m=+0.097102334 container exec 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 18:21:46 compute-1 podman[279645]: 2025-09-30 18:21:46.744124448 +0000 UTC m=+0.114445312 container exec_died 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 18:21:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:47 compute-1 podman[279711]: 2025-09-30 18:21:47.045022995 +0000 UTC m=+0.089294813 container exec 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1793, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2)
Sep 30 18:21:47 compute-1 podman[279711]: 2025-09-30 18:21:47.067129972 +0000 UTC m=+0.111401780 container exec_died 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, architecture=x86_64, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, vcs-type=git, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1793, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, name=keepalived, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph.)
Sep 30 18:21:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:47 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0000da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:47 compute-1 sudo[279262]: pam_unix(sudo:session): session closed for user root
Sep 30 18:21:47 compute-1 sudo[279781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:21:47 compute-1 sudo[279781]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:21:47 compute-1 sudo[279781]: pam_unix(sudo:session): session closed for user root
Sep 30 18:21:47 compute-1 sudo[279806]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:21:47 compute-1 sudo[279806]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:21:47 compute-1 nova_compute[238822]: 2025-09-30 18:21:47.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:47 compute-1 sshd-session[279315]: Failed password for root from 84.51.43.58 port 53334 ssh2
Sep 30 18:21:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:47.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:48 compute-1 nova_compute[238822]: 2025-09-30 18:21:48.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:48 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:48 compute-1 sudo[279806]: pam_unix(sudo:session): session closed for user root
Sep 30 18:21:48 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:21:48 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:21:48 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Sep 30 18:21:48 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:21:48 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:21:48 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:21:48 compute-1 ceph-mon[75484]: pgmap v1295: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:21:48 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:21:48 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:21:48 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:21:48 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:21:48 compute-1 sshd-session[279315]: Received disconnect from 84.51.43.58 port 53334:11: Bye Bye [preauth]
Sep 30 18:21:48 compute-1 sshd-session[279315]: Disconnected from authenticating user root 84.51.43.58 port 53334 [preauth]
Sep 30 18:21:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:48.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:21:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:49 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:49 compute-1 openstack_network_exporter[251957]: ERROR   18:21:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:21:49 compute-1 openstack_network_exporter[251957]: ERROR   18:21:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:21:49 compute-1 openstack_network_exporter[251957]: ERROR   18:21:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:21:49 compute-1 openstack_network_exporter[251957]: ERROR   18:21:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:21:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:21:49 compute-1 openstack_network_exporter[251957]: ERROR   18:21:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:21:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:21:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:49.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:50 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:50.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:51 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0001ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:51 compute-1 ceph-mon[75484]: pgmap v1296: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Sep 30 18:21:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:51.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:21:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:52 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:21:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:52.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:52 compute-1 nova_compute[238822]: 2025-09-30 18:21:52.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:52 compute-1 sudo[279869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:21:52 compute-1 sudo[279869]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:21:52 compute-1 sudo[279869]: pam_unix(sudo:session): session closed for user root
Sep 30 18:21:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:53 compute-1 nova_compute[238822]: 2025-09-30 18:21:53.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:53 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:53 compute-1 ceph-mon[75484]: pgmap v1297: 353 pgs: 353 active+clean; 41 MiB data, 242 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Sep 30 18:21:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/377580213' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:21:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:21:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:53 compute-1 sudo[279895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:21:53 compute-1 sudo[279895]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:21:53 compute-1 sudo[279895]: pam_unix(sudo:session): session closed for user root
Sep 30 18:21:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:21:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:53.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:21:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:54 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:54.375 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:21:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:54.375 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:21:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:54.376 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:21:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:54.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:54 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:21:54 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:21:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4008188633' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:21:54 compute-1 ceph-mon[75484]: pgmap v1298: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:21:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:55 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0001ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:55.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:56 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:56 compute-1 ceph-mon[75484]: pgmap v1299: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:21:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:56.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:56 compute-1 unix_chkpwd[279926]: password check failed for user (daemon)
Sep 30 18:21:56 compute-1 sshd-session[279923]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=daemon
Sep 30 18:21:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:57 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2571417486' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:21:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2571417486' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:21:57 compute-1 nova_compute[238822]: 2025-09-30 18:21:57.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:57.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:58 compute-1 nova_compute[238822]: 2025-09-30 18:21:58.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:58 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:58 compute-1 sshd-session[279923]: Failed password for daemon from 192.210.160.141 port 58466 ssh2
Sep 30 18:21:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:21:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:21:58.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:21:58 compute-1 podman[279930]: 2025-09-30 18:21:58.568954419 +0000 UTC m=+0.099683303 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:21:58 compute-1 nova_compute[238822]: 2025-09-30 18:21:58.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:21:58 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:58.577 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:21:58 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:21:58.578 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:21:58 compute-1 ceph-mon[75484]: pgmap v1300: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:21:58 compute-1 podman[279929]: 2025-09-30 18:21:58.64120621 +0000 UTC m=+0.176366724 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Sep 30 18:21:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:21:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:58 compute-1 sshd-session[279923]: Connection closed by authenticating user daemon 192.210.160.141 port 58466 [preauth]
Sep 30 18:21:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:21:59 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0001ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:21:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:21:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:21:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:21:59 compute-1 sshd-session[279949]: Invalid user nodeuser from 103.153.190.105 port 58782
Sep 30 18:21:59 compute-1 sshd-session[279949]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:21:59 compute-1 sshd-session[279949]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:21:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:21:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:21:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:21:59.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:00 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac003450 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:00 compute-1 ceph-mon[75484]: pgmap v1301: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Sep 30 18:22:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:00.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:01 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac003450 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:01.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:02 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:02 compute-1 ceph-mon[75484]: pgmap v1302: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Sep 30 18:22:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:02.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:02 compute-1 sshd-session[279949]: Failed password for invalid user nodeuser from 103.153.190.105 port 58782 ssh2
Sep 30 18:22:02 compute-1 nova_compute[238822]: 2025-09-30 18:22:02.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:03 compute-1 nova_compute[238822]: 2025-09-30 18:22:03.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:03 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0001ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:22:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:03.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:04 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac003450 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:04 compute-1 sshd-session[279949]: Received disconnect from 103.153.190.105 port 58782:11: Bye Bye [preauth]
Sep 30 18:22:04 compute-1 sshd-session[279949]: Disconnected from invalid user nodeuser 103.153.190.105 port 58782 [preauth]
Sep 30 18:22:04 compute-1 ceph-mon[75484]: pgmap v1303: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Sep 30 18:22:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:04.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:04 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:04.580 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:04 compute-1 nova_compute[238822]: 2025-09-30 18:22:04.921 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "6410d0a2-466d-41d9-a863-f756714e17c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:22:04 compute-1 nova_compute[238822]: 2025-09-30 18:22:04.922 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:22:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:05 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:05 compute-1 nova_compute[238822]: 2025-09-30 18:22:05.429 2 DEBUG nova.compute.manager [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:22:05 compute-1 podman[249638]: time="2025-09-30T18:22:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:22:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:22:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:22:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:22:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8830 "" "Go-http-client/1.1"
Sep 30 18:22:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:05.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:05 compute-1 nova_compute[238822]: 2025-09-30 18:22:05.975 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:22:05 compute-1 nova_compute[238822]: 2025-09-30 18:22:05.975 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:22:05 compute-1 nova_compute[238822]: 2025-09-30 18:22:05.985 2 DEBUG nova.virt.hardware [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:22:05 compute-1 nova_compute[238822]: 2025-09-30 18:22:05.986 2 INFO nova.compute.claims [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:22:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:06 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:06 compute-1 ceph-mon[75484]: pgmap v1304: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:22:06 compute-1 podman[279992]: 2025-09-30 18:22:06.547876856 +0000 UTC m=+0.090414733 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 18:22:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:06.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:07 compute-1 nova_compute[238822]: 2025-09-30 18:22:07.038 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:22:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:07 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0001ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:22:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:22:07 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1822497323' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:22:07 compute-1 nova_compute[238822]: 2025-09-30 18:22:07.504 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:22:07 compute-1 nova_compute[238822]: 2025-09-30 18:22:07.514 2 DEBUG nova.compute.provider_tree [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:22:07 compute-1 nova_compute[238822]: 2025-09-30 18:22:07.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:07.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:08 compute-1 nova_compute[238822]: 2025-09-30 18:22:08.024 2 DEBUG nova.scheduler.client.report [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:22:08 compute-1 nova_compute[238822]: 2025-09-30 18:22:08.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:08 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac004550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1822497323' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:22:08 compute-1 ceph-mon[75484]: pgmap v1305: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:22:08 compute-1 nova_compute[238822]: 2025-09-30 18:22:08.537 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.561s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:22:08 compute-1 nova_compute[238822]: 2025-09-30 18:22:08.537 2 DEBUG nova.compute.manager [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:22:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:08.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:22:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:09 compute-1 nova_compute[238822]: 2025-09-30 18:22:09.061 2 DEBUG nova.compute.manager [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:22:09 compute-1 nova_compute[238822]: 2025-09-30 18:22:09.062 2 DEBUG nova.network.neutron [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:22:09 compute-1 nova_compute[238822]: 2025-09-30 18:22:09.063 2 WARNING neutronclient.v2_0.client [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:09 compute-1 nova_compute[238822]: 2025-09-30 18:22:09.063 2 WARNING neutronclient.v2_0.client [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:09 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:09 compute-1 nova_compute[238822]: 2025-09-30 18:22:09.572 2 INFO nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:22:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:09 compute-1 nova_compute[238822]: 2025-09-30 18:22:09.888 2 DEBUG nova.network.neutron [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Successfully created port: 9230ef6a-67da-4be6-9a8f-3fa248015ba4 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:22:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:09.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:10 compute-1 nova_compute[238822]: 2025-09-30 18:22:10.081 2 DEBUG nova.compute.manager [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:22:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:10 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:10 compute-1 ceph-mon[75484]: pgmap v1306: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:22:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:10.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.104 2 DEBUG nova.compute.manager [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.106 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.107 2 INFO nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Creating image(s)
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.137 2 DEBUG nova.storage.rbd_utils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image 6410d0a2-466d-41d9-a863-f756714e17c5_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.168 2 DEBUG nova.storage.rbd_utils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image 6410d0a2-466d-41d9-a863-f756714e17c5_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:22:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:11 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0003390 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.204 2 DEBUG nova.storage.rbd_utils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image 6410d0a2-466d-41d9-a863-f756714e17c5_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.209 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:22:11 compute-1 sshd-session[279986]: Connection closed by 113.249.93.94 port 33096 [preauth]
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.286 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.287 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.287 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.288 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.315 2 DEBUG nova.storage.rbd_utils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image 6410d0a2-466d-41d9-a863-f756714e17c5_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.321 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 6410d0a2-466d-41d9-a863-f756714e17c5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.639 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 6410d0a2-466d-41d9-a863-f756714e17c5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.318s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.731 2 DEBUG nova.network.neutron [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Successfully updated port: 9230ef6a-67da-4be6-9a8f-3fa248015ba4 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.741 2 DEBUG nova.storage.rbd_utils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] resizing rbd image 6410d0a2-466d-41d9-a863-f756714e17c5_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.793 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "refresh_cache-6410d0a2-466d-41d9-a863-f756714e17c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.793 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquired lock "refresh_cache-6410d0a2-466d-41d9-a863-f756714e17c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.794 2 DEBUG nova.network.neutron [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.797 2 DEBUG nova.compute.manager [req-111c6c99-16bd-4f8f-b86b-d990cb304085 req-857d6ea5-6f7f-4193-a5b0-6d1aa328cf6e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Received event network-changed-9230ef6a-67da-4be6-9a8f-3fa248015ba4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.797 2 DEBUG nova.compute.manager [req-111c6c99-16bd-4f8f-b86b-d990cb304085 req-857d6ea5-6f7f-4193-a5b0-6d1aa328cf6e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Refreshing instance network info cache due to event network-changed-9230ef6a-67da-4be6-9a8f-3fa248015ba4. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.798 2 DEBUG oslo_concurrency.lockutils [req-111c6c99-16bd-4f8f-b86b-d990cb304085 req-857d6ea5-6f7f-4193-a5b0-6d1aa328cf6e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-6410d0a2-466d-41d9-a863-f756714e17c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:22:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.873 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.874 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Ensure instance console log exists: /var/lib/nova/instances/6410d0a2-466d-41d9-a863-f756714e17c5/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.875 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.875 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:22:11 compute-1 nova_compute[238822]: 2025-09-30 18:22:11.876 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:22:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:11.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:12 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac004550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:12 compute-1 ceph-mon[75484]: pgmap v1307: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 63 op/s
Sep 30 18:22:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:12.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:12 compute-1 nova_compute[238822]: 2025-09-30 18:22:12.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:12 compute-1 sudo[280206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:22:12 compute-1 sudo[280206]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:22:12 compute-1 sudo[280206]: pam_unix(sudo:session): session closed for user root
Sep 30 18:22:13 compute-1 podman[280232]: 2025-09-30 18:22:13.024926691 +0000 UTC m=+0.110760393 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, architecture=x86_64, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 18:22:13 compute-1 podman[280233]: 2025-09-30 18:22:13.024320904 +0000 UTC m=+0.106415335 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Sep 30 18:22:13 compute-1 podman[280231]: 2025-09-30 18:22:13.033519313 +0000 UTC m=+0.124197085 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid)
Sep 30 18:22:13 compute-1 nova_compute[238822]: 2025-09-30 18:22:13.143 2 DEBUG nova.network.neutron [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:22:13 compute-1 nova_compute[238822]: 2025-09-30 18:22:13.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:13 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:13 compute-1 nova_compute[238822]: 2025-09-30 18:22:13.323 2 WARNING neutronclient.v2_0.client [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:22:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:13.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:14 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.272 2 DEBUG nova.network.neutron [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Updating instance_info_cache with network_info: [{"id": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "address": "fa:16:3e:5c:4d:33", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9230ef6a-67", "ovs_interfaceid": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:22:14 compute-1 ceph-mon[75484]: pgmap v1308: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 154 op/s
Sep 30 18:22:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:22:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:14.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.780 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Releasing lock "refresh_cache-6410d0a2-466d-41d9-a863-f756714e17c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.780 2 DEBUG nova.compute.manager [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Instance network_info: |[{"id": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "address": "fa:16:3e:5c:4d:33", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9230ef6a-67", "ovs_interfaceid": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.781 2 DEBUG oslo_concurrency.lockutils [req-111c6c99-16bd-4f8f-b86b-d990cb304085 req-857d6ea5-6f7f-4193-a5b0-6d1aa328cf6e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-6410d0a2-466d-41d9-a863-f756714e17c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.781 2 DEBUG nova.network.neutron [req-111c6c99-16bd-4f8f-b86b-d990cb304085 req-857d6ea5-6f7f-4193-a5b0-6d1aa328cf6e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Refreshing network info cache for port 9230ef6a-67da-4be6-9a8f-3fa248015ba4 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.784 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Start _get_guest_xml network_info=[{"id": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "address": "fa:16:3e:5c:4d:33", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9230ef6a-67", "ovs_interfaceid": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.789 2 WARNING nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.790 2 DEBUG nova.virt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1149902309', uuid='6410d0a2-466d-41d9-a863-f756714e17c5'), owner=OwnerMeta(userid='57be6c3d2e0d431dae0127ac659de1e0', username='tempest-TestExecuteHostMaintenanceStrategy-1597156537-project-admin', projectid='af4ef07c582847419a03275af50c6ffc', projectname='tempest-TestExecuteHostMaintenanceStrategy-1597156537'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "address": "fa:16:3e:5c:4d:33", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9230ef6a-67", "ovs_interfaceid": 
"9230ef6a-67da-4be6-9a8f-3fa248015ba4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759256534.7908044) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.795 2 DEBUG nova.virt.libvirt.host [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.796 2 DEBUG nova.virt.libvirt.host [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.799 2 DEBUG nova.virt.libvirt.host [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.799 2 DEBUG nova.virt.libvirt.host [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.800 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.800 2 DEBUG nova.virt.hardware [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.800 2 DEBUG nova.virt.hardware [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.801 2 DEBUG nova.virt.hardware [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.801 2 DEBUG nova.virt.hardware [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.801 2 DEBUG nova.virt.hardware [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.801 2 DEBUG nova.virt.hardware [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.801 2 DEBUG nova.virt.hardware [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.802 2 DEBUG nova.virt.hardware [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:22:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.802 2 DEBUG nova.virt.hardware [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.802 2 DEBUG nova.virt.hardware [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.802 2 DEBUG nova.virt.hardware [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:22:14 compute-1 nova_compute[238822]: 2025-09-30 18:22:14.806 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:22:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:15 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8480000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:15 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:22:15 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1168738203' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:22:15 compute-1 nova_compute[238822]: 2025-09-30 18:22:15.291 2 WARNING neutronclient.v2_0.client [req-111c6c99-16bd-4f8f-b86b-d990cb304085 req-857d6ea5-6f7f-4193-a5b0-6d1aa328cf6e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:15 compute-1 nova_compute[238822]: 2025-09-30 18:22:15.313 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:22:15 compute-1 nova_compute[238822]: 2025-09-30 18:22:15.351 2 DEBUG nova.storage.rbd_utils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image 6410d0a2-466d-41d9-a863-f756714e17c5_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:22:15 compute-1 nova_compute[238822]: 2025-09-30 18:22:15.358 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:22:15 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1168738203' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:22:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:15 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:22:15 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/777441767' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:22:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:15 compute-1 nova_compute[238822]: 2025-09-30 18:22:15.830 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:22:15 compute-1 nova_compute[238822]: 2025-09-30 18:22:15.833 2 DEBUG nova.virt.libvirt.vif [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:22:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1149902309',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1149902309',id=15,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af4ef07c582847419a03275af50c6ffc',ramdisk_id='',reservation_id='r-0z3b6qta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:22:10Z,user_data=None,user_id='57be6c3d2e0d431dae0127ac659de1e0',uuid=6410d0a2-466d-41d9-a863-f756714e17c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "address": "fa:16:3e:5c:4d:33", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9230ef6a-67", "ovs_interfaceid": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:22:15 compute-1 nova_compute[238822]: 2025-09-30 18:22:15.834 2 DEBUG nova.network.os_vif_util [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converting VIF {"id": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "address": "fa:16:3e:5c:4d:33", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9230ef6a-67", "ovs_interfaceid": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:22:15 compute-1 nova_compute[238822]: 2025-09-30 18:22:15.836 2 DEBUG nova.network.os_vif_util [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5c:4d:33,bridge_name='br-int',has_traffic_filtering=True,id=9230ef6a-67da-4be6-9a8f-3fa248015ba4,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9230ef6a-67') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:22:15 compute-1 nova_compute[238822]: 2025-09-30 18:22:15.838 2 DEBUG nova.objects.instance [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lazy-loading 'pci_devices' on Instance uuid 6410d0a2-466d-41d9-a863-f756714e17c5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:22:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:15.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:16 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8480000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.352 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:22:16 compute-1 nova_compute[238822]:   <uuid>6410d0a2-466d-41d9-a863-f756714e17c5</uuid>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   <name>instance-0000000f</name>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1149902309</nova:name>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:22:14</nova:creationTime>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:22:16 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:22:16 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:user uuid="57be6c3d2e0d431dae0127ac659de1e0">tempest-TestExecuteHostMaintenanceStrategy-1597156537-project-admin</nova:user>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:project uuid="af4ef07c582847419a03275af50c6ffc">tempest-TestExecuteHostMaintenanceStrategy-1597156537</nova:project>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <nova:port uuid="9230ef6a-67da-4be6-9a8f-3fa248015ba4">
Sep 30 18:22:16 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <system>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <entry name="serial">6410d0a2-466d-41d9-a863-f756714e17c5</entry>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <entry name="uuid">6410d0a2-466d-41d9-a863-f756714e17c5</entry>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     </system>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   <os>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   </os>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   <features>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   </features>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/6410d0a2-466d-41d9-a863-f756714e17c5_disk">
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       </source>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/6410d0a2-466d-41d9-a863-f756714e17c5_disk.config">
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       </source>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:22:16 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:5c:4d:33"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <target dev="tap9230ef6a-67"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/6410d0a2-466d-41d9-a863-f756714e17c5/console.log" append="off"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <video>
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     </video>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:22:16 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:22:16 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:22:16 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:22:16 compute-1 nova_compute[238822]: </domain>
Sep 30 18:22:16 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.353 2 DEBUG nova.compute.manager [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Preparing to wait for external event network-vif-plugged-9230ef6a-67da-4be6-9a8f-3fa248015ba4 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.354 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.354 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.354 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.354 2 DEBUG nova.virt.libvirt.vif [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:22:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1149902309',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1149902309',id=15,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af4ef07c582847419a03275af50c6ffc',ramdisk_id='',reservation_id='r-0z3b6qta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:22:10Z,user_data=None,user_id='57be6c3d2e0d431dae0127ac659de1e0',uuid=6410d0a2-466d-41d9-a863-f756714e17c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "address": "fa:16:3e:5c:4d:33", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9230ef6a-67", "ovs_interfaceid": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.355 2 DEBUG nova.network.os_vif_util [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converting VIF {"id": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "address": "fa:16:3e:5c:4d:33", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9230ef6a-67", "ovs_interfaceid": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.355 2 DEBUG nova.network.os_vif_util [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5c:4d:33,bridge_name='br-int',has_traffic_filtering=True,id=9230ef6a-67da-4be6-9a8f-3fa248015ba4,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9230ef6a-67') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.355 2 DEBUG os_vif [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:4d:33,bridge_name='br-int',has_traffic_filtering=True,id=9230ef6a-67da-4be6-9a8f-3fa248015ba4,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9230ef6a-67') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.356 2 WARNING neutronclient.v2_0.client [req-111c6c99-16bd-4f8f-b86b-d990cb304085 req-857d6ea5-6f7f-4193-a5b0-6d1aa328cf6e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.359 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.359 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.360 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '233c8dda-1da1-5972-9048-d80820619ff8', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.367 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9230ef6a-67, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.367 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap9230ef6a-67, col_values=(('qos', UUID('05bca08c-12a3-4a6f-8bb4-4050c8a44dc5')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.367 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap9230ef6a-67, col_values=(('external_ids', {'iface-id': '9230ef6a-67da-4be6-9a8f-3fa248015ba4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5c:4d:33', 'vm-uuid': '6410d0a2-466d-41d9-a863-f756714e17c5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:16 compute-1 NetworkManager[45549]: <info>  [1759256536.3699] manager: (tap9230ef6a-67): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.378 2 INFO os_vif [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:4d:33,bridge_name='br-int',has_traffic_filtering=True,id=9230ef6a-67da-4be6-9a8f-3fa248015ba4,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9230ef6a-67')
Sep 30 18:22:16 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/777441767' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:22:16 compute-1 ceph-mon[75484]: pgmap v1309: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 342 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.515 2 DEBUG nova.network.neutron [req-111c6c99-16bd-4f8f-b86b-d990cb304085 req-857d6ea5-6f7f-4193-a5b0-6d1aa328cf6e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Updated VIF entry in instance network info cache for port 9230ef6a-67da-4be6-9a8f-3fa248015ba4. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 18:22:16 compute-1 nova_compute[238822]: 2025-09-30 18:22:16.516 2 DEBUG nova.network.neutron [req-111c6c99-16bd-4f8f-b86b-d990cb304085 req-857d6ea5-6f7f-4193-a5b0-6d1aa328cf6e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Updating instance_info_cache with network_info: [{"id": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "address": "fa:16:3e:5c:4d:33", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9230ef6a-67", "ovs_interfaceid": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:22:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:16.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:17 compute-1 nova_compute[238822]: 2025-09-30 18:22:17.050 2 DEBUG oslo_concurrency.lockutils [req-111c6c99-16bd-4f8f-b86b-d990cb304085 req-857d6ea5-6f7f-4193-a5b0-6d1aa328cf6e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-6410d0a2-466d-41d9-a863-f756714e17c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:22:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:17 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8488000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:17 compute-1 nova_compute[238822]: 2025-09-30 18:22:17.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:17 compute-1 nova_compute[238822]: 2025-09-30 18:22:17.944 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:22:17 compute-1 nova_compute[238822]: 2025-09-30 18:22:17.944 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:22:17 compute-1 nova_compute[238822]: 2025-09-30 18:22:17.944 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] No VIF found with MAC fa:16:3e:5c:4d:33, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:22:17 compute-1 nova_compute[238822]: 2025-09-30 18:22:17.945 2 INFO nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Using config drive
Sep 30 18:22:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:17.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:17 compute-1 nova_compute[238822]: 2025-09-30 18:22:17.979 2 DEBUG nova.storage.rbd_utils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image 6410d0a2-466d-41d9-a863-f756714e17c5_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:22:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:18 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:18 compute-1 ceph-mon[75484]: pgmap v1310: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 342 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Sep 30 18:22:18 compute-1 nova_compute[238822]: 2025-09-30 18:22:18.506 2 WARNING neutronclient.v2_0.client [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:18.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:22:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:18 compute-1 unix_chkpwd[280383]: password check failed for user (root)
Sep 30 18:22:18 compute-1 sshd-session[280380]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201  user=root
Sep 30 18:22:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:19 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8480000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:19 compute-1 nova_compute[238822]: 2025-09-30 18:22:19.241 2 INFO nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Creating config drive at /var/lib/nova/instances/6410d0a2-466d-41d9-a863-f756714e17c5/disk.config
Sep 30 18:22:19 compute-1 nova_compute[238822]: 2025-09-30 18:22:19.254 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6410d0a2-466d-41d9-a863-f756714e17c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpkbgbjct4 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:22:19 compute-1 nova_compute[238822]: 2025-09-30 18:22:19.403 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6410d0a2-466d-41d9-a863-f756714e17c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpkbgbjct4" returned: 0 in 0.150s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:22:19 compute-1 openstack_network_exporter[251957]: ERROR   18:22:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:22:19 compute-1 openstack_network_exporter[251957]: ERROR   18:22:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:22:19 compute-1 openstack_network_exporter[251957]: ERROR   18:22:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:22:19 compute-1 openstack_network_exporter[251957]: ERROR   18:22:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:22:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:22:19 compute-1 openstack_network_exporter[251957]: ERROR   18:22:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:22:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:22:19 compute-1 nova_compute[238822]: 2025-09-30 18:22:19.468 2 DEBUG nova.storage.rbd_utils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] rbd image 6410d0a2-466d-41d9-a863-f756714e17c5_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:22:19 compute-1 nova_compute[238822]: 2025-09-30 18:22:19.475 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6410d0a2-466d-41d9-a863-f756714e17c5/disk.config 6410d0a2-466d-41d9-a863-f756714e17c5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:22:19 compute-1 nova_compute[238822]: 2025-09-30 18:22:19.707 2 DEBUG oslo_concurrency.processutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6410d0a2-466d-41d9-a863-f756714e17c5/disk.config 6410d0a2-466d-41d9-a863-f756714e17c5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.233s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:22:19 compute-1 nova_compute[238822]: 2025-09-30 18:22:19.708 2 INFO nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Deleting local config drive /var/lib/nova/instances/6410d0a2-466d-41d9-a863-f756714e17c5/disk.config because it was imported into RBD.
Sep 30 18:22:19 compute-1 kernel: tap9230ef6a-67: entered promiscuous mode
Sep 30 18:22:19 compute-1 NetworkManager[45549]: <info>  [1759256539.7993] manager: (tap9230ef6a-67): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Sep 30 18:22:19 compute-1 nova_compute[238822]: 2025-09-30 18:22:19.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:19 compute-1 ovn_controller[135204]: 2025-09-30T18:22:19Z|00117|binding|INFO|Claiming lport 9230ef6a-67da-4be6-9a8f-3fa248015ba4 for this chassis.
Sep 30 18:22:19 compute-1 ovn_controller[135204]: 2025-09-30T18:22:19Z|00118|binding|INFO|9230ef6a-67da-4be6-9a8f-3fa248015ba4: Claiming fa:16:3e:5c:4d:33 10.100.0.13
Sep 30 18:22:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:19 compute-1 ovn_controller[135204]: 2025-09-30T18:22:19Z|00119|binding|INFO|Setting lport 9230ef6a-67da-4be6-9a8f-3fa248015ba4 ovn-installed in OVS
Sep 30 18:22:19 compute-1 nova_compute[238822]: 2025-09-30 18:22:19.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:19 compute-1 nova_compute[238822]: 2025-09-30 18:22:19.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:19 compute-1 systemd-udevd[280437]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:22:19 compute-1 systemd-machined[195911]: New machine qemu-10-instance-0000000f.
Sep 30 18:22:19 compute-1 NetworkManager[45549]: <info>  [1759256539.8634] device (tap9230ef6a-67): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:22:19 compute-1 NetworkManager[45549]: <info>  [1759256539.8645] device (tap9230ef6a-67): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:22:19 compute-1 systemd[1]: Started Virtual Machine qemu-10-instance-0000000f.
Sep 30 18:22:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:19.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:20 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:20 compute-1 ovn_controller[135204]: 2025-09-30T18:22:20Z|00120|binding|INFO|Setting lport 9230ef6a-67da-4be6-9a8f-3fa248015ba4 up in Southbound
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.390 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5c:4d:33 10.100.0.13'], port_security=['fa:16:3e:5c:4d:33 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6410d0a2-466d-41d9-a863-f756714e17c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-443be7ca-f628-4a45-95b6-620d37172d7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af4ef07c582847419a03275af50c6ffc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '518a9c00-28f9-47ab-a122-e672192eedea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=96eb21b8-879c-4e72-963b-37e37ae3d0c5, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=9230ef6a-67da-4be6-9a8f-3fa248015ba4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.391 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 9230ef6a-67da-4be6-9a8f-3fa248015ba4 in datapath 443be7ca-f628-4a45-95b6-620d37172d7b bound to our chassis
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.394 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 443be7ca-f628-4a45-95b6-620d37172d7b
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.416 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e7cac848-3513-410f-a848-e93a24cfcc67]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.417 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap443be7ca-f1 in ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.419 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap443be7ca-f0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.420 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7983c5eb-8826-4890-92aa-c615561a0b26]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.422 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a7f55f-cc20-4179-a41c-478ffdf5a030]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.442 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[8baa564b-8290-4f1a-a265-068394380825]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ceph-mon[75484]: pgmap v1311: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 342 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.451 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee8f5ca-6842-4558-88ad-2bd8bb332c56]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 sshd-session[280380]: Failed password for root from 8.243.64.201 port 37204 ssh2
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.506 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[714f42e9-0ef7-45fc-abe3-3565513a61d4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 systemd-udevd[280440]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:22:20 compute-1 NetworkManager[45549]: <info>  [1759256540.5184] manager: (tap443be7ca-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.518 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ff1bc1-413f-4d5e-b776-b5b5f4c0cd39]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.571 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[575e2266-a103-4385-ada4-aaa64d51d4f1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:20.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.575 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf105be-d359-43c4-a694-636d1dd1b572]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 NetworkManager[45549]: <info>  [1759256540.6079] device (tap443be7ca-f0): carrier: link connected
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.620 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[747cc48f-92ea-477c-add6-5ac31fbf0eef]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.645 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[127644b4-2ffb-42ce-b561-b2e3d79c1ff1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap443be7ca-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:7f:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1438218, 'reachable_time': 19300, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280515, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.661 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[08a207cc-f71a-4162-a24d-edbabf6f0941]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:7f4d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1438218, 'tstamp': 1438218}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280516, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.681 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3420f8e6-8e99-4036-a911-a3b9777a8394]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap443be7ca-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:7f:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1438218, 'reachable_time': 19300, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 280517, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.718 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ade5d16d-bf12-44e2-92a3-e56e3477340f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.809 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[fcb9a214-3af4-4928-8af0-7586ca8e2f79]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.810 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap443be7ca-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.811 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.811 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap443be7ca-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:20 compute-1 nova_compute[238822]: 2025-09-30 18:22:20.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:20 compute-1 kernel: tap443be7ca-f0: entered promiscuous mode
Sep 30 18:22:20 compute-1 NetworkManager[45549]: <info>  [1759256540.8154] manager: (tap443be7ca-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Sep 30 18:22:20 compute-1 nova_compute[238822]: 2025-09-30 18:22:20.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.817 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap443be7ca-f0, col_values=(('external_ids', {'iface-id': '031d2cff-b142-4423-ba99-772183b7a667'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:20 compute-1 nova_compute[238822]: 2025-09-30 18:22:20.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:20 compute-1 ovn_controller[135204]: 2025-09-30T18:22:20Z|00121|binding|INFO|Releasing lport 031d2cff-b142-4423-ba99-772183b7a667 from this chassis (sb_readonly=0)
Sep 30 18:22:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:20 compute-1 nova_compute[238822]: 2025-09-30 18:22:20.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.848 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2e88f4e3-f7b6-436f-b8f6-16a4ec5892f5]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.850 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.850 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.850 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 443be7ca-f628-4a45-95b6-620d37172d7b disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.850 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.851 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb901d1-231d-4ecf-90a9-73eeb75ac7d0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.851 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.852 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8e30a2ba-ab23-47fb-9315-a7061288fed7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.852 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-443be7ca-f628-4a45-95b6-620d37172d7b
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID 443be7ca-f628-4a45-95b6-620d37172d7b
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:22:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:20.853 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'env', 'PROCESS_TAG=haproxy-443be7ca-f628-4a45-95b6-620d37172d7b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/443be7ca-f628-4a45-95b6-620d37172d7b.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:22:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:21 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8488001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.280 2 DEBUG nova.compute.manager [req-94bdd6ca-2e72-401c-ac40-98e2142762ef req-f5db4bb2-4871-4ea3-a884-195d6e0b685b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Received event network-vif-plugged-9230ef6a-67da-4be6-9a8f-3fa248015ba4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.281 2 DEBUG oslo_concurrency.lockutils [req-94bdd6ca-2e72-401c-ac40-98e2142762ef req-f5db4bb2-4871-4ea3-a884-195d6e0b685b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.281 2 DEBUG oslo_concurrency.lockutils [req-94bdd6ca-2e72-401c-ac40-98e2142762ef req-f5db4bb2-4871-4ea3-a884-195d6e0b685b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.281 2 DEBUG oslo_concurrency.lockutils [req-94bdd6ca-2e72-401c-ac40-98e2142762ef req-f5db4bb2-4871-4ea3-a884-195d6e0b685b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.282 2 DEBUG nova.compute.manager [req-94bdd6ca-2e72-401c-ac40-98e2142762ef req-f5db4bb2-4871-4ea3-a884-195d6e0b685b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Processing event network-vif-plugged-9230ef6a-67da-4be6-9a8f-3fa248015ba4 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.283 2 DEBUG nova.compute.manager [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.287 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.293 2 INFO nova.virt.libvirt.driver [-] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Instance spawned successfully.
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.294 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:22:21 compute-1 podman[280551]: 2025-09-30 18:22:21.350255674 +0000 UTC m=+0.080340441 container create 12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 18:22:21 compute-1 podman[280551]: 2025-09-30 18:22:21.300700506 +0000 UTC m=+0.030785283 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:21 compute-1 systemd[1]: Started libpod-conmon-12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236.scope.
Sep 30 18:22:21 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:22:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bfb28e0460be9d4a72e54976bda0b56781c84f88c96c65749caf3162a95ce07/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:22:21 compute-1 podman[280551]: 2025-09-30 18:22:21.492117256 +0000 UTC m=+0.222202093 container init 12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 18:22:21 compute-1 podman[280551]: 2025-09-30 18:22:21.500237435 +0000 UTC m=+0.230322202 container start 12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 18:22:21 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[280566]: [NOTICE]   (280570) : New worker (280574) forked
Sep 30 18:22:21 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[280566]: [NOTICE]   (280570) : Loading success.
Sep 30 18:22:21 compute-1 unix_chkpwd[280583]: password check failed for user (root)
Sep 30 18:22:21 compute-1 sshd-session[280406]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:22:21 compute-1 sshd-session[280380]: Received disconnect from 8.243.64.201 port 37204:11: Bye Bye [preauth]
Sep 30 18:22:21 compute-1 sshd-session[280380]: Disconnected from authenticating user root 8.243.64.201 port 37204 [preauth]
Sep 30 18:22:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.816 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.818 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.820 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.821 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.822 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:22:21 compute-1 nova_compute[238822]: 2025-09-30 18:22:21.823 2 DEBUG nova.virt.libvirt.driver [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:22:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:21.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:22 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:22 compute-1 nova_compute[238822]: 2025-09-30 18:22:22.335 2 INFO nova.compute.manager [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Took 11.23 seconds to spawn the instance on the hypervisor.
Sep 30 18:22:22 compute-1 nova_compute[238822]: 2025-09-30 18:22:22.336 2 DEBUG nova.compute.manager [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:22:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:22:22 compute-1 sshd-session[280572]: Invalid user graylog from 194.107.115.65 port 49956
Sep 30 18:22:22 compute-1 sshd-session[280572]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:22:22 compute-1 sshd-session[280572]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65
Sep 30 18:22:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:22:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:22.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:22:22 compute-1 nova_compute[238822]: 2025-09-30 18:22:22.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:22 compute-1 nova_compute[238822]: 2025-09-30 18:22:22.875 2 INFO nova.compute.manager [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Took 16.93 seconds to build instance.
Sep 30 18:22:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:23 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8480001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:23 compute-1 nova_compute[238822]: 2025-09-30 18:22:23.341 2 DEBUG nova.compute.manager [req-5aa250bf-77e1-41a6-9e63-a26f307b0e0f req-98c9016a-952e-4ff6-9743-5652032d3c4e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Received event network-vif-plugged-9230ef6a-67da-4be6-9a8f-3fa248015ba4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:22:23 compute-1 nova_compute[238822]: 2025-09-30 18:22:23.342 2 DEBUG oslo_concurrency.lockutils [req-5aa250bf-77e1-41a6-9e63-a26f307b0e0f req-98c9016a-952e-4ff6-9743-5652032d3c4e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:22:23 compute-1 nova_compute[238822]: 2025-09-30 18:22:23.343 2 DEBUG oslo_concurrency.lockutils [req-5aa250bf-77e1-41a6-9e63-a26f307b0e0f req-98c9016a-952e-4ff6-9743-5652032d3c4e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:22:23 compute-1 nova_compute[238822]: 2025-09-30 18:22:23.343 2 DEBUG oslo_concurrency.lockutils [req-5aa250bf-77e1-41a6-9e63-a26f307b0e0f req-98c9016a-952e-4ff6-9743-5652032d3c4e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:22:23 compute-1 nova_compute[238822]: 2025-09-30 18:22:23.344 2 DEBUG nova.compute.manager [req-5aa250bf-77e1-41a6-9e63-a26f307b0e0f req-98c9016a-952e-4ff6-9743-5652032d3c4e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] No waiting events found dispatching network-vif-plugged-9230ef6a-67da-4be6-9a8f-3fa248015ba4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:22:23 compute-1 nova_compute[238822]: 2025-09-30 18:22:23.344 2 WARNING nova.compute.manager [req-5aa250bf-77e1-41a6-9e63-a26f307b0e0f req-98c9016a-952e-4ff6-9743-5652032d3c4e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Received unexpected event network-vif-plugged-9230ef6a-67da-4be6-9a8f-3fa248015ba4 for instance with vm_state active and task_state None.
Sep 30 18:22:23 compute-1 ceph-mon[75484]: pgmap v1312: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 342 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:22:23 compute-1 nova_compute[238822]: 2025-09-30 18:22:23.381 2 DEBUG oslo_concurrency.lockutils [None req-700c8a32-38d4-4c70-9119-e3ac8e3d36fd 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.459s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:22:23 compute-1 sshd-session[280406]: Failed password for root from 192.210.160.141 port 44488 ssh2
Sep 30 18:22:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:22:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:22:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:23.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:22:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:24 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8480001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:24 compute-1 ceph-mon[75484]: pgmap v1313: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 165 op/s
Sep 30 18:22:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:24.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:24 compute-1 sshd-session[280406]: Connection closed by authenticating user root 192.210.160.141 port 44488 [preauth]
Sep 30 18:22:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:25 compute-1 sshd-session[280572]: Failed password for invalid user graylog from 194.107.115.65 port 49956 ssh2
Sep 30 18:22:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:25 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8488001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:25.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:26 compute-1 nova_compute[238822]: 2025-09-30 18:22:26.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:22:26 compute-1 nova_compute[238822]: 2025-09-30 18:22:26.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:22:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:26 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:26 compute-1 ceph-mon[75484]: pgmap v1314: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 26 KiB/s wr, 74 op/s
Sep 30 18:22:26 compute-1 nova_compute[238822]: 2025-09-30 18:22:26.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:26.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:26 compute-1 sshd-session[280572]: Received disconnect from 194.107.115.65 port 49956:11: Bye Bye [preauth]
Sep 30 18:22:26 compute-1 sshd-session[280572]: Disconnected from invalid user graylog 194.107.115.65 port 49956 [preauth]
Sep 30 18:22:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:27 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8480001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:27 compute-1 nova_compute[238822]: 2025-09-30 18:22:27.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:27 compute-1 sshd-session[280590]: Invalid user laravel from 167.172.43.167 port 50936
Sep 30 18:22:27 compute-1 sshd-session[280590]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:22:27 compute-1 sshd-session[280590]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167
Sep 30 18:22:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:27.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:28 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8480001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:28 compute-1 ceph-mon[75484]: pgmap v1315: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 26 KiB/s wr, 74 op/s
Sep 30 18:22:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:28.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:22:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:29 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8488001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:29 compute-1 podman[280595]: 2025-09-30 18:22:29.535808334 +0000 UTC m=+0.070497335 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:22:29 compute-1 podman[280594]: 2025-09-30 18:22:29.578289601 +0000 UTC m=+0.110237368 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 18:22:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:29.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:30 compute-1 sshd-session[280590]: Failed password for invalid user laravel from 167.172.43.167 port 50936 ssh2
Sep 30 18:22:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:30 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:30 compute-1 sshd-session[280590]: Received disconnect from 167.172.43.167 port 50936:11: Bye Bye [preauth]
Sep 30 18:22:30 compute-1 sshd-session[280590]: Disconnected from invalid user laravel 167.172.43.167 port 50936 [preauth]
Sep 30 18:22:30 compute-1 ceph-mon[75484]: pgmap v1316: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 26 KiB/s wr, 74 op/s
Sep 30 18:22:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:30.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:31 compute-1 nova_compute[238822]: 2025-09-30 18:22:31.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:22:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:31 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8480001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:31 compute-1 nova_compute[238822]: 2025-09-30 18:22:31.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:31.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:32 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8480001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:32 compute-1 ceph-mon[75484]: pgmap v1317: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:22:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:32.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:32 compute-1 nova_compute[238822]: 2025-09-30 18:22:32.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:33 compute-1 sudo[280646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:22:33 compute-1 sudo[280646]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:22:33 compute-1 sudo[280646]: pam_unix(sudo:session): session closed for user root
Sep 30 18:22:33 compute-1 nova_compute[238822]: 2025-09-30 18:22:33.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:22:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:33 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8488002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:22:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:33.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:34 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:34 compute-1 ovn_controller[135204]: 2025-09-30T18:22:34Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5c:4d:33 10.100.0.13
Sep 30 18:22:34 compute-1 ovn_controller[135204]: 2025-09-30T18:22:34Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5c:4d:33 10.100.0.13
Sep 30 18:22:34 compute-1 ceph-mon[75484]: pgmap v1318: 353 pgs: 353 active+clean; 188 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 117 op/s
Sep 30 18:22:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:34.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:35 compute-1 nova_compute[238822]: 2025-09-30 18:22:35.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:22:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:35 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:35 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3235448362' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:22:35 compute-1 nova_compute[238822]: 2025-09-30 18:22:35.578 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:22:35 compute-1 nova_compute[238822]: 2025-09-30 18:22:35.578 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:22:35 compute-1 nova_compute[238822]: 2025-09-30 18:22:35.579 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:22:35 compute-1 nova_compute[238822]: 2025-09-30 18:22:35.579 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:22:35 compute-1 nova_compute[238822]: 2025-09-30 18:22:35.580 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:22:35 compute-1 podman[249638]: time="2025-09-30T18:22:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:22:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:22:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39667 "" "Go-http-client/1.1"
Sep 30 18:22:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:22:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9289 "" "Go-http-client/1.1"
Sep 30 18:22:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:35.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:22:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3008231978' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:22:36 compute-1 nova_compute[238822]: 2025-09-30 18:22:36.143 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.563s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:22:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:36 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:36 compute-1 nova_compute[238822]: 2025-09-30 18:22:36.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3008231978' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:22:36 compute-1 ceph-mon[75484]: pgmap v1319: 353 pgs: 353 active+clean; 188 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 233 KiB/s rd, 2.0 MiB/s wr, 43 op/s
Sep 30 18:22:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Sep 30 18:22:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2408598724' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:22:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Sep 30 18:22:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2408598724' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:22:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:36.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:37 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8488002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:37 compute-1 nova_compute[238822]: 2025-09-30 18:22:37.207 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:22:37 compute-1 nova_compute[238822]: 2025-09-30 18:22:37.208 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:22:37 compute-1 nova_compute[238822]: 2025-09-30 18:22:37.419 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:22:37 compute-1 nova_compute[238822]: 2025-09-30 18:22:37.420 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:22:37 compute-1 nova_compute[238822]: 2025-09-30 18:22:37.447 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:22:37 compute-1 nova_compute[238822]: 2025-09-30 18:22:37.448 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4524MB free_disk=39.902034759521484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:22:37 compute-1 nova_compute[238822]: 2025-09-30 18:22:37.449 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:22:37 compute-1 nova_compute[238822]: 2025-09-30 18:22:37.449 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:22:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2408598724' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:22:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2408598724' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:22:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:22:37 compute-1 podman[280699]: 2025-09-30 18:22:37.535091614 +0000 UTC m=+0.079422706 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 18:22:37 compute-1 nova_compute[238822]: 2025-09-30 18:22:37.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:37.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:38 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c0040f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:38 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2919677126' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:22:38 compute-1 ceph-mon[75484]: pgmap v1320: 353 pgs: 353 active+clean; 188 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 233 KiB/s rd, 2.0 MiB/s wr, 43 op/s
Sep 30 18:22:38 compute-1 nova_compute[238822]: 2025-09-30 18:22:38.513 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 6410d0a2-466d-41d9-a863-f756714e17c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:22:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:38.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:22:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:39 compute-1 nova_compute[238822]: 2025-09-30 18:22:39.021 2 WARNING nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 71a2a65c-86a0-4257-9bd1-1cd4e706fb69 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 18:22:39 compute-1 nova_compute[238822]: 2025-09-30 18:22:39.022 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:22:39 compute-1 nova_compute[238822]: 2025-09-30 18:22:39.022 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:22:37 up  3:59,  0 user,  load average: 0.72, 0.60, 0.90\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_af4ef07c582847419a03275af50c6ffc': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:22:39 compute-1 nova_compute[238822]: 2025-09-30 18:22:39.090 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:22:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:39 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8480001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:22:39 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1698415441' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:22:39 compute-1 nova_compute[238822]: 2025-09-30 18:22:39.617 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:22:39 compute-1 nova_compute[238822]: 2025-09-30 18:22:39.626 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:22:39 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1698415441' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:22:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:39.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:39 compute-1 nova_compute[238822]: 2025-09-30 18:22:39.983 2 DEBUG nova.virt.libvirt.driver [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Creating tmpfile /var/lib/nova/instances/tmp8q7mx0yg to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:22:39 compute-1 nova_compute[238822]: 2025-09-30 18:22:39.985 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:39 compute-1 nova_compute[238822]: 2025-09-30 18:22:39.991 2 DEBUG nova.compute.manager [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8q7mx0yg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:22:40 compute-1 nova_compute[238822]: 2025-09-30 18:22:40.137 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:22:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:40 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:40.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:40 compute-1 nova_compute[238822]: 2025-09-30 18:22:40.652 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:22:40 compute-1 nova_compute[238822]: 2025-09-30 18:22:40.652 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.203s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:22:40 compute-1 ceph-mon[75484]: pgmap v1321: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:22:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:41 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8488002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:41 compute-1 nova_compute[238822]: 2025-09-30 18:22:41.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:41.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:42 compute-1 nova_compute[238822]: 2025-09-30 18:22:42.025 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:42 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c0040f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:42 compute-1 ceph-mon[75484]: pgmap v1322: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:22:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:42.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:42 compute-1 nova_compute[238822]: 2025-09-30 18:22:42.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:43 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:43 compute-1 podman[280753]: 2025-09-30 18:22:43.564072977 +0000 UTC m=+0.083292531 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd)
Sep 30 18:22:43 compute-1 sshd-session[280747]: Invalid user 5 from 175.126.165.170 port 60054
Sep 30 18:22:43 compute-1 sshd-session[280747]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:22:43 compute-1 sshd-session[280747]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 18:22:43 compute-1 podman[280751]: 2025-09-30 18:22:43.578665531 +0000 UTC m=+0.108951944 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 18:22:43 compute-1 podman[280752]: 2025-09-30 18:22:43.596572445 +0000 UTC m=+0.120082425 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Sep 30 18:22:43 compute-1 nova_compute[238822]: 2025-09-30 18:22:43.648 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:22:43 compute-1 nova_compute[238822]: 2025-09-30 18:22:43.649 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:22:43 compute-1 nova_compute[238822]: 2025-09-30 18:22:43.650 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:22:43 compute-1 nova_compute[238822]: 2025-09-30 18:22:43.650 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:22:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:22:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:43.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:44 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:44 compute-1 ceph-mon[75484]: pgmap v1323: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Sep 30 18:22:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:44.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:45 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac002170 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:45 compute-1 sshd-session[280747]: Failed password for invalid user 5 from 175.126.165.170 port 60054 ssh2
Sep 30 18:22:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:45.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:45 compute-1 nova_compute[238822]: 2025-09-30 18:22:45.990 2 DEBUG nova.compute.manager [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8q7mx0yg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='71a2a65c-86a0-4257-9bd1-1cd4e706fb69',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:22:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:46 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:46 compute-1 ceph-mon[75484]: pgmap v1324: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 94 KiB/s rd, 106 KiB/s wr, 21 op/s
Sep 30 18:22:46 compute-1 nova_compute[238822]: 2025-09-30 18:22:46.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:46 compute-1 sshd-session[280747]: Received disconnect from 175.126.165.170 port 60054:11: Bye Bye [preauth]
Sep 30 18:22:46 compute-1 sshd-session[280747]: Disconnected from invalid user 5 175.126.165.170 port 60054 [preauth]
Sep 30 18:22:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:46.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:47 compute-1 nova_compute[238822]: 2025-09-30 18:22:47.009 2 DEBUG oslo_concurrency.lockutils [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-71a2a65c-86a0-4257-9bd1-1cd4e706fb69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:22:47 compute-1 nova_compute[238822]: 2025-09-30 18:22:47.010 2 DEBUG oslo_concurrency.lockutils [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-71a2a65c-86a0-4257-9bd1-1cd4e706fb69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:22:47 compute-1 nova_compute[238822]: 2025-09-30 18:22:47.010 2 DEBUG nova.network.neutron [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:22:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:47 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:47 compute-1 unix_chkpwd[280822]: password check failed for user (root)
Sep 30 18:22:47 compute-1 sshd-session[280818]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161  user=root
Sep 30 18:22:47 compute-1 nova_compute[238822]: 2025-09-30 18:22:47.519 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:47 compute-1 nova_compute[238822]: 2025-09-30 18:22:47.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:47 compute-1 unix_chkpwd[280823]: password check failed for user (root)
Sep 30 18:22:47 compute-1 sshd-session[280816]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:22:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:47.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:48 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:48 compute-1 ceph-mon[75484]: pgmap v1325: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 94 KiB/s rd, 106 KiB/s wr, 21 op/s
Sep 30 18:22:48 compute-1 nova_compute[238822]: 2025-09-30 18:22:48.448 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:48.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:48 compute-1 nova_compute[238822]: 2025-09-30 18:22:48.651 2 DEBUG nova.network.neutron [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Updating instance_info_cache with network_info: [{"id": "c08e30da-2028-4b45-9b18-b77d81894e93", "address": "fa:16:3e:ab:60:7a", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc08e30da-20", "ovs_interfaceid": "c08e30da-2028-4b45-9b18-b77d81894e93", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:22:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:22:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.157 2 DEBUG oslo_concurrency.lockutils [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-71a2a65c-86a0-4257-9bd1-1cd4e706fb69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.175 2 DEBUG nova.virt.libvirt.driver [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8q7mx0yg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='71a2a65c-86a0-4257-9bd1-1cd4e706fb69',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.175 2 DEBUG nova.virt.libvirt.driver [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Creating instance directory: /var/lib/nova/instances/71a2a65c-86a0-4257-9bd1-1cd4e706fb69 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.176 2 DEBUG nova.virt.libvirt.driver [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Ensure instance console log exists: /var/lib/nova/instances/71a2a65c-86a0-4257-9bd1-1cd4e706fb69/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.176 2 DEBUG nova.virt.libvirt.driver [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.177 2 DEBUG nova.virt.libvirt.vif [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:21:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1057470913',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1057470913',id=14,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:21:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='af4ef07c582847419a03275af50c6ffc',ramdisk_id='',reservation_id='r-1zyfmf1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:21:59Z,user_data=None,user_id='57be6c3d2e0d431dae0127ac659de1e0',uuid=71a2a65c-86a0-4257-9bd1-1cd4e706fb69,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c08e30da-2028-4b45-9b18-b77d81894e93", "address": "fa:16:3e:ab:60:7a", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc08e30da-20", "ovs_interfaceid": "c08e30da-2028-4b45-9b18-b77d81894e93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.177 2 DEBUG nova.network.os_vif_util [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "c08e30da-2028-4b45-9b18-b77d81894e93", "address": "fa:16:3e:ab:60:7a", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc08e30da-20", "ovs_interfaceid": "c08e30da-2028-4b45-9b18-b77d81894e93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.177 2 DEBUG nova.network.os_vif_util [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:60:7a,bridge_name='br-int',has_traffic_filtering=True,id=c08e30da-2028-4b45-9b18-b77d81894e93,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc08e30da-20') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.178 2 DEBUG os_vif [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:60:7a,bridge_name='br-int',has_traffic_filtering=True,id=c08e30da-2028-4b45-9b18-b77d81894e93,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc08e30da-20') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.178 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.179 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.181 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a8d03729-f345-5c8b-be8a-7b89f1b0e918', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.190 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc08e30da-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.191 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc08e30da-20, col_values=(('qos', UUID('0eb5c921-32bb-46bd-accd-78457c62bcc9')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.192 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc08e30da-20, col_values=(('external_ids', {'iface-id': 'c08e30da-2028-4b45-9b18-b77d81894e93', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:60:7a', 'vm-uuid': '71a2a65c-86a0-4257-9bd1-1cd4e706fb69'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:49 compute-1 NetworkManager[45549]: <info>  [1759256569.1951] manager: (tapc08e30da-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.206 2 INFO os_vif [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:60:7a,bridge_name='br-int',has_traffic_filtering=True,id=c08e30da-2028-4b45-9b18-b77d81894e93,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc08e30da-20')
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.207 2 DEBUG nova.virt.libvirt.driver [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.207 2 DEBUG nova.compute.manager [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8q7mx0yg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='71a2a65c-86a0-4257-9bd1-1cd4e706fb69',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.209 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:49 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac003020 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:49 compute-1 openstack_network_exporter[251957]: ERROR   18:22:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:22:49 compute-1 openstack_network_exporter[251957]: ERROR   18:22:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:22:49 compute-1 openstack_network_exporter[251957]: ERROR   18:22:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:22:49 compute-1 openstack_network_exporter[251957]: ERROR   18:22:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:22:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:22:49 compute-1 openstack_network_exporter[251957]: ERROR   18:22:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:22:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:22:49 compute-1 nova_compute[238822]: 2025-09-30 18:22:49.446 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:49 compute-1 sshd-session[280818]: Failed password for root from 216.10.242.161 port 38222 ssh2
Sep 30 18:22:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:49.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:50 compute-1 sshd-session[280816]: Failed password for root from 192.210.160.141 port 41260 ssh2
Sep 30 18:22:50 compute-1 nova_compute[238822]: 2025-09-30 18:22:50.125 2 DEBUG nova.network.neutron [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Port c08e30da-2028-4b45-9b18-b77d81894e93 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:22:50 compute-1 nova_compute[238822]: 2025-09-30 18:22:50.146 2 DEBUG nova.compute.manager [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8q7mx0yg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='71a2a65c-86a0-4257-9bd1-1cd4e706fb69',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:22:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:50 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:50 compute-1 sshd-session[280818]: Received disconnect from 216.10.242.161 port 38222:11: Bye Bye [preauth]
Sep 30 18:22:50 compute-1 sshd-session[280818]: Disconnected from authenticating user root 216.10.242.161 port 38222 [preauth]
Sep 30 18:22:50 compute-1 ceph-mon[75484]: pgmap v1326: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 94 KiB/s rd, 107 KiB/s wr, 21 op/s
Sep 30 18:22:50 compute-1 ovn_controller[135204]: 2025-09-30T18:22:50Z|00122|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 18:22:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:50.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:50 compute-1 sshd-session[280816]: Connection closed by authenticating user root 192.210.160.141 port 41260 [preauth]
Sep 30 18:22:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:51 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:51.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:52 compute-1 sshd-session[280830]: Invalid user seekcy from 14.225.167.110 port 51710
Sep 30 18:22:52 compute-1 sshd-session[280830]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:22:52 compute-1 sshd-session[280830]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:22:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:52 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:22:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:52.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:52 compute-1 nova_compute[238822]: 2025-09-30 18:22:52.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:53 compute-1 kernel: tapc08e30da-20: entered promiscuous mode
Sep 30 18:22:53 compute-1 nova_compute[238822]: 2025-09-30 18:22:53.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:53 compute-1 NetworkManager[45549]: <info>  [1759256573.0950] manager: (tapc08e30da-20): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Sep 30 18:22:53 compute-1 ovn_controller[135204]: 2025-09-30T18:22:53Z|00123|binding|INFO|Claiming lport c08e30da-2028-4b45-9b18-b77d81894e93 for this additional chassis.
Sep 30 18:22:53 compute-1 ovn_controller[135204]: 2025-09-30T18:22:53Z|00124|binding|INFO|c08e30da-2028-4b45-9b18-b77d81894e93: Claiming fa:16:3e:ab:60:7a 10.100.0.12
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.103 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:60:7a 10.100.0.12'], port_security=['fa:16:3e:ab:60:7a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '71a2a65c-86a0-4257-9bd1-1cd4e706fb69', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-443be7ca-f628-4a45-95b6-620d37172d7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af4ef07c582847419a03275af50c6ffc', 'neutron:revision_number': '10', 'neutron:security_group_ids': '518a9c00-28f9-47ab-a122-e672192eedea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=96eb21b8-879c-4e72-963b-37e37ae3d0c5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=c08e30da-2028-4b45-9b18-b77d81894e93) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.104 144543 INFO neutron.agent.ovn.metadata.agent [-] Port c08e30da-2028-4b45-9b18-b77d81894e93 in datapath 443be7ca-f628-4a45-95b6-620d37172d7b unbound from our chassis
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.105 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 443be7ca-f628-4a45-95b6-620d37172d7b
Sep 30 18:22:53 compute-1 ovn_controller[135204]: 2025-09-30T18:22:53Z|00125|binding|INFO|Setting lport c08e30da-2028-4b45-9b18-b77d81894e93 ovn-installed in OVS
Sep 30 18:22:53 compute-1 nova_compute[238822]: 2025-09-30 18:22:53.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.129 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[68db1c8a-969b-4eb7-9ded-f9cf6f0be88f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:53 compute-1 sudo[280837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:22:53 compute-1 sudo[280837]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:22:53 compute-1 sudo[280837]: pam_unix(sudo:session): session closed for user root
Sep 30 18:22:53 compute-1 systemd-udevd[280875]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:22:53 compute-1 systemd-machined[195911]: New machine qemu-11-instance-0000000e.
Sep 30 18:22:53 compute-1 systemd[1]: Started Virtual Machine qemu-11-instance-0000000e.
Sep 30 18:22:53 compute-1 NetworkManager[45549]: <info>  [1759256573.1809] device (tapc08e30da-20): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.179 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[396e5db3-7aee-42e5-8fc3-9be2add88a3a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:53 compute-1 NetworkManager[45549]: <info>  [1759256573.1824] device (tapc08e30da-20): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.186 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[57a6bf39-6f67-4563-a6ed-2a4edc53eb9f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:53 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac003020 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.227 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[65cce8d4-1786-4145-98d6-1f6144a14f3e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.248 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[41ccff4c-25ea-4445-a85a-96632cbb4ff6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap443be7ca-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:7f:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1438218, 'reachable_time': 19300, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280885, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.270 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2e37a9d3-cd51-4621-a2a9-8f90f95c99d3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap443be7ca-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1438233, 'tstamp': 1438233}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280887, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap443be7ca-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1438237, 'tstamp': 1438237}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280887, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.272 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap443be7ca-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:53 compute-1 nova_compute[238822]: 2025-09-30 18:22:53.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:53 compute-1 nova_compute[238822]: 2025-09-30 18:22:53.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.275 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap443be7ca-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.276 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.276 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap443be7ca-f0, col_values=(('external_ids', {'iface-id': '031d2cff-b142-4423-ba99-772183b7a667'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.276 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:22:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:53.278 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b04a3ced-ee31-4fd6-aa56-d00054bd3c95]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-443be7ca-f628-4a45-95b6-620d37172d7b\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 443be7ca-f628-4a45-95b6-620d37172d7b\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:22:53 compute-1 ceph-mon[75484]: pgmap v1327: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 16 KiB/s wr, 1 op/s
Sep 30 18:22:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:22:53 compute-1 sshd-session[280830]: Failed password for invalid user seekcy from 14.225.167.110 port 51710 ssh2
Sep 30 18:22:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:53.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:54 compute-1 sudo[280930]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:22:54 compute-1 sudo[280930]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:22:54 compute-1 sudo[280930]: pam_unix(sudo:session): session closed for user root
Sep 30 18:22:54 compute-1 sudo[280956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:22:54 compute-1 sudo[280956]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:22:54 compute-1 nova_compute[238822]: 2025-09-30 18:22:54.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:54 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:54 compute-1 sshd-session[280830]: Received disconnect from 14.225.167.110 port 51710:11: Bye Bye [preauth]
Sep 30 18:22:54 compute-1 sshd-session[280830]: Disconnected from invalid user seekcy 14.225.167.110 port 51710 [preauth]
Sep 30 18:22:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:54.376 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:22:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:54.377 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:22:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:22:54.377 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:22:54 compute-1 ceph-mon[75484]: pgmap v1328: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 17 KiB/s wr, 1 op/s
Sep 30 18:22:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:54.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:54 compute-1 sudo[280956]: pam_unix(sudo:session): session closed for user root
Sep 30 18:22:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:55 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:55.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:56 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:56 compute-1 ceph-mon[75484]: pgmap v1329: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 2.0 KiB/s wr, 0 op/s
Sep 30 18:22:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:56.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:57 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac003020 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:57 compute-1 ovn_controller[135204]: 2025-09-30T18:22:57Z|00126|binding|INFO|Claiming lport c08e30da-2028-4b45-9b18-b77d81894e93 for this chassis.
Sep 30 18:22:57 compute-1 ovn_controller[135204]: 2025-09-30T18:22:57Z|00127|binding|INFO|c08e30da-2028-4b45-9b18-b77d81894e93: Claiming fa:16:3e:ab:60:7a 10.100.0.12
Sep 30 18:22:57 compute-1 ovn_controller[135204]: 2025-09-30T18:22:57Z|00128|binding|INFO|Setting lport c08e30da-2028-4b45-9b18-b77d81894e93 up in Southbound
Sep 30 18:22:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3834446620' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:22:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3834446620' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:22:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:22:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:22:57 compute-1 nova_compute[238822]: 2025-09-30 18:22:57.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:22:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:57.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:22:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:58 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:58 compute-1 nova_compute[238822]: 2025-09-30 18:22:58.331 2 INFO nova.compute.manager [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Post operation of migration started
Sep 30 18:22:58 compute-1 nova_compute[238822]: 2025-09-30 18:22:58.332 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:22:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:22:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:22:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:22:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:22:58 compute-1 ceph-mon[75484]: pgmap v1330: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 2.0 KiB/s wr, 0 op/s
Sep 30 18:22:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:22:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:22:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:22:58.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:22:58 compute-1 nova_compute[238822]: 2025-09-30 18:22:58.671 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:58 compute-1 nova_compute[238822]: 2025-09-30 18:22:58.672 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:22:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:58 compute-1 nova_compute[238822]: 2025-09-30 18:22:58.830 2 DEBUG oslo_concurrency.lockutils [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-71a2a65c-86a0-4257-9bd1-1cd4e706fb69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:22:58 compute-1 nova_compute[238822]: 2025-09-30 18:22:58.831 2 DEBUG oslo_concurrency.lockutils [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-71a2a65c-86a0-4257-9bd1-1cd4e706fb69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:22:58 compute-1 nova_compute[238822]: 2025-09-30 18:22:58.831 2 DEBUG nova.network.neutron [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:22:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:59 compute-1 nova_compute[238822]: 2025-09-30 18:22:59.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:22:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:22:59 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:22:59 compute-1 nova_compute[238822]: 2025-09-30 18:22:59.338 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:59 compute-1 nova_compute[238822]: 2025-09-30 18:22:59.762 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:22:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:22:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:22:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:22:59 compute-1 nova_compute[238822]: 2025-09-30 18:22:59.964 2 DEBUG nova.network.neutron [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Updating instance_info_cache with network_info: [{"id": "c08e30da-2028-4b45-9b18-b77d81894e93", "address": "fa:16:3e:ab:60:7a", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc08e30da-20", "ovs_interfaceid": "c08e30da-2028-4b45-9b18-b77d81894e93", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:22:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:22:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:22:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:22:59.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:00 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:00 compute-1 ceph-mon[75484]: pgmap v1331: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.6 KiB/s rd, 2.1 KiB/s wr, 5 op/s
Sep 30 18:23:00 compute-1 nova_compute[238822]: 2025-09-30 18:23:00.471 2 DEBUG oslo_concurrency.lockutils [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-71a2a65c-86a0-4257-9bd1-1cd4e706fb69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:23:00 compute-1 podman[281019]: 2025-09-30 18:23:00.573738723 +0000 UTC m=+0.100665390 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:23:00 compute-1 podman[281018]: 2025-09-30 18:23:00.610974268 +0000 UTC m=+0.141393420 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller)
Sep 30 18:23:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:00.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:00 compute-1 nova_compute[238822]: 2025-09-30 18:23:00.991 2 DEBUG oslo_concurrency.lockutils [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:00 compute-1 nova_compute[238822]: 2025-09-30 18:23:00.992 2 DEBUG oslo_concurrency.lockutils [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:00 compute-1 nova_compute[238822]: 2025-09-30 18:23:00.993 2 DEBUG oslo_concurrency.lockutils [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:01 compute-1 nova_compute[238822]: 2025-09-30 18:23:01.000 2 INFO nova.virt.libvirt.driver [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:23:01 compute-1 virtqemud[239124]: Domain id=11 name='instance-0000000e' uuid=71a2a65c-86a0-4257-9bd1-1cd4e706fb69 is tainted: custom-monitor
Sep 30 18:23:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:01 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:01.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:02 compute-1 nova_compute[238822]: 2025-09-30 18:23:02.011 2 INFO nova.virt.libvirt.driver [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:23:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:02 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:02 compute-1 ceph-mon[75484]: pgmap v1332: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.6 KiB/s rd, 1.1 KiB/s wr, 5 op/s
Sep 30 18:23:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:02.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:02 compute-1 nova_compute[238822]: 2025-09-30 18:23:02.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:03 compute-1 nova_compute[238822]: 2025-09-30 18:23:03.019 2 INFO nova.virt.libvirt.driver [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:23:03 compute-1 nova_compute[238822]: 2025-09-30 18:23:03.025 2 DEBUG nova.compute.manager [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:23:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:03 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:03 compute-1 nova_compute[238822]: 2025-09-30 18:23:03.538 2 DEBUG nova.objects.instance [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:23:03 compute-1 sudo[281073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:23:03 compute-1 sudo[281073]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:23:03 compute-1 sudo[281073]: pam_unix(sudo:session): session closed for user root
Sep 30 18:23:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:23:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:03.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:04 compute-1 nova_compute[238822]: 2025-09-30 18:23:04.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:04 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0002ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:23:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:23:04 compute-1 nova_compute[238822]: 2025-09-30 18:23:04.556 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:23:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:04.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:05 compute-1 nova_compute[238822]: 2025-09-30 18:23:05.226 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:23:05 compute-1 nova_compute[238822]: 2025-09-30 18:23:05.226 2 WARNING neutronclient.v2_0.client [None req-133c1b82-b925-4555-94b9-892a28b8d3a1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:23:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:05 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac003020 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:05 compute-1 ceph-mon[75484]: pgmap v1333: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 1.1 KiB/s wr, 5 op/s
Sep 30 18:23:05 compute-1 podman[249638]: time="2025-09-30T18:23:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:23:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:23:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 39667 "" "Go-http-client/1.1"
Sep 30 18:23:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:23:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 9292 "" "Go-http-client/1.1"
Sep 30 18:23:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:05.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:06 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:06 compute-1 ceph-mon[75484]: pgmap v1334: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.6 KiB/s rd, 85 B/s wr, 5 op/s
Sep 30 18:23:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:06.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:07 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:07 compute-1 nova_compute[238822]: 2025-09-30 18:23:07.389 2 DEBUG oslo_concurrency.lockutils [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "6410d0a2-466d-41d9-a863-f756714e17c5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:07 compute-1 nova_compute[238822]: 2025-09-30 18:23:07.390 2 DEBUG oslo_concurrency.lockutils [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:07 compute-1 nova_compute[238822]: 2025-09-30 18:23:07.390 2 DEBUG oslo_concurrency.lockutils [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:07 compute-1 nova_compute[238822]: 2025-09-30 18:23:07.390 2 DEBUG oslo_concurrency.lockutils [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:07 compute-1 nova_compute[238822]: 2025-09-30 18:23:07.391 2 DEBUG oslo_concurrency.lockutils [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:07 compute-1 nova_compute[238822]: 2025-09-30 18:23:07.408 2 INFO nova.compute.manager [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Terminating instance
Sep 30 18:23:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:23:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2209030715' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:23:07 compute-1 sshd-session[281101]: Invalid user minecraft from 84.51.43.58 port 58540
Sep 30 18:23:07 compute-1 sshd-session[281101]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:23:07 compute-1 sshd-session[281101]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:23:07 compute-1 podman[281104]: 2025-09-30 18:23:07.689314431 +0000 UTC m=+0.090908647 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Sep 30 18:23:07 compute-1 nova_compute[238822]: 2025-09-30 18:23:07.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:07 compute-1 nova_compute[238822]: 2025-09-30 18:23:07.942 2 DEBUG nova.compute.manager [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:23:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:07.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:08 compute-1 kernel: tap9230ef6a-67 (unregistering): left promiscuous mode
Sep 30 18:23:08 compute-1 NetworkManager[45549]: <info>  [1759256588.0085] device (tap9230ef6a-67): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:08 compute-1 ovn_controller[135204]: 2025-09-30T18:23:08Z|00129|binding|INFO|Releasing lport 9230ef6a-67da-4be6-9a8f-3fa248015ba4 from this chassis (sb_readonly=0)
Sep 30 18:23:08 compute-1 ovn_controller[135204]: 2025-09-30T18:23:08Z|00130|binding|INFO|Setting lport 9230ef6a-67da-4be6-9a8f-3fa248015ba4 down in Southbound
Sep 30 18:23:08 compute-1 ovn_controller[135204]: 2025-09-30T18:23:08Z|00131|binding|INFO|Removing iface tap9230ef6a-67 ovn-installed in OVS
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.026 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5c:4d:33 10.100.0.13'], port_security=['fa:16:3e:5c:4d:33 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6410d0a2-466d-41d9-a863-f756714e17c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-443be7ca-f628-4a45-95b6-620d37172d7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af4ef07c582847419a03275af50c6ffc', 'neutron:revision_number': '5', 'neutron:security_group_ids': '518a9c00-28f9-47ab-a122-e672192eedea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=96eb21b8-879c-4e72-963b-37e37ae3d0c5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=9230ef6a-67da-4be6-9a8f-3fa248015ba4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.028 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 9230ef6a-67da-4be6-9a8f-3fa248015ba4 in datapath 443be7ca-f628-4a45-95b6-620d37172d7b unbound from our chassis
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.031 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 443be7ca-f628-4a45-95b6-620d37172d7b
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.056 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e09693cf-a074-4029-a40a-08ad6c9ea311]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:08 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Sep 30 18:23:08 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000f.scope: Consumed 15.449s CPU time.
Sep 30 18:23:08 compute-1 systemd-machined[195911]: Machine qemu-10-instance-0000000f terminated.
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.110 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[c48b3236-737c-4ab4-94a7-e4d914339f95]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.115 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[51620c05-e08c-47a2-9bfc-5587c14de8f9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.157 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[8e2e3517-2596-426a-855a-4c51051b3365]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.182 2 DEBUG nova.compute.manager [req-4d9c64d7-0736-44c9-a563-059cd1d62f47 req-c08a3f2a-3696-43ec-807e-978b4f528a5c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Received event network-vif-unplugged-9230ef6a-67da-4be6-9a8f-3fa248015ba4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.183 2 DEBUG oslo_concurrency.lockutils [req-4d9c64d7-0736-44c9-a563-059cd1d62f47 req-c08a3f2a-3696-43ec-807e-978b4f528a5c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.183 2 DEBUG oslo_concurrency.lockutils [req-4d9c64d7-0736-44c9-a563-059cd1d62f47 req-c08a3f2a-3696-43ec-807e-978b4f528a5c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.184 2 DEBUG oslo_concurrency.lockutils [req-4d9c64d7-0736-44c9-a563-059cd1d62f47 req-c08a3f2a-3696-43ec-807e-978b4f528a5c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.184 2 DEBUG nova.compute.manager [req-4d9c64d7-0736-44c9-a563-059cd1d62f47 req-c08a3f2a-3696-43ec-807e-978b4f528a5c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] No waiting events found dispatching network-vif-unplugged-9230ef6a-67da-4be6-9a8f-3fa248015ba4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.184 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8926e8f1-fe83-4cf3-bd07-ad64a7fe87ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap443be7ca-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:7f:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1438218, 'reachable_time': 19300, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281138, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.185 2 DEBUG nova.compute.manager [req-4d9c64d7-0736-44c9-a563-059cd1d62f47 req-c08a3f2a-3696-43ec-807e-978b4f528a5c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Received event network-vif-unplugged-9230ef6a-67da-4be6-9a8f-3fa248015ba4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.195 2 INFO nova.virt.libvirt.driver [-] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Instance destroyed successfully.
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.196 2 DEBUG nova.objects.instance [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lazy-loading 'resources' on Instance uuid 6410d0a2-466d-41d9-a863-f756714e17c5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:23:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:08 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84a0002ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.216 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[77d14d52-aa06-40f9-8ee2-b1cc26c69972]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap443be7ca-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1438233, 'tstamp': 1438233}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281146, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap443be7ca-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1438237, 'tstamp': 1438237}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281146, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.218 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap443be7ca-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.228 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap443be7ca-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.229 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.229 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap443be7ca-f0, col_values=(('external_ids', {'iface-id': '031d2cff-b142-4423-ba99-772183b7a667'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.229 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:23:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:08.231 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3c6ebbbd-f91b-48b0-9129-1fee096f848e]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-443be7ca-f628-4a45-95b6-620d37172d7b\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 443be7ca-f628-4a45-95b6-620d37172d7b\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:08 compute-1 ceph-mon[75484]: pgmap v1335: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.6 KiB/s rd, 85 B/s wr, 5 op/s
Sep 30 18:23:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:08.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.703 2 DEBUG nova.virt.libvirt.vif [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:22:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1149902309',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1149902309',id=15,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:22:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='af4ef07c582847419a03275af50c6ffc',ramdisk_id='',reservation_id='r-0z3b6qta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:22:22Z,user_data=None,user_id='57be6c3d2e0d431dae0127ac659de1e0',uuid=6410d0a2-466d-41d9-a863-f756714e17c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "address": "fa:16:3e:5c:4d:33", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9230ef6a-67", "ovs_interfaceid": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.704 2 DEBUG nova.network.os_vif_util [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converting VIF {"id": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "address": "fa:16:3e:5c:4d:33", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9230ef6a-67", "ovs_interfaceid": "9230ef6a-67da-4be6-9a8f-3fa248015ba4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.704 2 DEBUG nova.network.os_vif_util [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5c:4d:33,bridge_name='br-int',has_traffic_filtering=True,id=9230ef6a-67da-4be6-9a8f-3fa248015ba4,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9230ef6a-67') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.705 2 DEBUG os_vif [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:4d:33,bridge_name='br-int',has_traffic_filtering=True,id=9230ef6a-67da-4be6-9a8f-3fa248015ba4,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9230ef6a-67') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.707 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9230ef6a-67, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.713 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=05bca08c-12a3-4a6f-8bb4-4050c8a44dc5) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:08 compute-1 nova_compute[238822]: 2025-09-30 18:23:08.718 2 INFO os_vif [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:4d:33,bridge_name='br-int',has_traffic_filtering=True,id=9230ef6a-67da-4be6-9a8f-3fa248015ba4,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9230ef6a-67')
Sep 30 18:23:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:09 compute-1 nova_compute[238822]: 2025-09-30 18:23:09.169 2 INFO nova.virt.libvirt.driver [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Deleting instance files /var/lib/nova/instances/6410d0a2-466d-41d9-a863-f756714e17c5_del
Sep 30 18:23:09 compute-1 nova_compute[238822]: 2025-09-30 18:23:09.172 2 INFO nova.virt.libvirt.driver [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Deletion of /var/lib/nova/instances/6410d0a2-466d-41d9-a863-f756714e17c5_del complete
Sep 30 18:23:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:09 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac0041c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1217540508' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:23:09 compute-1 nova_compute[238822]: 2025-09-30 18:23:09.687 2 INFO nova.compute.manager [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Took 1.74 seconds to destroy the instance on the hypervisor.
Sep 30 18:23:09 compute-1 nova_compute[238822]: 2025-09-30 18:23:09.687 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:23:09 compute-1 nova_compute[238822]: 2025-09-30 18:23:09.688 2 DEBUG nova.compute.manager [-] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:23:09 compute-1 nova_compute[238822]: 2025-09-30 18:23:09.688 2 DEBUG nova.network.neutron [-] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:23:09 compute-1 nova_compute[238822]: 2025-09-30 18:23:09.688 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:23:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:09 compute-1 sshd-session[281101]: Failed password for invalid user minecraft from 84.51.43.58 port 58540 ssh2
Sep 30 18:23:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:10.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:10 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:10 compute-1 nova_compute[238822]: 2025-09-30 18:23:10.223 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:23:10 compute-1 nova_compute[238822]: 2025-09-30 18:23:10.253 2 DEBUG nova.compute.manager [req-3bb0a133-3433-4a2f-820f-de511d874d78 req-e735c64e-e109-4dd9-9bf4-bc27d5559ab8 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Received event network-vif-unplugged-9230ef6a-67da-4be6-9a8f-3fa248015ba4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:23:10 compute-1 nova_compute[238822]: 2025-09-30 18:23:10.254 2 DEBUG oslo_concurrency.lockutils [req-3bb0a133-3433-4a2f-820f-de511d874d78 req-e735c64e-e109-4dd9-9bf4-bc27d5559ab8 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:10 compute-1 nova_compute[238822]: 2025-09-30 18:23:10.254 2 DEBUG oslo_concurrency.lockutils [req-3bb0a133-3433-4a2f-820f-de511d874d78 req-e735c64e-e109-4dd9-9bf4-bc27d5559ab8 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:10 compute-1 nova_compute[238822]: 2025-09-30 18:23:10.254 2 DEBUG oslo_concurrency.lockutils [req-3bb0a133-3433-4a2f-820f-de511d874d78 req-e735c64e-e109-4dd9-9bf4-bc27d5559ab8 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:10 compute-1 nova_compute[238822]: 2025-09-30 18:23:10.255 2 DEBUG nova.compute.manager [req-3bb0a133-3433-4a2f-820f-de511d874d78 req-e735c64e-e109-4dd9-9bf4-bc27d5559ab8 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] No waiting events found dispatching network-vif-unplugged-9230ef6a-67da-4be6-9a8f-3fa248015ba4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:23:10 compute-1 nova_compute[238822]: 2025-09-30 18:23:10.255 2 DEBUG nova.compute.manager [req-3bb0a133-3433-4a2f-820f-de511d874d78 req-e735c64e-e109-4dd9-9bf4-bc27d5559ab8 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Received event network-vif-unplugged-9230ef6a-67da-4be6-9a8f-3fa248015ba4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:23:10 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:10.408 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:23:10 compute-1 nova_compute[238822]: 2025-09-30 18:23:10.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:10 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:10.410 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:23:10 compute-1 nova_compute[238822]: 2025-09-30 18:23:10.503 2 DEBUG nova.compute.manager [req-d9f272df-9fff-4d1a-a498-8f5ee122da01 req-173c4724-7678-48fd-8a48-beae66391815 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Received event network-vif-deleted-9230ef6a-67da-4be6-9a8f-3fa248015ba4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:23:10 compute-1 nova_compute[238822]: 2025-09-30 18:23:10.504 2 INFO nova.compute.manager [req-d9f272df-9fff-4d1a-a498-8f5ee122da01 req-173c4724-7678-48fd-8a48-beae66391815 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Neutron deleted interface 9230ef6a-67da-4be6-9a8f-3fa248015ba4; detaching it from the instance and deleting it from the info cache
Sep 30 18:23:10 compute-1 nova_compute[238822]: 2025-09-30 18:23:10.504 2 DEBUG nova.network.neutron [req-d9f272df-9fff-4d1a-a498-8f5ee122da01 req-173c4724-7678-48fd-8a48-beae66391815 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:23:10 compute-1 ceph-mon[75484]: pgmap v1336: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 2.4 KiB/s wr, 6 op/s
Sep 30 18:23:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:10.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:10 compute-1 nova_compute[238822]: 2025-09-30 18:23:10.955 2 DEBUG nova.network.neutron [-] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:23:11 compute-1 nova_compute[238822]: 2025-09-30 18:23:11.012 2 DEBUG nova.compute.manager [req-d9f272df-9fff-4d1a-a498-8f5ee122da01 req-173c4724-7678-48fd-8a48-beae66391815 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Detach interface failed, port_id=9230ef6a-67da-4be6-9a8f-3fa248015ba4, reason: Instance 6410d0a2-466d-41d9-a863-f756714e17c5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:23:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:11 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:11 compute-1 nova_compute[238822]: 2025-09-30 18:23:11.463 2 INFO nova.compute.manager [-] [instance: 6410d0a2-466d-41d9-a863-f756714e17c5] Took 1.78 seconds to deallocate network for instance.
Sep 30 18:23:11 compute-1 sshd-session[281101]: Received disconnect from 84.51.43.58 port 58540:11: Bye Bye [preauth]
Sep 30 18:23:11 compute-1 sshd-session[281101]: Disconnected from invalid user minecraft 84.51.43.58 port 58540 [preauth]
Sep 30 18:23:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:23:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:12.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:23:12 compute-1 nova_compute[238822]: 2025-09-30 18:23:12.020 2 DEBUG oslo_concurrency.lockutils [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:12 compute-1 nova_compute[238822]: 2025-09-30 18:23:12.021 2 DEBUG oslo_concurrency.lockutils [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:12 compute-1 nova_compute[238822]: 2025-09-30 18:23:12.087 2 DEBUG oslo_concurrency.processutils [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:23:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:12 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:12 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:12.413 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:23:12 compute-1 ceph-mon[75484]: pgmap v1337: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:23:12 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:23:12 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3217688103' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:23:12 compute-1 nova_compute[238822]: 2025-09-30 18:23:12.634 2 DEBUG oslo_concurrency.processutils [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.547s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:23:12 compute-1 nova_compute[238822]: 2025-09-30 18:23:12.643 2 DEBUG nova.compute.provider_tree [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:23:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:12.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:12 compute-1 nova_compute[238822]: 2025-09-30 18:23:12.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:13 compute-1 nova_compute[238822]: 2025-09-30 18:23:13.156 2 DEBUG nova.scheduler.client.report [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:23:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:13 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84ac004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:13 compute-1 sudo[281198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:23:13 compute-1 sudo[281198]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:23:13 compute-1 sudo[281198]: pam_unix(sudo:session): session closed for user root
Sep 30 18:23:13 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3217688103' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:23:13 compute-1 nova_compute[238822]: 2025-09-30 18:23:13.669 2 DEBUG oslo_concurrency.lockutils [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.648s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:23:13 compute-1 nova_compute[238822]: 2025-09-30 18:23:13.698 2 INFO nova.scheduler.client.report [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Deleted allocations for instance 6410d0a2-466d-41d9-a863-f756714e17c5
Sep 30 18:23:13 compute-1 nova_compute[238822]: 2025-09-30 18:23:13.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:13 compute-1 unix_chkpwd[281223]: password check failed for user (root)
Sep 30 18:23:13 compute-1 sshd-session[281174]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:23:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:14.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:14 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:14 compute-1 ceph-mon[75484]: pgmap v1338: 353 pgs: 353 active+clean; 121 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 12 KiB/s wr, 30 op/s
Sep 30 18:23:14 compute-1 podman[281227]: 2025-09-30 18:23:14.539571148 +0000 UTC m=+0.067402012 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41)
Sep 30 18:23:14 compute-1 podman[281226]: 2025-09-30 18:23:14.547633266 +0000 UTC m=+0.076898078 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:23:14 compute-1 podman[281228]: 2025-09-30 18:23:14.549108425 +0000 UTC m=+0.074076811 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4)
Sep 30 18:23:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:14.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:14 compute-1 nova_compute[238822]: 2025-09-30 18:23:14.730 2 DEBUG oslo_concurrency.lockutils [None req-afcf1db2-b212-431f-b29d-53963d110af8 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "6410d0a2-466d-41d9-a863-f756714e17c5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.341s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:15 compute-1 nova_compute[238822]: 2025-09-30 18:23:15.041 2 DEBUG oslo_concurrency.lockutils [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "71a2a65c-86a0-4257-9bd1-1cd4e706fb69" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:15 compute-1 nova_compute[238822]: 2025-09-30 18:23:15.042 2 DEBUG oslo_concurrency.lockutils [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "71a2a65c-86a0-4257-9bd1-1cd4e706fb69" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:15 compute-1 nova_compute[238822]: 2025-09-30 18:23:15.042 2 DEBUG oslo_concurrency.lockutils [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "71a2a65c-86a0-4257-9bd1-1cd4e706fb69-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:15 compute-1 nova_compute[238822]: 2025-09-30 18:23:15.043 2 DEBUG oslo_concurrency.lockutils [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "71a2a65c-86a0-4257-9bd1-1cd4e706fb69-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:15 compute-1 nova_compute[238822]: 2025-09-30 18:23:15.043 2 DEBUG oslo_concurrency.lockutils [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "71a2a65c-86a0-4257-9bd1-1cd4e706fb69-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:15 compute-1 nova_compute[238822]: 2025-09-30 18:23:15.057 2 INFO nova.compute.manager [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Terminating instance
Sep 30 18:23:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:15 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:15 compute-1 sshd-session[281174]: Failed password for root from 192.210.160.141 port 43184 ssh2
Sep 30 18:23:15 compute-1 nova_compute[238822]: 2025-09-30 18:23:15.576 2 DEBUG nova.compute.manager [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:23:15 compute-1 kernel: tapc08e30da-20 (unregistering): left promiscuous mode
Sep 30 18:23:15 compute-1 NetworkManager[45549]: <info>  [1759256595.6575] device (tapc08e30da-20): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:23:15 compute-1 ovn_controller[135204]: 2025-09-30T18:23:15Z|00132|binding|INFO|Releasing lport c08e30da-2028-4b45-9b18-b77d81894e93 from this chassis (sb_readonly=0)
Sep 30 18:23:15 compute-1 ovn_controller[135204]: 2025-09-30T18:23:15Z|00133|binding|INFO|Setting lport c08e30da-2028-4b45-9b18-b77d81894e93 down in Southbound
Sep 30 18:23:15 compute-1 ovn_controller[135204]: 2025-09-30T18:23:15Z|00134|binding|INFO|Removing iface tapc08e30da-20 ovn-installed in OVS
Sep 30 18:23:15 compute-1 nova_compute[238822]: 2025-09-30 18:23:15.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:15 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:15.677 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:60:7a 10.100.0.12'], port_security=['fa:16:3e:ab:60:7a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '71a2a65c-86a0-4257-9bd1-1cd4e706fb69', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-443be7ca-f628-4a45-95b6-620d37172d7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af4ef07c582847419a03275af50c6ffc', 'neutron:revision_number': '14', 'neutron:security_group_ids': '518a9c00-28f9-47ab-a122-e672192eedea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=96eb21b8-879c-4e72-963b-37e37ae3d0c5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=c08e30da-2028-4b45-9b18-b77d81894e93) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:23:15 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:15.678 144543 INFO neutron.agent.ovn.metadata.agent [-] Port c08e30da-2028-4b45-9b18-b77d81894e93 in datapath 443be7ca-f628-4a45-95b6-620d37172d7b unbound from our chassis
Sep 30 18:23:15 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:15.680 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 443be7ca-f628-4a45-95b6-620d37172d7b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:23:15 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:15.681 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[89c4f198-8bf4-4bc4-b668-a8f08d4af0e6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:15 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:15.682 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b namespace which is not needed anymore
Sep 30 18:23:15 compute-1 nova_compute[238822]: 2025-09-30 18:23:15.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:15 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Sep 30 18:23:15 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000e.scope: Consumed 3.054s CPU time.
Sep 30 18:23:15 compute-1 systemd-machined[195911]: Machine qemu-11-instance-0000000e terminated.
Sep 30 18:23:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:15 compute-1 nova_compute[238822]: 2025-09-30 18:23:15.829 2 INFO nova.virt.libvirt.driver [-] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Instance destroyed successfully.
Sep 30 18:23:15 compute-1 nova_compute[238822]: 2025-09-30 18:23:15.830 2 DEBUG nova.objects.instance [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lazy-loading 'resources' on Instance uuid 71a2a65c-86a0-4257-9bd1-1cd4e706fb69 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:23:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:15 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[280566]: [NOTICE]   (280570) : haproxy version is 3.0.5-8e879a5
Sep 30 18:23:15 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[280566]: [NOTICE]   (280570) : path to executable is /usr/sbin/haproxy
Sep 30 18:23:15 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[280566]: [WARNING]  (280570) : Exiting Master process...
Sep 30 18:23:15 compute-1 podman[281313]: 2025-09-30 18:23:15.887924833 +0000 UTC m=+0.055935711 container kill 12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930)
Sep 30 18:23:15 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[280566]: [ALERT]    (280570) : Current worker (280574) exited with code 143 (Terminated)
Sep 30 18:23:15 compute-1 neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b[280566]: [WARNING]  (280570) : All workers exited. Exiting... (0)
Sep 30 18:23:15 compute-1 systemd[1]: libpod-12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236.scope: Deactivated successfully.
Sep 30 18:23:15 compute-1 podman[281337]: 2025-09-30 18:23:15.944746188 +0000 UTC m=+0.035020167 container died 12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 18:23:15 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236-userdata-shm.mount: Deactivated successfully.
Sep 30 18:23:15 compute-1 systemd[1]: var-lib-containers-storage-overlay-6bfb28e0460be9d4a72e54976bda0b56781c84f88c96c65749caf3162a95ce07-merged.mount: Deactivated successfully.
Sep 30 18:23:16 compute-1 podman[281337]: 2025-09-30 18:23:16.003856735 +0000 UTC m=+0.094130714 container cleanup 12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 18:23:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:16.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:16 compute-1 systemd[1]: libpod-conmon-12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236.scope: Deactivated successfully.
Sep 30 18:23:16 compute-1 podman[281344]: 2025-09-30 18:23:16.035060037 +0000 UTC m=+0.094581945 container remove 12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:23:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:16.046 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[996990ff-62bf-4969-97fd-47debcd18d89]: (4, ("Tue Sep 30 06:23:15 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b (12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236)\n12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236\nTue Sep 30 06:23:15 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b (12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236)\n12d176292769d8f3024e08da63834ff15cd23c70011a964f811116d371249236\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:16.048 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[00a89a88-b54f-4359-a27a-bb0c23388559]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:16.050 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/443be7ca-f628-4a45-95b6-620d37172d7b.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:23:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:16.051 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c8d55908-91c1-47be-9f9b-610a4e110bd3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:16.052 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap443be7ca-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:16 compute-1 kernel: tap443be7ca-f0: left promiscuous mode
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:16.098 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[36679464-2300-495a-b95f-c61e07c18188]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:16.123 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[477b482b-7d1d-464e-ac18-4c8c02e14f7b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:16.124 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[18e713a6-eef9-4c7f-972c-6573efd99cdc]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:16.152 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[36f496f5-153c-4ff7-b348-a42e2b4184ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1438207, 'reachable_time': 30079, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281373, 'error': None, 'target': 'ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:16.158 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-443be7ca-f628-4a45-95b6-620d37172d7b deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:23:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:16.158 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[fdc55663-8bdc-4656-94e1-9f9468ebf07a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:16 compute-1 systemd[1]: run-netns-ovnmeta\x2d443be7ca\x2df628\x2d4a45\x2d95b6\x2d620d37172d7b.mount: Deactivated successfully.
Sep 30 18:23:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:16 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.327 2 DEBUG nova.compute.manager [req-cd8523fa-4039-4f75-b0e7-75cc66de6429 req-8dbb4d81-5ac6-4e9b-97a7-054637fc5533 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Received event network-vif-unplugged-c08e30da-2028-4b45-9b18-b77d81894e93 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.328 2 DEBUG oslo_concurrency.lockutils [req-cd8523fa-4039-4f75-b0e7-75cc66de6429 req-8dbb4d81-5ac6-4e9b-97a7-054637fc5533 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "71a2a65c-86a0-4257-9bd1-1cd4e706fb69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.328 2 DEBUG oslo_concurrency.lockutils [req-cd8523fa-4039-4f75-b0e7-75cc66de6429 req-8dbb4d81-5ac6-4e9b-97a7-054637fc5533 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "71a2a65c-86a0-4257-9bd1-1cd4e706fb69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.329 2 DEBUG oslo_concurrency.lockutils [req-cd8523fa-4039-4f75-b0e7-75cc66de6429 req-8dbb4d81-5ac6-4e9b-97a7-054637fc5533 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "71a2a65c-86a0-4257-9bd1-1cd4e706fb69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.329 2 DEBUG nova.compute.manager [req-cd8523fa-4039-4f75-b0e7-75cc66de6429 req-8dbb4d81-5ac6-4e9b-97a7-054637fc5533 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] No waiting events found dispatching network-vif-unplugged-c08e30da-2028-4b45-9b18-b77d81894e93 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.329 2 DEBUG nova.compute.manager [req-cd8523fa-4039-4f75-b0e7-75cc66de6429 req-8dbb4d81-5ac6-4e9b-97a7-054637fc5533 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Received event network-vif-unplugged-c08e30da-2028-4b45-9b18-b77d81894e93 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.340 2 DEBUG nova.virt.libvirt.vif [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:21:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1057470913',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1057470913',id=14,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:21:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='af4ef07c582847419a03275af50c6ffc',ramdisk_id='',reservation_id='r-1zyfmf1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1597156537-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:23:04Z,user_data=None,user_id='57be6c3d2e0d431dae0127ac659de1e0',uuid=71a2a65c-86a0-4257-9bd1-1cd4e706fb69,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c08e30da-2028-4b45-9b18-b77d81894e93", "address": "fa:16:3e:ab:60:7a", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc08e30da-20", "ovs_interfaceid": "c08e30da-2028-4b45-9b18-b77d81894e93", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.341 2 DEBUG nova.network.os_vif_util [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converting VIF {"id": "c08e30da-2028-4b45-9b18-b77d81894e93", "address": "fa:16:3e:ab:60:7a", "network": {"id": "443be7ca-f628-4a45-95b6-620d37172d7b", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1888091317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "269f60e72ce1460a98da519466c89da6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc08e30da-20", "ovs_interfaceid": "c08e30da-2028-4b45-9b18-b77d81894e93", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.342 2 DEBUG nova.network.os_vif_util [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:60:7a,bridge_name='br-int',has_traffic_filtering=True,id=c08e30da-2028-4b45-9b18-b77d81894e93,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc08e30da-20') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.342 2 DEBUG os_vif [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:60:7a,bridge_name='br-int',has_traffic_filtering=True,id=c08e30da-2028-4b45-9b18-b77d81894e93,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc08e30da-20') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.350 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc08e30da-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.356 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0eb5c921-32bb-46bd-accd-78457c62bcc9) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:16 compute-1 nova_compute[238822]: 2025-09-30 18:23:16.362 2 INFO os_vif [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:60:7a,bridge_name='br-int',has_traffic_filtering=True,id=c08e30da-2028-4b45-9b18-b77d81894e93,network=Network(443be7ca-f628-4a45-95b6-620d37172d7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc08e30da-20')
Sep 30 18:23:16 compute-1 ceph-mon[75484]: pgmap v1339: 353 pgs: 353 active+clean; 121 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 12 KiB/s wr, 30 op/s
Sep 30 18:23:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:16.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:17 compute-1 nova_compute[238822]: 2025-09-30 18:23:17.009 2 INFO nova.virt.libvirt.driver [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Deleting instance files /var/lib/nova/instances/71a2a65c-86a0-4257-9bd1-1cd4e706fb69_del
Sep 30 18:23:17 compute-1 nova_compute[238822]: 2025-09-30 18:23:17.010 2 INFO nova.virt.libvirt.driver [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Deletion of /var/lib/nova/instances/71a2a65c-86a0-4257-9bd1-1cd4e706fb69_del complete
Sep 30 18:23:17 compute-1 sshd-session[281174]: Connection closed by authenticating user root 192.210.160.141 port 43184 [preauth]
Sep 30 18:23:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:17 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84880026d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:17 compute-1 nova_compute[238822]: 2025-09-30 18:23:17.529 2 INFO nova.compute.manager [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Took 1.95 seconds to destroy the instance on the hypervisor.
Sep 30 18:23:17 compute-1 nova_compute[238822]: 2025-09-30 18:23:17.530 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:23:17 compute-1 nova_compute[238822]: 2025-09-30 18:23:17.531 2 DEBUG nova.compute.manager [-] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:23:17 compute-1 nova_compute[238822]: 2025-09-30 18:23:17.531 2 DEBUG nova.network.neutron [-] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:23:17 compute-1 nova_compute[238822]: 2025-09-30 18:23:17.531 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:23:17 compute-1 nova_compute[238822]: 2025-09-30 18:23:17.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:18.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:18 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:18 compute-1 nova_compute[238822]: 2025-09-30 18:23:18.233 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:23:18 compute-1 nova_compute[238822]: 2025-09-30 18:23:18.374 2 DEBUG nova.compute.manager [req-0a179031-b5e7-4659-9bee-e6edc0ab22d7 req-9148cfde-1bcf-4db1-b460-c389ec8a643e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Received event network-vif-unplugged-c08e30da-2028-4b45-9b18-b77d81894e93 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:23:18 compute-1 nova_compute[238822]: 2025-09-30 18:23:18.375 2 DEBUG oslo_concurrency.lockutils [req-0a179031-b5e7-4659-9bee-e6edc0ab22d7 req-9148cfde-1bcf-4db1-b460-c389ec8a643e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "71a2a65c-86a0-4257-9bd1-1cd4e706fb69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:18 compute-1 nova_compute[238822]: 2025-09-30 18:23:18.375 2 DEBUG oslo_concurrency.lockutils [req-0a179031-b5e7-4659-9bee-e6edc0ab22d7 req-9148cfde-1bcf-4db1-b460-c389ec8a643e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "71a2a65c-86a0-4257-9bd1-1cd4e706fb69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:18 compute-1 nova_compute[238822]: 2025-09-30 18:23:18.375 2 DEBUG oslo_concurrency.lockutils [req-0a179031-b5e7-4659-9bee-e6edc0ab22d7 req-9148cfde-1bcf-4db1-b460-c389ec8a643e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "71a2a65c-86a0-4257-9bd1-1cd4e706fb69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:18 compute-1 nova_compute[238822]: 2025-09-30 18:23:18.376 2 DEBUG nova.compute.manager [req-0a179031-b5e7-4659-9bee-e6edc0ab22d7 req-9148cfde-1bcf-4db1-b460-c389ec8a643e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] No waiting events found dispatching network-vif-unplugged-c08e30da-2028-4b45-9b18-b77d81894e93 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:23:18 compute-1 nova_compute[238822]: 2025-09-30 18:23:18.376 2 DEBUG nova.compute.manager [req-0a179031-b5e7-4659-9bee-e6edc0ab22d7 req-9148cfde-1bcf-4db1-b460-c389ec8a643e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Received event network-vif-unplugged-c08e30da-2028-4b45-9b18-b77d81894e93 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:23:18 compute-1 ceph-mon[75484]: pgmap v1340: 353 pgs: 353 active+clean; 121 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 12 KiB/s wr, 30 op/s
Sep 30 18:23:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:18.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:23:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:19 compute-1 nova_compute[238822]: 2025-09-30 18:23:19.001 2 DEBUG nova.network.neutron [-] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:23:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:19 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:19 compute-1 openstack_network_exporter[251957]: ERROR   18:23:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:23:19 compute-1 openstack_network_exporter[251957]: ERROR   18:23:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:23:19 compute-1 openstack_network_exporter[251957]: ERROR   18:23:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:23:19 compute-1 openstack_network_exporter[251957]: ERROR   18:23:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:23:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:23:19 compute-1 openstack_network_exporter[251957]: ERROR   18:23:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:23:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:23:19 compute-1 nova_compute[238822]: 2025-09-30 18:23:19.509 2 INFO nova.compute.manager [-] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Took 1.98 seconds to deallocate network for instance.
Sep 30 18:23:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:20.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:20 compute-1 nova_compute[238822]: 2025-09-30 18:23:20.035 2 DEBUG oslo_concurrency.lockutils [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:20 compute-1 nova_compute[238822]: 2025-09-30 18:23:20.036 2 DEBUG oslo_concurrency.lockutils [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:20 compute-1 nova_compute[238822]: 2025-09-30 18:23:20.042 2 DEBUG oslo_concurrency.lockutils [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:20 compute-1 nova_compute[238822]: 2025-09-30 18:23:20.082 2 INFO nova.scheduler.client.report [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Deleted allocations for instance 71a2a65c-86a0-4257-9bd1-1cd4e706fb69
Sep 30 18:23:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:20 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:20 compute-1 ceph-mon[75484]: pgmap v1341: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 13 KiB/s wr, 57 op/s
Sep 30 18:23:20 compute-1 nova_compute[238822]: 2025-09-30 18:23:20.458 2 DEBUG nova.compute.manager [req-15accd49-f4f4-4f0f-97d0-375229796e92 req-5b4fc6f6-0a91-4043-af6d-168b9329ee4c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 71a2a65c-86a0-4257-9bd1-1cd4e706fb69] Received event network-vif-deleted-c08e30da-2028-4b45-9b18-b77d81894e93 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:23:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:20.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:21 compute-1 nova_compute[238822]: 2025-09-30 18:23:21.118 2 DEBUG oslo_concurrency.lockutils [None req-59d5f432-40d9-4c31-bb36-756d8604be34 57be6c3d2e0d431dae0127ac659de1e0 af4ef07c582847419a03275af50c6ffc - - default default] Lock "71a2a65c-86a0-4257-9bd1-1cd4e706fb69" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.076s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:21 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:21 compute-1 nova_compute[238822]: 2025-09-30 18:23:21.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:22.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:22 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:23:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:22.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:22 compute-1 nova_compute[238822]: 2025-09-30 18:23:22.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:23 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:23 compute-1 ceph-mon[75484]: pgmap v1342: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 10 KiB/s wr, 57 op/s
Sep 30 18:23:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:23:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:24.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:24 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:24 compute-1 ceph-mon[75484]: pgmap v1343: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 10 KiB/s wr, 57 op/s
Sep 30 18:23:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:24.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:25 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84880026d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:25 compute-1 nova_compute[238822]: 2025-09-30 18:23:25.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:26.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:26 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:26 compute-1 nova_compute[238822]: 2025-09-30 18:23:26.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:26 compute-1 ceph-mon[75484]: pgmap v1344: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:23:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:26.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:27 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:27 compute-1 nova_compute[238822]: 2025-09-30 18:23:27.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:28.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:28 compute-1 nova_compute[238822]: 2025-09-30 18:23:28.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:23:28 compute-1 nova_compute[238822]: 2025-09-30 18:23:28.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:23:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:28 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:28 compute-1 ceph-mon[75484]: pgmap v1345: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:23:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:23:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:28.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:29 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:30.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:30 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:30 compute-1 ceph-mon[75484]: pgmap v1346: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:23:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:30.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:31 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84800036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:31 compute-1 nova_compute[238822]: 2025-09-30 18:23:31.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:31 compute-1 podman[281412]: 2025-09-30 18:23:31.52501096 +0000 UTC m=+0.066244760 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:23:31 compute-1 podman[281411]: 2025-09-30 18:23:31.587840287 +0000 UTC m=+0.132514100 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:23:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:32.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:32 compute-1 unix_chkpwd[281457]: password check failed for user (root)
Sep 30 18:23:32 compute-1 sshd-session[281409]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=194.107.115.65  user=root
Sep 30 18:23:32 compute-1 nova_compute[238822]: 2025-09-30 18:23:32.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:23:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:32 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84880020f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:32 compute-1 ceph-mon[75484]: pgmap v1347: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:23:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:32.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:32 compute-1 nova_compute[238822]: 2025-09-30 18:23:32.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:33 compute-1 nova_compute[238822]: 2025-09-30 18:23:33.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:23:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:33 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Sep 30 18:23:33 compute-1 sudo[281460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:23:33 compute-1 sudo[281460]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:23:33 compute-1 sudo[281460]: pam_unix(sudo:session): session closed for user root
Sep 30 18:23:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:23:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:34.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:34 compute-1 nova_compute[238822]: 2025-09-30 18:23:34.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:23:34 compute-1 sshd-session[281409]: Failed password for root from 194.107.115.65 port 17926 ssh2
Sep 30 18:23:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg[279008]: 30/09/2025 18:23:34 : epoch 68dc1fa9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003db0 fd 38 proxy ignored for local
Sep 30 18:23:34 compute-1 kernel: ganesha.nfsd[279177]: segfault at 50 ip 00007f855e7e332e sp 00007f8517ffe210 error 4 in libntirpc.so.5.8[7f855e7c8000+2c000] likely on CPU 0 (core 0, socket 0)
Sep 30 18:23:34 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Sep 30 18:23:34 compute-1 systemd[1]: Started Process Core Dump (PID 281486/UID 0).
Sep 30 18:23:34 compute-1 ceph-mon[75484]: pgmap v1348: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 341 B/s rd, 0 op/s
Sep 30 18:23:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:34.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:34 compute-1 sshd-session[281409]: Received disconnect from 194.107.115.65 port 17926:11: Bye Bye [preauth]
Sep 30 18:23:34 compute-1 sshd-session[281409]: Disconnected from authenticating user root 194.107.115.65 port 17926 [preauth]
Sep 30 18:23:35 compute-1 nova_compute[238822]: 2025-09-30 18:23:35.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:23:35 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3386502316' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:23:35 compute-1 systemd-coredump[281487]: Process 279012 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 52:
                                                    #0  0x00007f855e7e332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Sep 30 18:23:35 compute-1 nova_compute[238822]: 2025-09-30 18:23:35.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:35 compute-1 nova_compute[238822]: 2025-09-30 18:23:35.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:35 compute-1 nova_compute[238822]: 2025-09-30 18:23:35.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:35 compute-1 nova_compute[238822]: 2025-09-30 18:23:35.573 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:23:35 compute-1 nova_compute[238822]: 2025-09-30 18:23:35.574 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:23:35 compute-1 systemd[1]: systemd-coredump@13-281486-0.service: Deactivated successfully.
Sep 30 18:23:35 compute-1 systemd[1]: systemd-coredump@13-281486-0.service: Consumed 1.246s CPU time.
Sep 30 18:23:35 compute-1 podman[249638]: time="2025-09-30T18:23:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:23:35 compute-1 podman[281494]: 2025-09-30 18:23:35.648354045 +0000 UTC m=+0.038157512 container died c4878726e70bfde0a701d3cab4ed3c428775ea55102c78e2208dd6046355746d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Sep 30 18:23:35 compute-1 systemd[1]: var-lib-containers-storage-overlay-24343fc14e1d66f24f030c759105dc2bf3925fa5050c074fb2434d26d0cfbbfd-merged.mount: Deactivated successfully.
Sep 30 18:23:35 compute-1 podman[281494]: 2025-09-30 18:23:35.722020314 +0000 UTC m=+0.111823831 container remove c4878726e70bfde0a701d3cab4ed3c428775ea55102c78e2208dd6046355746d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-nfs-cephfs-0-0-compute-1-bsnzkg, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 18:23:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:23:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 38443 "" "Go-http-client/1.1"
Sep 30 18:23:35 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Main process exited, code=exited, status=139/n/a
Sep 30 18:23:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:23:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8349 "" "Go-http-client/1.1"
Sep 30 18:23:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:35 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 18:23:36 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.980s CPU time.
Sep 30 18:23:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:36.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:23:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/348427045' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:23:36 compute-1 nova_compute[238822]: 2025-09-30 18:23:36.085 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:23:36 compute-1 nova_compute[238822]: 2025-09-30 18:23:36.334 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:23:36 compute-1 nova_compute[238822]: 2025-09-30 18:23:36.337 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:23:36 compute-1 nova_compute[238822]: 2025-09-30 18:23:36.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:36 compute-1 nova_compute[238822]: 2025-09-30 18:23:36.379 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:23:36 compute-1 nova_compute[238822]: 2025-09-30 18:23:36.379 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4748MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:23:36 compute-1 nova_compute[238822]: 2025-09-30 18:23:36.380 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:36 compute-1 nova_compute[238822]: 2025-09-30 18:23:36.380 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/348427045' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:23:36 compute-1 ceph-mon[75484]: pgmap v1349: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:23:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Sep 30 18:23:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4203769896' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:23:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Sep 30 18:23:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4203769896' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:23:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:36.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:37 compute-1 nova_compute[238822]: 2025-09-30 18:23:37.430 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:23:37 compute-1 nova_compute[238822]: 2025-09-30 18:23:37.431 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:23:36 up  4:00,  0 user,  load average: 0.34, 0.50, 0.85\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:23:37 compute-1 nova_compute[238822]: 2025-09-30 18:23:37.454 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:23:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4203769896' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:23:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4203769896' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:23:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3754555794' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:23:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:23:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:37.590 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:1b:1b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-275a3eeb-ef50-4b9a-853e-ab955980469b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-275a3eeb-ef50-4b9a-853e-ab955980469b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c53b06039bf4f348ffe63a9201c8e5f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c860265-d2f7-41ef-be31-8eb602810d53, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c620806e-4b76-4583-9d88-892c2b24da08) old=Port_Binding(mac=['fa:16:3e:bb:1b:1b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-275a3eeb-ef50-4b9a-853e-ab955980469b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-275a3eeb-ef50-4b9a-853e-ab955980469b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0c53b06039bf4f348ffe63a9201c8e5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:23:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:37.591 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c620806e-4b76-4583-9d88-892c2b24da08 in datapath 275a3eeb-ef50-4b9a-853e-ab955980469b updated
Sep 30 18:23:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:37.592 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 275a3eeb-ef50-4b9a-853e-ab955980469b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:23:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:37.593 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[96815f6e-aea1-4ac4-8766-c070b02c1eea]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:37 compute-1 nova_compute[238822]: 2025-09-30 18:23:37.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:37 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:23:37 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4159245570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:23:37 compute-1 nova_compute[238822]: 2025-09-30 18:23:37.910 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:23:37 compute-1 nova_compute[238822]: 2025-09-30 18:23:37.916 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:23:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:38.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:38 compute-1 nova_compute[238822]: 2025-09-30 18:23:38.425 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.477006) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256618477092, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 2347, "num_deletes": 251, "total_data_size": 5770543, "memory_usage": 5830352, "flush_reason": "Manual Compaction"}
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Sep 30 18:23:38 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4159245570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:23:38 compute-1 ceph-mon[75484]: pgmap v1350: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256618502123, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 3737815, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36526, "largest_seqno": 38868, "table_properties": {"data_size": 3728541, "index_size": 5768, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19633, "raw_average_key_size": 20, "raw_value_size": 3709833, "raw_average_value_size": 3844, "num_data_blocks": 252, "num_entries": 965, "num_filter_entries": 965, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759256413, "oldest_key_time": 1759256413, "file_creation_time": 1759256618, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 25183 microseconds, and 14287 cpu microseconds.
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.502195) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 3737815 bytes OK
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.502226) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.504381) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.504402) EVENT_LOG_v1 {"time_micros": 1759256618504395, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.504425) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 5760127, prev total WAL file size 5760127, number of live WAL files 2.
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.507216) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(3650KB)], [69(10206KB)]
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256618507281, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 14189722, "oldest_snapshot_seqno": -1}
Sep 30 18:23:38 compute-1 podman[281586]: 2025-09-30 18:23:38.565402409 +0000 UTC m=+0.102177341 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base 
Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6378 keys, 12282613 bytes, temperature: kUnknown
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256618572530, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 12282613, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12241807, "index_size": 23736, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16005, "raw_key_size": 163934, "raw_average_key_size": 25, "raw_value_size": 12128828, "raw_average_value_size": 1901, "num_data_blocks": 951, "num_entries": 6378, "num_filter_entries": 6378, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759256618, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.573041) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 12282613 bytes
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.575140) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 216.7 rd, 187.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 10.0 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(7.1) write-amplify(3.3) OK, records in: 6896, records dropped: 518 output_compression: NoCompression
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.575180) EVENT_LOG_v1 {"time_micros": 1759256618575161, "job": 42, "event": "compaction_finished", "compaction_time_micros": 65481, "compaction_time_cpu_micros": 38974, "output_level": 6, "num_output_files": 1, "total_output_size": 12282613, "num_input_records": 6896, "num_output_records": 6378, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256618576783, "job": 42, "event": "table_file_deletion", "file_number": 71}
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256618581021, "job": 42, "event": "table_file_deletion", "file_number": 69}
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.507032) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.581211) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.581220) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.581222) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.581223) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:23:38 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:23:38.581225) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:23:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:23:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:38.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:38 compute-1 nova_compute[238822]: 2025-09-30 18:23:38.933 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:23:38 compute-1 nova_compute[238822]: 2025-09-30 18:23:38.934 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.554s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:40.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/182340 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 18:23:40 compute-1 ceph-mon[75484]: pgmap v1351: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:23:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:40.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:40 compute-1 unix_chkpwd[281611]: password check failed for user (root)
Sep 30 18:23:40 compute-1 sshd-session[281606]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:23:41 compute-1 nova_compute[238822]: 2025-09-30 18:23:41.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:42.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:42 compute-1 ceph-mon[75484]: pgmap v1352: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:23:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:42.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:42 compute-1 sshd-session[281606]: Failed password for root from 192.210.160.141 port 48414 ssh2
Sep 30 18:23:42 compute-1 nova_compute[238822]: 2025-09-30 18:23:42.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:42 compute-1 nova_compute[238822]: 2025-09-30 18:23:42.929 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:23:42 compute-1 nova_compute[238822]: 2025-09-30 18:23:42.930 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:23:42 compute-1 nova_compute[238822]: 2025-09-30 18:23:42.930 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:23:43 compute-1 nova_compute[238822]: 2025-09-30 18:23:43.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:23:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:23:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:43 compute-1 sshd-session[281606]: Connection closed by authenticating user root 192.210.160.141 port 48414 [preauth]
Sep 30 18:23:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:44.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:44 compute-1 ceph-mon[75484]: pgmap v1353: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 0 op/s
Sep 30 18:23:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:44.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:45 compute-1 podman[281616]: 2025-09-30 18:23:45.582882301 +0000 UTC m=+0.131441203 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid)
Sep 30 18:23:45 compute-1 podman[281618]: 2025-09-30 18:23:45.611015169 +0000 UTC m=+0.137851884 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 18:23:45 compute-1 podman[281617]: 2025-09-30 18:23:45.640535137 +0000 UTC m=+0.176564740 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, distribution-scope=public, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal)
Sep 30 18:23:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:46.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:46 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Scheduled restart job, restart counter is at 14.
Sep 30 18:23:46 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 18:23:46 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Consumed 1.980s CPU time.
Sep 30 18:23:46 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Start request repeated too quickly.
Sep 30 18:23:46 compute-1 systemd[1]: ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b@nfs.cephfs.0.0.compute-1.bsnzkg.service: Failed with result 'exit-code'.
Sep 30 18:23:46 compute-1 systemd[1]: Failed to start Ceph nfs.cephfs.0.0.compute-1.bsnzkg for 63d32c6a-fa18-54ed-8711-9a3915cc367b.
Sep 30 18:23:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:46.337 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:33:a9 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7b746d23-00c6-4893-9766-0d92e4633a53', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b746d23-00c6-4893-9766-0d92e4633a53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31f85dcb85374df695d9e661ebe35eab', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f1059929-d89b-4274-b16c-528ada6d21cb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f6f0e667-67f4-4d4e-acc8-cfbe5cf0907b) old=Port_Binding(mac=['fa:16:3e:c7:33:a9'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-7b746d23-00c6-4893-9766-0d92e4633a53', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b746d23-00c6-4893-9766-0d92e4633a53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31f85dcb85374df695d9e661ebe35eab', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:23:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:46.338 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f6f0e667-67f4-4d4e-acc8-cfbe5cf0907b in datapath 7b746d23-00c6-4893-9766-0d92e4633a53 updated
Sep 30 18:23:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:46.339 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b746d23-00c6-4893-9766-0d92e4633a53, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:23:46 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:46.340 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f80a6632-5fd5-459b-91ee-a8e5ed2d1c91]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:23:46 compute-1 nova_compute[238822]: 2025-09-30 18:23:46.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:46 compute-1 ceph-mon[75484]: pgmap v1354: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:23:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:46.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:47 compute-1 nova_compute[238822]: 2025-09-30 18:23:47.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:48.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:48 compute-1 ceph-mon[75484]: pgmap v1355: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:23:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:23:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:48.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:49 compute-1 openstack_network_exporter[251957]: ERROR   18:23:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:23:49 compute-1 openstack_network_exporter[251957]: ERROR   18:23:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:23:49 compute-1 openstack_network_exporter[251957]: ERROR   18:23:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:23:49 compute-1 openstack_network_exporter[251957]: ERROR   18:23:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:23:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:23:49 compute-1 openstack_network_exporter[251957]: ERROR   18:23:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:23:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:23:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:50.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:50 compute-1 ceph-mon[75484]: pgmap v1356: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:23:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:50.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:51 compute-1 nova_compute[238822]: 2025-09-30 18:23:51.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:52.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:52 compute-1 unix_chkpwd[281684]: password check failed for user (root)
Sep 30 18:23:52 compute-1 sshd-session[281681]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:23:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:23:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:52.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:52 compute-1 nova_compute[238822]: 2025-09-30 18:23:52.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:53 compute-1 ceph-mon[75484]: pgmap v1357: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:23:53 compute-1 sudo[281686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:23:53 compute-1 sudo[281686]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:23:53 compute-1 sudo[281686]: pam_unix(sudo:session): session closed for user root
Sep 30 18:23:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:23:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:54 compute-1 sshd-session[281681]: Failed password for root from 175.126.165.170 port 46382 ssh2
Sep 30 18:23:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:23:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:54.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:23:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:54.379 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:23:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:54.379 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:23:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:23:54.379 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:23:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:54.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:55 compute-1 sshd-session[281681]: Received disconnect from 175.126.165.170 port 46382:11: Bye Bye [preauth]
Sep 30 18:23:55 compute-1 sshd-session[281681]: Disconnected from authenticating user root 175.126.165.170 port 46382 [preauth]
Sep 30 18:23:55 compute-1 sshd-session[281712]: Invalid user seekcy from 216.10.242.161 port 40220
Sep 30 18:23:55 compute-1 sshd-session[281712]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:23:55 compute-1 sshd-session[281712]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:23:55 compute-1 ceph-mon[75484]: pgmap v1358: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:23:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:23:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:56.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:23:56 compute-1 nova_compute[238822]: 2025-09-30 18:23:56.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:56.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:57 compute-1 ceph-mon[75484]: pgmap v1359: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:23:57 compute-1 sshd-session[281712]: Failed password for invalid user seekcy from 216.10.242.161 port 40220 ssh2
Sep 30 18:23:57 compute-1 nova_compute[238822]: 2025-09-30 18:23:57.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:23:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:23:58.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2428006301' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:23:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2428006301' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:23:58 compute-1 ceph-mon[75484]: pgmap v1360: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:23:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:23:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:23:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:23:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:23:58.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:23:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:59 compute-1 ovn_controller[135204]: 2025-09-30T18:23:59Z|00135|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Sep 30 18:23:59 compute-1 sshd-session[281712]: Received disconnect from 216.10.242.161 port 40220:11: Bye Bye [preauth]
Sep 30 18:23:59 compute-1 sshd-session[281712]: Disconnected from invalid user seekcy 216.10.242.161 port 40220 [preauth]
Sep 30 18:23:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:23:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:23:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:23:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:00.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:00 compute-1 ceph-mon[75484]: pgmap v1361: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:00.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:01 compute-1 nova_compute[238822]: 2025-09-30 18:24:01.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:02.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:02 compute-1 ceph-mon[75484]: pgmap v1362: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:02 compute-1 podman[281726]: 2025-09-30 18:24:02.570492659 +0000 UTC m=+0.095416128 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:24:02 compute-1 podman[281725]: 2025-09-30 18:24:02.653261064 +0000 UTC m=+0.186984111 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:24:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:02.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:02 compute-1 nova_compute[238822]: 2025-09-30 18:24:02.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:03 compute-1 unix_chkpwd[281775]: password check failed for user (root)
Sep 30 18:24:03 compute-1 sshd-session[281723]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110  user=root
Sep 30 18:24:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:24:03 compute-1 sudo[281776]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:24:03 compute-1 sudo[281776]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:24:03 compute-1 sudo[281776]: pam_unix(sudo:session): session closed for user root
Sep 30 18:24:03 compute-1 sudo[281801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:24:03 compute-1 sudo[281801]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:24:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:04.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:04 compute-1 sudo[281801]: pam_unix(sudo:session): session closed for user root
Sep 30 18:24:04 compute-1 ceph-mon[75484]: pgmap v1363: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:24:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:04.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:05 compute-1 sshd-session[281723]: Failed password for root from 14.225.167.110 port 58070 ssh2
Sep 30 18:24:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:05.315 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:41:2a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08fc2cbd16474855b7ae474fa9859f76', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5b6cbf18-1826-41d0-920f-e9db4f1a1832) old=Port_Binding(mac=['fa:16:3e:35:41:2a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08fc2cbd16474855b7ae474fa9859f76', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:24:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:05.317 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5b6cbf18-1826-41d0-920f-e9db4f1a1832 in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 updated
Sep 30 18:24:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:05.317 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6901f664-336b-42d2-bbf7-58951befc8d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:24:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:05.318 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3d6284df-f413-46b7-87b1-0bb470370b1f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:24:05 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:24:05 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:24:05 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:24:05 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:24:05 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:24:05 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:24:05 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:24:05 compute-1 podman[249638]: time="2025-09-30T18:24:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:24:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:24:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:24:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:24:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8357 "" "Go-http-client/1.1"
Sep 30 18:24:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.852533) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256645852577, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 512, "num_deletes": 255, "total_data_size": 674102, "memory_usage": 685320, "flush_reason": "Manual Compaction"}
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256645858451, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 442828, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38873, "largest_seqno": 39380, "table_properties": {"data_size": 440144, "index_size": 720, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6451, "raw_average_key_size": 18, "raw_value_size": 434658, "raw_average_value_size": 1241, "num_data_blocks": 33, "num_entries": 350, "num_filter_entries": 350, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759256620, "oldest_key_time": 1759256620, "file_creation_time": 1759256645, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 5957 microseconds, and 3377 cpu microseconds.
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.858489) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 442828 bytes OK
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.858506) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.860101) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.860114) EVENT_LOG_v1 {"time_micros": 1759256645860110, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.860129) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 671043, prev total WAL file size 671043, number of live WAL files 2.
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.860727) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303032' seq:72057594037927935, type:22 .. '6C6F676D0031323533' seq:0, type:0; will stop at (end)
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(432KB)], [72(11MB)]
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256645860832, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 12725441, "oldest_snapshot_seqno": -1}
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6207 keys, 12615942 bytes, temperature: kUnknown
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256645914057, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 12615942, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12575382, "index_size": 23938, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15557, "raw_key_size": 161322, "raw_average_key_size": 25, "raw_value_size": 12464499, "raw_average_value_size": 2008, "num_data_blocks": 958, "num_entries": 6207, "num_filter_entries": 6207, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759256645, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.914317) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 12615942 bytes
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.915633) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 238.8 rd, 236.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 11.7 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(57.2) write-amplify(28.5) OK, records in: 6728, records dropped: 521 output_compression: NoCompression
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.915655) EVENT_LOG_v1 {"time_micros": 1759256645915644, "job": 44, "event": "compaction_finished", "compaction_time_micros": 53290, "compaction_time_cpu_micros": 29481, "output_level": 6, "num_output_files": 1, "total_output_size": 12615942, "num_input_records": 6728, "num_output_records": 6207, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256645915895, "job": 44, "event": "table_file_deletion", "file_number": 74}
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256645919127, "job": 44, "event": "table_file_deletion", "file_number": 72}
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.860452) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.919181) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.919185) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.919187) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.919189) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:24:05 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:24:05.919191) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:24:06 compute-1 unix_chkpwd[281860]: password check failed for user (root)
Sep 30 18:24:06 compute-1 sshd-session[281774]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:24:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:06.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:06 compute-1 nova_compute[238822]: 2025-09-30 18:24:06.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:06 compute-1 sshd-session[281723]: Received disconnect from 14.225.167.110 port 58070:11: Bye Bye [preauth]
Sep 30 18:24:06 compute-1 sshd-session[281723]: Disconnected from authenticating user root 14.225.167.110 port 58070 [preauth]
Sep 30 18:24:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:06.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:06 compute-1 ceph-mon[75484]: pgmap v1364: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:07 compute-1 nova_compute[238822]: 2025-09-30 18:24:07.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:24:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:24:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:08.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:24:08 compute-1 sshd-session[281774]: Failed password for root from 192.210.160.141 port 56328 ssh2
Sep 30 18:24:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:24:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:08.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:08 compute-1 ceph-mon[75484]: pgmap v1365: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:09 compute-1 sshd-session[281774]: Connection closed by authenticating user root 192.210.160.141 port 56328 [preauth]
Sep 30 18:24:09 compute-1 podman[281865]: 2025-09-30 18:24:09.545704839 +0000 UTC m=+0.091712448 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 18:24:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:09 compute-1 sudo[281884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:24:09 compute-1 sudo[281884]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:24:09 compute-1 sudo[281884]: pam_unix(sudo:session): session closed for user root
Sep 30 18:24:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:10.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:24:10 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:24:10 compute-1 ceph-mon[75484]: pgmap v1366: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:10.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:11 compute-1 nova_compute[238822]: 2025-09-30 18:24:11.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:12.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:12 compute-1 ceph-mon[75484]: pgmap v1367: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:12.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:12 compute-1 nova_compute[238822]: 2025-09-30 18:24:12.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:13 compute-1 sudo[281913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:24:13 compute-1 sudo[281913]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:24:13 compute-1 sudo[281913]: pam_unix(sudo:session): session closed for user root
Sep 30 18:24:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:24:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:14.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:14 compute-1 ceph-mon[75484]: pgmap v1368: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:24:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:14.492 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:24:14 compute-1 nova_compute[238822]: 2025-09-30 18:24:14.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:14.493 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:24:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:14.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:16.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:16 compute-1 nova_compute[238822]: 2025-09-30 18:24:16.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:16.483 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:36:fd 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a03d08ec-972f-45ad-9eed-86a07dbccb55', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a03d08ec-972f-45ad-9eed-86a07dbccb55', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d8f0f2ed-64e2-4f02-8e96-5263bb1056ff, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c1a76e60-8e45-44a5-9199-b4a5de182dea) old=Port_Binding(mac=['fa:16:3e:e4:36:fd'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a03d08ec-972f-45ad-9eed-86a07dbccb55', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a03d08ec-972f-45ad-9eed-86a07dbccb55', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:24:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:16.484 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c1a76e60-8e45-44a5-9199-b4a5de182dea in datapath a03d08ec-972f-45ad-9eed-86a07dbccb55 updated
Sep 30 18:24:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:16.485 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a03d08ec-972f-45ad-9eed-86a07dbccb55, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:24:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:16.487 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4f0c5dd9-3df9-4e85-9e7d-b30245118080]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:24:16 compute-1 ceph-mon[75484]: pgmap v1369: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:16 compute-1 podman[281942]: 2025-09-30 18:24:16.548669237 +0000 UTC m=+0.091434070 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=watcher_latest)
Sep 30 18:24:16 compute-1 podman[281944]: 2025-09-30 18:24:16.559324305 +0000 UTC m=+0.090392733 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd)
Sep 30 18:24:16 compute-1 podman[281943]: 2025-09-30 18:24:16.573512828 +0000 UTC m=+0.103127776 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:24:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:16.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:17 compute-1 nova_compute[238822]: 2025-09-30 18:24:17.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:18.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:18 compute-1 ceph-mon[75484]: pgmap v1370: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:18.495 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:24:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:24:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:18.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:19 compute-1 openstack_network_exporter[251957]: ERROR   18:24:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:24:19 compute-1 openstack_network_exporter[251957]: ERROR   18:24:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:24:19 compute-1 openstack_network_exporter[251957]: ERROR   18:24:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:24:19 compute-1 openstack_network_exporter[251957]: ERROR   18:24:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:24:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:24:19 compute-1 openstack_network_exporter[251957]: ERROR   18:24:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:24:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:24:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:20.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:20 compute-1 ceph-mon[75484]: pgmap v1371: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:20.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:21 compute-1 nova_compute[238822]: 2025-09-30 18:24:21.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:22.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:22 compute-1 sshd-session[282005]: Invalid user superadmin from 103.153.190.105 port 37537
Sep 30 18:24:22 compute-1 sshd-session[282005]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:24:22 compute-1 sshd-session[282005]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:24:22 compute-1 sshd-session[282008]: Invalid user debian from 84.51.43.58 port 63849
Sep 30 18:24:22 compute-1 sshd-session[282008]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:24:22 compute-1 sshd-session[282008]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:24:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:24:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:22.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:22 compute-1 nova_compute[238822]: 2025-09-30 18:24:22.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:23 compute-1 ceph-mon[75484]: pgmap v1372: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:24:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:24.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:24 compute-1 sshd-session[282005]: Failed password for invalid user superadmin from 103.153.190.105 port 37537 ssh2
Sep 30 18:24:24 compute-1 sshd-session[282008]: Failed password for invalid user debian from 84.51.43.58 port 63849 ssh2
Sep 30 18:24:24 compute-1 sshd-session[282008]: Received disconnect from 84.51.43.58 port 63849:11: Bye Bye [preauth]
Sep 30 18:24:24 compute-1 sshd-session[282008]: Disconnected from invalid user debian 84.51.43.58 port 63849 [preauth]
Sep 30 18:24:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:24.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:25 compute-1 ceph-mon[75484]: pgmap v1373: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:24:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:26.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:26 compute-1 sshd-session[282005]: Received disconnect from 103.153.190.105 port 37537:11: Bye Bye [preauth]
Sep 30 18:24:26 compute-1 sshd-session[282005]: Disconnected from invalid user superadmin 103.153.190.105 port 37537 [preauth]
Sep 30 18:24:26 compute-1 nova_compute[238822]: 2025-09-30 18:24:26.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:26.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:27 compute-1 ceph-mon[75484]: pgmap v1374: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:27 compute-1 nova_compute[238822]: 2025-09-30 18:24:27.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:28.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:28 compute-1 ceph-mon[75484]: pgmap v1375: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:24:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:28.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:30 compute-1 nova_compute[238822]: 2025-09-30 18:24:30.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:24:30 compute-1 nova_compute[238822]: 2025-09-30 18:24:30.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:24:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:30.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:30 compute-1 ceph-mon[75484]: pgmap v1376: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:24:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:30.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:24:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:31 compute-1 nova_compute[238822]: 2025-09-30 18:24:31.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:31 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2520679321' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:24:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:32.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:32 compute-1 unix_chkpwd[282023]: password check failed for user (root)
Sep 30 18:24:32 compute-1 sshd-session[282019]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:24:32 compute-1 ceph-mon[75484]: pgmap v1377: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:32.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:32 compute-1 nova_compute[238822]: 2025-09-30 18:24:32.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:33 compute-1 podman[282026]: 2025-09-30 18:24:33.563392149 +0000 UTC m=+0.097314039 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:24:33 compute-1 podman[282025]: 2025-09-30 18:24:33.575972028 +0000 UTC m=+0.121816540 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller)
Sep 30 18:24:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:24:33 compute-1 sudo[282075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:24:33 compute-1 sudo[282075]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:24:33 compute-1 sudo[282075]: pam_unix(sudo:session): session closed for user root
Sep 30 18:24:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:34 compute-1 nova_compute[238822]: 2025-09-30 18:24:34.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:24:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:34.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:34 compute-1 sshd-session[282019]: Failed password for root from 192.210.160.141 port 46444 ssh2
Sep 30 18:24:34 compute-1 ceph-mon[75484]: pgmap v1378: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:24:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:34.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:35 compute-1 nova_compute[238822]: 2025-09-30 18:24:35.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:24:35 compute-1 nova_compute[238822]: 2025-09-30 18:24:35.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:24:35 compute-1 sshd-session[282019]: Connection closed by authenticating user root 192.210.160.141 port 46444 [preauth]
Sep 30 18:24:35 compute-1 nova_compute[238822]: 2025-09-30 18:24:35.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:24:35 compute-1 nova_compute[238822]: 2025-09-30 18:24:35.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:24:35 compute-1 nova_compute[238822]: 2025-09-30 18:24:35.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:24:35 compute-1 nova_compute[238822]: 2025-09-30 18:24:35.574 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:24:35 compute-1 nova_compute[238822]: 2025-09-30 18:24:35.574 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:24:35 compute-1 podman[249638]: time="2025-09-30T18:24:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:24:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:24:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:24:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:24:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8356 "" "Go-http-client/1.1"
Sep 30 18:24:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:24:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2280115228' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:24:36 compute-1 nova_compute[238822]: 2025-09-30 18:24:36.040 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:24:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2280115228' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:24:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:36.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:36 compute-1 nova_compute[238822]: 2025-09-30 18:24:36.370 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:24:36 compute-1 nova_compute[238822]: 2025-09-30 18:24:36.372 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:24:36 compute-1 nova_compute[238822]: 2025-09-30 18:24:36.411 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:24:36 compute-1 nova_compute[238822]: 2025-09-30 18:24:36.412 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4778MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:24:36 compute-1 nova_compute[238822]: 2025-09-30 18:24:36.412 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:24:36 compute-1 nova_compute[238822]: 2025-09-30 18:24:36.413 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:24:36 compute-1 nova_compute[238822]: 2025-09-30 18:24:36.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:36.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:37 compute-1 ceph-mon[75484]: pgmap v1379: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2209775993' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:24:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2209775993' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:24:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/878725520' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:24:37 compute-1 nova_compute[238822]: 2025-09-30 18:24:37.467 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:24:37 compute-1 nova_compute[238822]: 2025-09-30 18:24:37.468 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:24:36 up  4:01,  0 user,  load average: 0.20, 0.42, 0.80\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:24:37 compute-1 nova_compute[238822]: 2025-09-30 18:24:37.495 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:24:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:37 compute-1 nova_compute[238822]: 2025-09-30 18:24:37.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:37 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:24:37 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/510085707' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:24:37 compute-1 nova_compute[238822]: 2025-09-30 18:24:37.933 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:24:37 compute-1 nova_compute[238822]: 2025-09-30 18:24:37.941 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:24:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:24:38 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/510085707' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:24:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:38.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:38 compute-1 nova_compute[238822]: 2025-09-30 18:24:38.457 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:24:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:24:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:38.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:38 compute-1 nova_compute[238822]: 2025-09-30 18:24:38.966 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:24:38 compute-1 nova_compute[238822]: 2025-09-30 18:24:38.967 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.554s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:24:39 compute-1 ceph-mon[75484]: pgmap v1380: 353 pgs: 353 active+clean; 41 MiB data, 246 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:24:39 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2232404500' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:24:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:40 compute-1 nova_compute[238822]: 2025-09-30 18:24:40.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:24:40 compute-1 nova_compute[238822]: 2025-09-30 18:24:40.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:24:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:40.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1824539300' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:24:40 compute-1 podman[282152]: 2025-09-30 18:24:40.527775776 +0000 UTC m=+0.068013848 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:24:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:40.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:41 compute-1 ceph-mon[75484]: pgmap v1381: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:24:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/56867910' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:24:41 compute-1 nova_compute[238822]: 2025-09-30 18:24:41.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:24:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:42.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:24:42 compute-1 ceph-mon[75484]: pgmap v1382: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:24:42 compute-1 nova_compute[238822]: 2025-09-30 18:24:42.564 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:24:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:42.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:42 compute-1 nova_compute[238822]: 2025-09-30 18:24:42.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:43 compute-1 nova_compute[238822]: 2025-09-30 18:24:43.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:24:43 compute-1 nova_compute[238822]: 2025-09-30 18:24:43.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:24:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:24:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:24:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:44.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:24:44 compute-1 ceph-mon[75484]: pgmap v1383: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:24:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:44.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:46.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:46 compute-1 nova_compute[238822]: 2025-09-30 18:24:46.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:46 compute-1 ceph-mon[75484]: pgmap v1384: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:24:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:46.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:47 compute-1 podman[282178]: 2025-09-30 18:24:47.527062665 +0000 UTC m=+0.068048259 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 18:24:47 compute-1 podman[282179]: 2025-09-30 18:24:47.541182806 +0000 UTC m=+0.067651128 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:24:47 compute-1 podman[282185]: 2025-09-30 18:24:47.545666507 +0000 UTC m=+0.070130945 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20250930, container_name=multipathd, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 18:24:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:47 compute-1 nova_compute[238822]: 2025-09-30 18:24:47.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:48 compute-1 nova_compute[238822]: 2025-09-30 18:24:48.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:24:48 compute-1 nova_compute[238822]: 2025-09-30 18:24:48.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 18:24:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:48.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:48 compute-1 ceph-mon[75484]: pgmap v1385: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.1 MiB/s rd, 1.8 MiB/s wr, 69 op/s
Sep 30 18:24:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:24:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:48.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:49 compute-1 openstack_network_exporter[251957]: ERROR   18:24:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:24:49 compute-1 openstack_network_exporter[251957]: ERROR   18:24:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:24:49 compute-1 openstack_network_exporter[251957]: ERROR   18:24:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:24:49 compute-1 openstack_network_exporter[251957]: ERROR   18:24:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:24:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:24:49 compute-1 openstack_network_exporter[251957]: ERROR   18:24:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:24:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:24:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:50.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:50 compute-1 ceph-mon[75484]: pgmap v1386: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Sep 30 18:24:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:50.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:51 compute-1 nova_compute[238822]: 2025-09-30 18:24:51.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:51 compute-1 nova_compute[238822]: 2025-09-30 18:24:51.608 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:24:51 compute-1 nova_compute[238822]: 2025-09-30 18:24:51.609 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:24:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:52 compute-1 nova_compute[238822]: 2025-09-30 18:24:52.115 2 DEBUG nova.compute.manager [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:24:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:52.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:24:52 compute-1 nova_compute[238822]: 2025-09-30 18:24:52.569 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:24:52 compute-1 nova_compute[238822]: 2025-09-30 18:24:52.570 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 18:24:52 compute-1 nova_compute[238822]: 2025-09-30 18:24:52.684 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:24:52 compute-1 nova_compute[238822]: 2025-09-30 18:24:52.685 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:24:52 compute-1 nova_compute[238822]: 2025-09-30 18:24:52.696 2 DEBUG nova.virt.hardware [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:24:52 compute-1 nova_compute[238822]: 2025-09-30 18:24:52.696 2 INFO nova.compute.claims [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:24:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:24:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:52.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:24:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:52 compute-1 nova_compute[238822]: 2025-09-30 18:24:52.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:53 compute-1 nova_compute[238822]: 2025-09-30 18:24:53.076 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 18:24:53 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Sep 30 18:24:53 compute-1 ceph-mon[75484]: pgmap v1387: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:24:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:24:53 compute-1 nova_compute[238822]: 2025-09-30 18:24:53.760 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:24:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:53 compute-1 sudo[282245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:24:53 compute-1 sudo[282245]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:24:53 compute-1 sudo[282245]: pam_unix(sudo:session): session closed for user root
Sep 30 18:24:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:54.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:24:54 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4264755529' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:24:54 compute-1 nova_compute[238822]: 2025-09-30 18:24:54.314 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.554s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:24:54 compute-1 nova_compute[238822]: 2025-09-30 18:24:54.323 2 DEBUG nova.compute.provider_tree [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:24:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:54.381 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:24:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:54.381 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:24:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:24:54.381 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:24:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4264755529' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:24:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:54 compute-1 nova_compute[238822]: 2025-09-30 18:24:54.837 2 DEBUG nova.scheduler.client.report [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:24:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:54.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:55 compute-1 nova_compute[238822]: 2025-09-30 18:24:55.352 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.667s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:24:55 compute-1 nova_compute[238822]: 2025-09-30 18:24:55.353 2 DEBUG nova.compute.manager [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:24:55 compute-1 ceph-mon[75484]: pgmap v1388: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:24:55 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 18:24:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:55 compute-1 nova_compute[238822]: 2025-09-30 18:24:55.868 2 DEBUG nova.compute.manager [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:24:55 compute-1 nova_compute[238822]: 2025-09-30 18:24:55.869 2 DEBUG nova.network.neutron [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:24:55 compute-1 nova_compute[238822]: 2025-09-30 18:24:55.870 2 WARNING neutronclient.v2_0.client [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:24:55 compute-1 nova_compute[238822]: 2025-09-30 18:24:55.871 2 WARNING neutronclient.v2_0.client [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:24:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:56.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:56 compute-1 nova_compute[238822]: 2025-09-30 18:24:56.380 2 INFO nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:24:56 compute-1 nova_compute[238822]: 2025-09-30 18:24:56.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:56.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:56 compute-1 nova_compute[238822]: 2025-09-30 18:24:56.895 2 DEBUG nova.compute.manager [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:24:57 compute-1 ceph-mon[75484]: pgmap v1389: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:24:57 compute-1 sshd-session[282300]: Invalid user notes from 216.10.242.161 port 59620
Sep 30 18:24:57 compute-1 sshd-session[282300]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:24:57 compute-1 sshd-session[282300]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:24:57 compute-1 sshd-session[282295]: Invalid user debian from 192.210.160.141 port 38112
Sep 30 18:24:57 compute-1 sshd-session[282295]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:24:57 compute-1 sshd-session[282295]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:24:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:57 compute-1 nova_compute[238822]: 2025-09-30 18:24:57.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:24:57 compute-1 nova_compute[238822]: 2025-09-30 18:24:57.916 2 DEBUG nova.compute.manager [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:24:57 compute-1 nova_compute[238822]: 2025-09-30 18:24:57.918 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:24:57 compute-1 nova_compute[238822]: 2025-09-30 18:24:57.918 2 INFO nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Creating image(s)
Sep 30 18:24:57 compute-1 nova_compute[238822]: 2025-09-30 18:24:57.963 2 DEBUG nova.storage.rbd_utils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 05075776-ca3e-4416-bdd4-558a62d1cf69_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:24:57 compute-1 nova_compute[238822]: 2025-09-30 18:24:57.997 2 DEBUG nova.storage.rbd_utils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 05075776-ca3e-4416-bdd4-558a62d1cf69_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.030 2 DEBUG nova.storage.rbd_utils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 05075776-ca3e-4416-bdd4-558a62d1cf69_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.034 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.117 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.118 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.120 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.121 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:24:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:24:58.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.162 2 DEBUG nova.storage.rbd_utils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 05075776-ca3e-4416-bdd4-558a62d1cf69_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.168 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 05075776-ca3e-4416-bdd4-558a62d1cf69_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.346 2 DEBUG nova.network.neutron [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Successfully created port: 295c346d-8de9-4a50-883e-9a7e1ccdccc7 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:24:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/861102180' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:24:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/861102180' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.484 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 05075776-ca3e-4416-bdd4-558a62d1cf69_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.316s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.582 2 DEBUG nova.storage.rbd_utils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] resizing rbd image 05075776-ca3e-4416-bdd4-558a62d1cf69_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:24:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.721 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.721 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Ensure instance console log exists: /var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.722 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.722 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:24:58 compute-1 nova_compute[238822]: 2025-09-30 18:24:58.723 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:24:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:24:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:24:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:24:58.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:24:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:59 compute-1 sshd-session[282300]: Failed password for invalid user notes from 216.10.242.161 port 59620 ssh2
Sep 30 18:24:59 compute-1 sshd-session[282295]: Failed password for invalid user debian from 192.210.160.141 port 38112 ssh2
Sep 30 18:24:59 compute-1 ceph-mon[75484]: pgmap v1390: 353 pgs: 353 active+clean; 121 MiB data, 279 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 1.4 MiB/s wr, 116 op/s
Sep 30 18:24:59 compute-1 nova_compute[238822]: 2025-09-30 18:24:59.562 2 DEBUG nova.network.neutron [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Successfully updated port: 295c346d-8de9-4a50-883e-9a7e1ccdccc7 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:24:59 compute-1 sshd-session[282300]: Received disconnect from 216.10.242.161 port 59620:11: Bye Bye [preauth]
Sep 30 18:24:59 compute-1 sshd-session[282300]: Disconnected from invalid user notes 216.10.242.161 port 59620 [preauth]
Sep 30 18:24:59 compute-1 nova_compute[238822]: 2025-09-30 18:24:59.625 2 DEBUG nova.compute.manager [req-f0414c91-6231-4f60-93e3-1b2d64301179 req-f176a2f3-5e0d-4462-b9d6-7d620b4fc022 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-changed-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:24:59 compute-1 nova_compute[238822]: 2025-09-30 18:24:59.625 2 DEBUG nova.compute.manager [req-f0414c91-6231-4f60-93e3-1b2d64301179 req-f176a2f3-5e0d-4462-b9d6-7d620b4fc022 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Refreshing instance network info cache due to event network-changed-295c346d-8de9-4a50-883e-9a7e1ccdccc7. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:24:59 compute-1 nova_compute[238822]: 2025-09-30 18:24:59.626 2 DEBUG oslo_concurrency.lockutils [req-f0414c91-6231-4f60-93e3-1b2d64301179 req-f176a2f3-5e0d-4462-b9d6-7d620b4fc022 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-05075776-ca3e-4416-bdd4-558a62d1cf69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:24:59 compute-1 nova_compute[238822]: 2025-09-30 18:24:59.626 2 DEBUG oslo_concurrency.lockutils [req-f0414c91-6231-4f60-93e3-1b2d64301179 req-f176a2f3-5e0d-4462-b9d6-7d620b4fc022 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-05075776-ca3e-4416-bdd4-558a62d1cf69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:24:59 compute-1 nova_compute[238822]: 2025-09-30 18:24:59.626 2 DEBUG nova.network.neutron [req-f0414c91-6231-4f60-93e3-1b2d64301179 req-f176a2f3-5e0d-4462-b9d6-7d620b4fc022 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Refreshing network info cache for port 295c346d-8de9-4a50-883e-9a7e1ccdccc7 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:24:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:24:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:24:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:24:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:00 compute-1 nova_compute[238822]: 2025-09-30 18:25:00.069 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "refresh_cache-05075776-ca3e-4416-bdd4-558a62d1cf69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:25:00 compute-1 nova_compute[238822]: 2025-09-30 18:25:00.133 2 WARNING neutronclient.v2_0.client [req-f0414c91-6231-4f60-93e3-1b2d64301179 req-f176a2f3-5e0d-4462-b9d6-7d620b4fc022 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:25:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:00.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:00 compute-1 sshd-session[282295]: Connection closed by invalid user debian 192.210.160.141 port 38112 [preauth]
Sep 30 18:25:00 compute-1 nova_compute[238822]: 2025-09-30 18:25:00.321 2 DEBUG nova.network.neutron [req-f0414c91-6231-4f60-93e3-1b2d64301179 req-f176a2f3-5e0d-4462-b9d6-7d620b4fc022 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:25:00 compute-1 ceph-mon[75484]: pgmap v1391: 353 pgs: 353 active+clean; 157 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 1.1 MiB/s rd, 3.5 MiB/s wr, 105 op/s
Sep 30 18:25:00 compute-1 nova_compute[238822]: 2025-09-30 18:25:00.516 2 DEBUG nova.network.neutron [req-f0414c91-6231-4f60-93e3-1b2d64301179 req-f176a2f3-5e0d-4462-b9d6-7d620b4fc022 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:25:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:25:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:00.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:25:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:01 compute-1 nova_compute[238822]: 2025-09-30 18:25:01.023 2 DEBUG oslo_concurrency.lockutils [req-f0414c91-6231-4f60-93e3-1b2d64301179 req-f176a2f3-5e0d-4462-b9d6-7d620b4fc022 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-05075776-ca3e-4416-bdd4-558a62d1cf69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:25:01 compute-1 nova_compute[238822]: 2025-09-30 18:25:01.024 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquired lock "refresh_cache-05075776-ca3e-4416-bdd4-558a62d1cf69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:25:01 compute-1 nova_compute[238822]: 2025-09-30 18:25:01.024 2 DEBUG nova.network.neutron [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:25:01 compute-1 nova_compute[238822]: 2025-09-30 18:25:01.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:01 compute-1 unix_chkpwd[282476]: password check failed for user (root)
Sep 30 18:25:01 compute-1 sshd-session[282473]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:25:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:02.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:02 compute-1 nova_compute[238822]: 2025-09-30 18:25:02.312 2 DEBUG nova.network.neutron [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:25:02 compute-1 ceph-mon[75484]: pgmap v1392: 353 pgs: 353 active+clean; 157 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 294 KiB/s rd, 3.5 MiB/s wr, 74 op/s
Sep 30 18:25:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:02.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:02 compute-1 nova_compute[238822]: 2025-09-30 18:25:02.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:03 compute-1 sshd-session[282473]: Failed password for root from 175.126.165.170 port 58082 ssh2
Sep 30 18:25:03 compute-1 nova_compute[238822]: 2025-09-30 18:25:03.233 2 WARNING neutronclient.v2_0.client [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:25:03 compute-1 nova_compute[238822]: 2025-09-30 18:25:03.462 2 DEBUG nova.network.neutron [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Updating instance_info_cache with network_info: [{"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:25:03 compute-1 sshd-session[282473]: Received disconnect from 175.126.165.170 port 58082:11: Bye Bye [preauth]
Sep 30 18:25:03 compute-1 sshd-session[282473]: Disconnected from authenticating user root 175.126.165.170 port 58082 [preauth]
Sep 30 18:25:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:25:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:03 compute-1 nova_compute[238822]: 2025-09-30 18:25:03.971 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Releasing lock "refresh_cache-05075776-ca3e-4416-bdd4-558a62d1cf69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:25:03 compute-1 nova_compute[238822]: 2025-09-30 18:25:03.972 2 DEBUG nova.compute.manager [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Instance network_info: |[{"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:25:03 compute-1 nova_compute[238822]: 2025-09-30 18:25:03.978 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Start _get_guest_xml network_info=[{"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:25:03 compute-1 nova_compute[238822]: 2025-09-30 18:25:03.985 2 WARNING nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:25:03 compute-1 nova_compute[238822]: 2025-09-30 18:25:03.988 2 DEBUG nova.virt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-763414646', uuid='05075776-ca3e-4416-bdd4-558a62d1cf69'), owner=OwnerMeta(userid='623ef4a55c9e4fc28bb65e49246b5008', username='tempest-TestExecuteStrategies-1883747907-project-admin', projectid='c634e1c17ed54907969576a0eb8eff50', projectname='tempest-TestExecuteStrategies-1883747907'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759256703.988254) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:25:03 compute-1 nova_compute[238822]: 2025-09-30 18:25:03.993 2 DEBUG nova.virt.libvirt.host [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:25:03 compute-1 nova_compute[238822]: 2025-09-30 18:25:03.994 2 DEBUG nova.virt.libvirt.host [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:25:03 compute-1 nova_compute[238822]: 2025-09-30 18:25:03.998 2 DEBUG nova.virt.libvirt.host [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.000 2 DEBUG nova.virt.libvirt.host [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.000 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.001 2 DEBUG nova.virt.hardware [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.002 2 DEBUG nova.virt.hardware [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.002 2 DEBUG nova.virt.hardware [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.002 2 DEBUG nova.virt.hardware [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.003 2 DEBUG nova.virt.hardware [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.003 2 DEBUG nova.virt.hardware [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.003 2 DEBUG nova.virt.hardware [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.004 2 DEBUG nova.virt.hardware [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.004 2 DEBUG nova.virt.hardware [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.004 2 DEBUG nova.virt.hardware [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.005 2 DEBUG nova.virt.hardware [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.010 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:25:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:04.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:25:04 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3160135630' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:25:04 compute-1 ceph-mon[75484]: pgmap v1393: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 304 KiB/s rd, 3.9 MiB/s wr, 88 op/s
Sep 30 18:25:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3160135630' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.493 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.543 2 DEBUG nova.storage.rbd_utils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 05075776-ca3e-4416-bdd4-558a62d1cf69_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:25:04 compute-1 nova_compute[238822]: 2025-09-30 18:25:04.556 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:25:04 compute-1 podman[282501]: 2025-09-30 18:25:04.578608322 +0000 UTC m=+0.118141662 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:25:04 compute-1 podman[282500]: 2025-09-30 18:25:04.594169312 +0000 UTC m=+0.133650500 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250930)
Sep 30 18:25:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:04.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:25:04 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/187030609' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.000 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.003 2 DEBUG nova.virt.libvirt.vif [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:24:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-763414646',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-763414646',id=17,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-u3g6u6z0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:24:56Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=05075776-ca3e-4416-bdd4-558a62d1cf69,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.004 2 DEBUG nova.network.os_vif_util [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.007 2 DEBUG nova.network.os_vif_util [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:82:fd,bridge_name='br-int',has_traffic_filtering=True,id=295c346d-8de9-4a50-883e-9a7e1ccdccc7,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap295c346d-8d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.008 2 DEBUG nova.objects.instance [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lazy-loading 'pci_devices' on Instance uuid 05075776-ca3e-4416-bdd4-558a62d1cf69 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:25:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/187030609' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.519 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:25:05 compute-1 nova_compute[238822]:   <uuid>05075776-ca3e-4416-bdd4-558a62d1cf69</uuid>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   <name>instance-00000011</name>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteStrategies-server-763414646</nova:name>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:25:03</nova:creationTime>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:25:05 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:25:05 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:user uuid="623ef4a55c9e4fc28bb65e49246b5008">tempest-TestExecuteStrategies-1883747907-project-admin</nova:user>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:project uuid="c634e1c17ed54907969576a0eb8eff50">tempest-TestExecuteStrategies-1883747907</nova:project>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <nova:port uuid="295c346d-8de9-4a50-883e-9a7e1ccdccc7">
Sep 30 18:25:05 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <system>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <entry name="serial">05075776-ca3e-4416-bdd4-558a62d1cf69</entry>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <entry name="uuid">05075776-ca3e-4416-bdd4-558a62d1cf69</entry>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     </system>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   <os>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   </os>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   <features>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   </features>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/05075776-ca3e-4416-bdd4-558a62d1cf69_disk">
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       </source>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/05075776-ca3e-4416-bdd4-558a62d1cf69_disk.config">
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       </source>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:25:05 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:ec:82:fd"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <target dev="tap295c346d-8d"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/console.log" append="off"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <video>
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     </video>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:25:05 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:25:05 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:25:05 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:25:05 compute-1 nova_compute[238822]: </domain>
Sep 30 18:25:05 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.522 2 DEBUG nova.compute.manager [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Preparing to wait for external event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.523 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.523 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.524 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.525 2 DEBUG nova.virt.libvirt.vif [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:24:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-763414646',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-763414646',id=17,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-u3g6u6z0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:24:56Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=05075776-ca3e-4416-bdd4-558a62d1cf69,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.526 2 DEBUG nova.network.os_vif_util [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.527 2 DEBUG nova.network.os_vif_util [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:82:fd,bridge_name='br-int',has_traffic_filtering=True,id=295c346d-8de9-4a50-883e-9a7e1ccdccc7,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap295c346d-8d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.528 2 DEBUG os_vif [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:82:fd,bridge_name='br-int',has_traffic_filtering=True,id=295c346d-8de9-4a50-883e-9a7e1ccdccc7,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap295c346d-8d') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.530 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.530 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.532 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ddddbef0-2ef1-507f-a9ee-3f4ffec99c70', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.546 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap295c346d-8d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap295c346d-8d, col_values=(('qos', UUID('b4208cf0-74bf-4a81-a443-aad29da88d64')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap295c346d-8d, col_values=(('external_ids', {'iface-id': '295c346d-8de9-4a50-883e-9a7e1ccdccc7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ec:82:fd', 'vm-uuid': '05075776-ca3e-4416-bdd4-558a62d1cf69'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:05 compute-1 NetworkManager[45549]: <info>  [1759256705.5510] manager: (tap295c346d-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:05 compute-1 nova_compute[238822]: 2025-09-30 18:25:05.560 2 INFO os_vif [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:82:fd,bridge_name='br-int',has_traffic_filtering=True,id=295c346d-8de9-4a50-883e-9a7e1ccdccc7,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap295c346d-8d')
Sep 30 18:25:05 compute-1 podman[249638]: time="2025-09-30T18:25:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:25:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:25:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:25:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:25:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8355 "" "Go-http-client/1.1"
Sep 30 18:25:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:06.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:06 compute-1 ceph-mon[75484]: pgmap v1394: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 304 KiB/s rd, 3.9 MiB/s wr, 88 op/s
Sep 30 18:25:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:06.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:07 compute-1 nova_compute[238822]: 2025-09-30 18:25:07.106 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:25:07 compute-1 nova_compute[238822]: 2025-09-30 18:25:07.106 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:25:07 compute-1 nova_compute[238822]: 2025-09-30 18:25:07.107 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] No VIF found with MAC fa:16:3e:ec:82:fd, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:25:07 compute-1 nova_compute[238822]: 2025-09-30 18:25:07.107 2 INFO nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Using config drive
Sep 30 18:25:07 compute-1 nova_compute[238822]: 2025-09-30 18:25:07.151 2 DEBUG nova.storage.rbd_utils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 05075776-ca3e-4416-bdd4-558a62d1cf69_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:25:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:25:07 compute-1 nova_compute[238822]: 2025-09-30 18:25:07.673 2 WARNING neutronclient.v2_0.client [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:25:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:07 compute-1 nova_compute[238822]: 2025-09-30 18:25:07.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:08.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:08 compute-1 nova_compute[238822]: 2025-09-30 18:25:08.430 2 INFO nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Creating config drive at /var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/disk.config
Sep 30 18:25:08 compute-1 nova_compute[238822]: 2025-09-30 18:25:08.441 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpohp2uruf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:25:08 compute-1 ceph-mon[75484]: pgmap v1395: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 304 KiB/s rd, 3.9 MiB/s wr, 88 op/s
Sep 30 18:25:08 compute-1 nova_compute[238822]: 2025-09-30 18:25:08.576 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpohp2uruf" returned: 0 in 0.135s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:25:08 compute-1 nova_compute[238822]: 2025-09-30 18:25:08.629 2 DEBUG nova.storage.rbd_utils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 05075776-ca3e-4416-bdd4-558a62d1cf69_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:25:08 compute-1 nova_compute[238822]: 2025-09-30 18:25:08.636 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/disk.config 05075776-ca3e-4416-bdd4-558a62d1cf69_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:25:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:25:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:08 compute-1 nova_compute[238822]: 2025-09-30 18:25:08.843 2 DEBUG oslo_concurrency.processutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/disk.config 05075776-ca3e-4416-bdd4-558a62d1cf69_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.207s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:25:08 compute-1 nova_compute[238822]: 2025-09-30 18:25:08.845 2 INFO nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Deleting local config drive /var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/disk.config because it was imported into RBD.
Sep 30 18:25:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:08.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:08 compute-1 systemd[1]: Starting libvirt secret daemon...
Sep 30 18:25:08 compute-1 systemd[1]: Started libvirt secret daemon.
Sep 30 18:25:09 compute-1 NetworkManager[45549]: <info>  [1759256709.0047] manager: (tap295c346d-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Sep 30 18:25:09 compute-1 kernel: tap295c346d-8d: entered promiscuous mode
Sep 30 18:25:09 compute-1 nova_compute[238822]: 2025-09-30 18:25:09.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:09 compute-1 ovn_controller[135204]: 2025-09-30T18:25:09Z|00136|binding|INFO|Claiming lport 295c346d-8de9-4a50-883e-9a7e1ccdccc7 for this chassis.
Sep 30 18:25:09 compute-1 ovn_controller[135204]: 2025-09-30T18:25:09Z|00137|binding|INFO|295c346d-8de9-4a50-883e-9a7e1ccdccc7: Claiming fa:16:3e:ec:82:fd 10.100.0.9
Sep 30 18:25:09 compute-1 nova_compute[238822]: 2025-09-30 18:25:09.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:09 compute-1 nova_compute[238822]: 2025-09-30 18:25:09.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.040 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:82:fd 10.100.0.9'], port_security=['fa:16:3e:ec:82:fd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '05075776-ca3e-4416-bdd4-558a62d1cf69', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '4', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=295c346d-8de9-4a50-883e-9a7e1ccdccc7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.041 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 295c346d-8de9-4a50-883e-9a7e1ccdccc7 in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 bound to our chassis
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.043 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:25:09 compute-1 systemd-udevd[282686]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.060 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d635ef7b-1b0b-46b2-b83c-96287fc0f699]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.061 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6901f664-31 in ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:25:09 compute-1 NetworkManager[45549]: <info>  [1759256709.0657] device (tap295c346d-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:25:09 compute-1 NetworkManager[45549]: <info>  [1759256709.0681] device (tap295c346d-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.063 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6901f664-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.069 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[9ddcf2c1-d9e6-4748-919a-0ee595eaa66d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.070 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[fb938d42-9019-4b6c-9840-e7e93ac5ce48]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 systemd-machined[195911]: New machine qemu-12-instance-00000011.
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.088 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[ff66b4a9-8a77-4f88-92be-25ea62f9aacc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 systemd[1]: Started Virtual Machine qemu-12-instance-00000011.
Sep 30 18:25:09 compute-1 nova_compute[238822]: 2025-09-30 18:25:09.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.112 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[10462a9c-db76-4180-9793-bcf7504b6b62]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 ovn_controller[135204]: 2025-09-30T18:25:09Z|00138|binding|INFO|Setting lport 295c346d-8de9-4a50-883e-9a7e1ccdccc7 ovn-installed in OVS
Sep 30 18:25:09 compute-1 ovn_controller[135204]: 2025-09-30T18:25:09Z|00139|binding|INFO|Setting lport 295c346d-8de9-4a50-883e-9a7e1ccdccc7 up in Southbound
Sep 30 18:25:09 compute-1 nova_compute[238822]: 2025-09-30 18:25:09.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.153 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[aadb59d4-488f-4569-8cb1-fa5103131826]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.160 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[bafa7e5b-c9db-4ad6-9fdc-f2adeb6eb517]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 systemd-udevd[282692]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:25:09 compute-1 NetworkManager[45549]: <info>  [1759256709.1630] manager: (tap6901f664-30): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.209 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[1ab1271d-2937-4d0e-8a6c-f52af8c7b758]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.212 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[11987743-a4d2-4a55-9325-7eb26a67442c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 NetworkManager[45549]: <info>  [1759256709.2447] device (tap6901f664-30): carrier: link connected
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.256 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[ee8a9133-2929-4a91-9f62-1d45b46093b3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.281 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[dd2ba80d-5927-4503-ae0d-4830c8d4d017]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1455082, 'reachable_time': 20483, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282722, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.304 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ff2b8a8e-f5ba-4532-97fe-44f741f48287]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:412a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1455082, 'tstamp': 1455082}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 282723, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.326 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[fa2e2995-0c13-4580-9ab0-f980f899b0b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1455082, 'reachable_time': 20483, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 282724, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.371 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[42c6c1f1-3093-4130-9596-abd03f6793f7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 nova_compute[238822]: 2025-09-30 18:25:09.423 2 DEBUG nova.compute.manager [req-46287066-ddf0-49ac-b22a-5d402f77822d req-c6714c3d-5eb8-4951-a4ca-161222586c61 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:25:09 compute-1 nova_compute[238822]: 2025-09-30 18:25:09.424 2 DEBUG oslo_concurrency.lockutils [req-46287066-ddf0-49ac-b22a-5d402f77822d req-c6714c3d-5eb8-4951-a4ca-161222586c61 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:25:09 compute-1 nova_compute[238822]: 2025-09-30 18:25:09.424 2 DEBUG oslo_concurrency.lockutils [req-46287066-ddf0-49ac-b22a-5d402f77822d req-c6714c3d-5eb8-4951-a4ca-161222586c61 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:25:09 compute-1 nova_compute[238822]: 2025-09-30 18:25:09.424 2 DEBUG oslo_concurrency.lockutils [req-46287066-ddf0-49ac-b22a-5d402f77822d req-c6714c3d-5eb8-4951-a4ca-161222586c61 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:25:09 compute-1 nova_compute[238822]: 2025-09-30 18:25:09.425 2 DEBUG nova.compute.manager [req-46287066-ddf0-49ac-b22a-5d402f77822d req-c6714c3d-5eb8-4951-a4ca-161222586c61 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Processing event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:25:09 compute-1 sshd-session[282471]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:25:09 compute-1 sshd-session[282471]: banner exchange: Connection from 110.42.70.108 port 35734: Connection timed out
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.469 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[788e7095-63e0-486c-bb88-6bef679fc425]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.471 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.472 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.472 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:25:09 compute-1 nova_compute[238822]: 2025-09-30 18:25:09.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:09 compute-1 NetworkManager[45549]: <info>  [1759256709.4752] manager: (tap6901f664-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Sep 30 18:25:09 compute-1 kernel: tap6901f664-30: entered promiscuous mode
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.479 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:25:09 compute-1 ovn_controller[135204]: 2025-09-30T18:25:09Z|00140|binding|INFO|Releasing lport 5b6cbf18-1826-41d0-920f-e9db4f1a1832 from this chassis (sb_readonly=0)
Sep 30 18:25:09 compute-1 nova_compute[238822]: 2025-09-30 18:25:09.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:09 compute-1 nova_compute[238822]: 2025-09-30 18:25:09.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.511 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7a420d96-74c9-4bb7-a41f-c34f43588874]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.513 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.513 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.513 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 6901f664-336b-42d2-bbf7-58951befc8d1 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.513 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.514 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4ca2e0a5-2d26-49c7-a016-4bdded357f84]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.515 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.515 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[79fdd46a-5ec5-44be-8073-4f610a467fa2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.516 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:25:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:09.517 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'env', 'PROCESS_TAG=haproxy-6901f664-336b-42d2-bbf7-58951befc8d1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6901f664-336b-42d2-bbf7-58951befc8d1.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:25:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:09 compute-1 podman[282798]: 2025-09-30 18:25:09.978351801 +0000 UTC m=+0.063689541 container create aa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Sep 30 18:25:10 compute-1 podman[282798]: 2025-09-30 18:25:09.94166151 +0000 UTC m=+0.026999240 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:25:10 compute-1 systemd[1]: Started libpod-conmon-aa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560.scope.
Sep 30 18:25:10 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:25:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2885721e14c4034d090c6f96a472a689002bca060a056b1a3c984b602defe5d5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:25:10 compute-1 podman[282798]: 2025-09-30 18:25:10.09902774 +0000 UTC m=+0.184365460 container init aa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 18:25:10 compute-1 podman[282798]: 2025-09-30 18:25:10.106184553 +0000 UTC m=+0.191522263 container start aa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Sep 30 18:25:10 compute-1 sudo[282813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:25:10 compute-1 nova_compute[238822]: 2025-09-30 18:25:10.135 2 DEBUG nova.compute.manager [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:25:10 compute-1 sudo[282813]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:25:10 compute-1 nova_compute[238822]: 2025-09-30 18:25:10.141 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:25:10 compute-1 sudo[282813]: pam_unix(sudo:session): session closed for user root
Sep 30 18:25:10 compute-1 nova_compute[238822]: 2025-09-30 18:25:10.145 2 INFO nova.virt.libvirt.driver [-] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Instance spawned successfully.
Sep 30 18:25:10 compute-1 nova_compute[238822]: 2025-09-30 18:25:10.146 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:25:10 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[282818]: [NOTICE]   (282841) : New worker (282847) forked
Sep 30 18:25:10 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[282818]: [NOTICE]   (282841) : Loading success.
Sep 30 18:25:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:10.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:10 compute-1 sudo[282856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:25:10 compute-1 sudo[282856]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:25:10 compute-1 ceph-mon[75484]: pgmap v1396: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 144 KiB/s rd, 2.5 MiB/s wr, 45 op/s
Sep 30 18:25:10 compute-1 nova_compute[238822]: 2025-09-30 18:25:10.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:10 compute-1 nova_compute[238822]: 2025-09-30 18:25:10.665 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:25:10 compute-1 nova_compute[238822]: 2025-09-30 18:25:10.666 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:25:10 compute-1 nova_compute[238822]: 2025-09-30 18:25:10.667 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:25:10 compute-1 nova_compute[238822]: 2025-09-30 18:25:10.668 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:25:10 compute-1 nova_compute[238822]: 2025-09-30 18:25:10.669 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:25:10 compute-1 nova_compute[238822]: 2025-09-30 18:25:10.670 2 DEBUG nova.virt.libvirt.driver [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:25:10 compute-1 sshd-session[282842]: Invalid user jenkins from 8.243.64.201 port 35356
Sep 30 18:25:10 compute-1 sshd-session[282842]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:25:10 compute-1 sshd-session[282842]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:25:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:10 compute-1 podman[282899]: 2025-09-30 18:25:10.850679621 +0000 UTC m=+0.105979793 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:25:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:10.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:10 compute-1 sudo[282856]: pam_unix(sudo:session): session closed for user root
Sep 30 18:25:11 compute-1 nova_compute[238822]: 2025-09-30 18:25:11.184 2 INFO nova.compute.manager [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Took 13.27 seconds to spawn the instance on the hypervisor.
Sep 30 18:25:11 compute-1 nova_compute[238822]: 2025-09-30 18:25:11.185 2 DEBUG nova.compute.manager [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:25:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:25:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:25:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:25:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:25:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:25:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:25:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:25:11 compute-1 nova_compute[238822]: 2025-09-30 18:25:11.519 2 DEBUG nova.compute.manager [req-128dfd88-2a0a-4556-ae06-cb46c3a71a1b req-99285333-d473-44fc-8f25-b47db8580b5c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:25:11 compute-1 nova_compute[238822]: 2025-09-30 18:25:11.520 2 DEBUG oslo_concurrency.lockutils [req-128dfd88-2a0a-4556-ae06-cb46c3a71a1b req-99285333-d473-44fc-8f25-b47db8580b5c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:25:11 compute-1 nova_compute[238822]: 2025-09-30 18:25:11.520 2 DEBUG oslo_concurrency.lockutils [req-128dfd88-2a0a-4556-ae06-cb46c3a71a1b req-99285333-d473-44fc-8f25-b47db8580b5c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:25:11 compute-1 nova_compute[238822]: 2025-09-30 18:25:11.521 2 DEBUG oslo_concurrency.lockutils [req-128dfd88-2a0a-4556-ae06-cb46c3a71a1b req-99285333-d473-44fc-8f25-b47db8580b5c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:25:11 compute-1 nova_compute[238822]: 2025-09-30 18:25:11.521 2 DEBUG nova.compute.manager [req-128dfd88-2a0a-4556-ae06-cb46c3a71a1b req-99285333-d473-44fc-8f25-b47db8580b5c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] No waiting events found dispatching network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:25:11 compute-1 nova_compute[238822]: 2025-09-30 18:25:11.521 2 WARNING nova.compute.manager [req-128dfd88-2a0a-4556-ae06-cb46c3a71a1b req-99285333-d473-44fc-8f25-b47db8580b5c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received unexpected event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 for instance with vm_state active and task_state None.
Sep 30 18:25:11 compute-1 nova_compute[238822]: 2025-09-30 18:25:11.721 2 INFO nova.compute.manager [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Took 19.10 seconds to build instance.
Sep 30 18:25:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:12.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:12 compute-1 nova_compute[238822]: 2025-09-30 18:25:12.226 2 DEBUG oslo_concurrency.lockutils [None req-5d3dd2b4-0909-4739-9226-4385700842f2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.618s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:25:12 compute-1 ceph-mon[75484]: pgmap v1397: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 11 KiB/s rd, 445 KiB/s wr, 14 op/s
Sep 30 18:25:12 compute-1 sshd-session[282932]: Invalid user seekcy from 14.225.167.110 port 58566
Sep 30 18:25:12 compute-1 sshd-session[282932]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:25:12 compute-1 sshd-session[282932]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:25:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:12.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:12 compute-1 sshd-session[282842]: Failed password for invalid user jenkins from 8.243.64.201 port 35356 ssh2
Sep 30 18:25:12 compute-1 nova_compute[238822]: 2025-09-30 18:25:12.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:25:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:13 compute-1 sudo[282936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:25:13 compute-1 sudo[282936]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:25:13 compute-1 sudo[282936]: pam_unix(sudo:session): session closed for user root
Sep 30 18:25:14 compute-1 sshd-session[282932]: Failed password for invalid user seekcy from 14.225.167.110 port 58566 ssh2
Sep 30 18:25:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:14.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:14 compute-1 ceph-mon[75484]: pgmap v1398: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 459 KiB/s wr, 88 op/s
Sep 30 18:25:14 compute-1 sshd-session[282932]: Received disconnect from 14.225.167.110 port 58566:11: Bye Bye [preauth]
Sep 30 18:25:14 compute-1 sshd-session[282932]: Disconnected from invalid user seekcy 14.225.167.110 port 58566 [preauth]
Sep 30 18:25:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:14.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:15 compute-1 sshd-session[282842]: Received disconnect from 8.243.64.201 port 35356:11: Bye Bye [preauth]
Sep 30 18:25:15 compute-1 sshd-session[282842]: Disconnected from invalid user jenkins 8.243.64.201 port 35356 [preauth]
Sep 30 18:25:15 compute-1 nova_compute[238822]: 2025-09-30 18:25:15.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:16.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:16 compute-1 sudo[282964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:25:16 compute-1 sudo[282964]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:25:16 compute-1 sudo[282964]: pam_unix(sudo:session): session closed for user root
Sep 30 18:25:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:25:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:16.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:25:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:25:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:25:17 compute-1 ceph-mon[75484]: pgmap v1399: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 26 KiB/s wr, 74 op/s
Sep 30 18:25:17 compute-1 sshd-session[282597]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:25:17 compute-1 sshd-session[282597]: banner exchange: Connection from 113.249.93.94 port 11766: Connection timed out
Sep 30 18:25:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:17 compute-1 nova_compute[238822]: 2025-09-30 18:25:17.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:18.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:18 compute-1 ceph-mon[75484]: pgmap v1400: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 26 KiB/s wr, 74 op/s
Sep 30 18:25:18 compute-1 podman[282991]: 2025-09-30 18:25:18.564215262 +0000 UTC m=+0.088804040 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:25:18 compute-1 podman[282993]: 2025-09-30 18:25:18.599055863 +0000 UTC m=+0.118664886 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 18:25:18 compute-1 podman[282992]: 2025-09-30 18:25:18.606051402 +0000 UTC m=+0.124300139 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Sep 30 18:25:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:25:18 compute-1 sshd[170789]: Timeout before authentication for connection from 110.42.70.108 to 38.102.83.102, pid = 281285
Sep 30 18:25:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:25:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:18.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:25:19 compute-1 openstack_network_exporter[251957]: ERROR   18:25:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:25:19 compute-1 openstack_network_exporter[251957]: ERROR   18:25:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:25:19 compute-1 openstack_network_exporter[251957]: ERROR   18:25:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:25:19 compute-1 openstack_network_exporter[251957]: ERROR   18:25:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:25:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:25:19 compute-1 openstack_network_exporter[251957]: ERROR   18:25:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:25:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:25:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:20.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:20 compute-1 ceph-mon[75484]: pgmap v1401: 353 pgs: 353 active+clean; 167 MiB data, 311 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 26 KiB/s wr, 74 op/s
Sep 30 18:25:20 compute-1 nova_compute[238822]: 2025-09-30 18:25:20.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:20.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:22 compute-1 unix_chkpwd[283054]: password check failed for user (root)
Sep 30 18:25:22 compute-1 sshd-session[283049]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:25:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:22.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:25:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:22.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:22 compute-1 nova_compute[238822]: 2025-09-30 18:25:22.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:23 compute-1 ovn_controller[135204]: 2025-09-30T18:25:23Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ec:82:fd 10.100.0.9
Sep 30 18:25:23 compute-1 ovn_controller[135204]: 2025-09-30T18:25:23Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ec:82:fd 10.100.0.9
Sep 30 18:25:23 compute-1 ceph-mon[75484]: pgmap v1402: 353 pgs: 353 active+clean; 167 MiB data, 311 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 73 op/s
Sep 30 18:25:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:25:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:24.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:24 compute-1 sshd-session[283049]: Failed password for root from 192.210.160.141 port 38900 ssh2
Sep 30 18:25:24 compute-1 ceph-mon[75484]: pgmap v1403: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 137 op/s
Sep 30 18:25:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:24.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:25 compute-1 sshd-session[283049]: Connection closed by authenticating user root 192.210.160.141 port 38900 [preauth]
Sep 30 18:25:25 compute-1 nova_compute[238822]: 2025-09-30 18:25:25.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:26.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:26 compute-1 ceph-mon[75484]: pgmap v1404: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Sep 30 18:25:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:26.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:27 compute-1 nova_compute[238822]: 2025-09-30 18:25:27.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:28.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:28 compute-1 ceph-mon[75484]: pgmap v1405: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:25:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:25:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:28.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:30.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:30 compute-1 ceph-mon[75484]: pgmap v1406: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:25:30 compute-1 nova_compute[238822]: 2025-09-30 18:25:30.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:30.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:31 compute-1 nova_compute[238822]: 2025-09-30 18:25:31.565 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:25:31 compute-1 nova_compute[238822]: 2025-09-30 18:25:31.565 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:25:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:32.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:32 compute-1 ceph-mon[75484]: pgmap v1407: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:25:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:32.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:33 compute-1 nova_compute[238822]: 2025-09-30 18:25:33.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:25:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:34 compute-1 nova_compute[238822]: 2025-09-30 18:25:34.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:25:34 compute-1 sudo[283066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:25:34 compute-1 sudo[283066]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:25:34 compute-1 sudo[283066]: pam_unix(sudo:session): session closed for user root
Sep 30 18:25:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:25:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:34.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:25:34 compute-1 ceph-mon[75484]: pgmap v1408: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:25:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:34.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:35 compute-1 nova_compute[238822]: 2025-09-30 18:25:35.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:25:35 compute-1 podman[283094]: 2025-09-30 18:25:35.568443798 +0000 UTC m=+0.100657390 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:25:35 compute-1 nova_compute[238822]: 2025-09-30 18:25:35.578 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:25:35 compute-1 nova_compute[238822]: 2025-09-30 18:25:35.579 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:25:35 compute-1 nova_compute[238822]: 2025-09-30 18:25:35.579 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:25:35 compute-1 nova_compute[238822]: 2025-09-30 18:25:35.579 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:25:35 compute-1 nova_compute[238822]: 2025-09-30 18:25:35.580 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:25:35 compute-1 podman[283093]: 2025-09-30 18:25:35.609398784 +0000 UTC m=+0.141305517 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Sep 30 18:25:35 compute-1 podman[249638]: time="2025-09-30T18:25:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:25:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:25:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:25:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:25:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8811 "" "Go-http-client/1.1"
Sep 30 18:25:35 compute-1 nova_compute[238822]: 2025-09-30 18:25:35.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:35 compute-1 nova_compute[238822]: 2025-09-30 18:25:35.723 2 DEBUG nova.compute.manager [None req-a990604e-4e78-4b36-9472-4b01e32f59e3 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Sep 30 18:25:35 compute-1 nova_compute[238822]: 2025-09-30 18:25:35.793 2 DEBUG nova.compute.provider_tree [None req-a990604e-4e78-4b36-9472-4b01e32f59e3 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Updating resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a generation from 16 to 20 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 18:25:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:25:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2006376901' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:25:36 compute-1 nova_compute[238822]: 2025-09-30 18:25:36.052 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:25:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2006376901' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:25:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:36.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:36.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:37 compute-1 nova_compute[238822]: 2025-09-30 18:25:37.115 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:25:37 compute-1 nova_compute[238822]: 2025-09-30 18:25:37.116 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000011 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:25:37 compute-1 ceph-mon[75484]: pgmap v1409: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 15 KiB/s wr, 0 op/s
Sep 30 18:25:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2752651912' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:25:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2752651912' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:25:37 compute-1 nova_compute[238822]: 2025-09-30 18:25:37.363 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:25:37 compute-1 nova_compute[238822]: 2025-09-30 18:25:37.365 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:25:37 compute-1 nova_compute[238822]: 2025-09-30 18:25:37.394 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:25:37 compute-1 nova_compute[238822]: 2025-09-30 18:25:37.396 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4543MB free_disk=39.90116500854492GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:25:37 compute-1 nova_compute[238822]: 2025-09-30 18:25:37.396 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:25:37 compute-1 nova_compute[238822]: 2025-09-30 18:25:37.397 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:25:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:38 compute-1 nova_compute[238822]: 2025-09-30 18:25:38.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:25:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:25:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:38.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:25:38 compute-1 nova_compute[238822]: 2025-09-30 18:25:38.507 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 05075776-ca3e-4416-bdd4-558a62d1cf69 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:25:38 compute-1 nova_compute[238822]: 2025-09-30 18:25:38.508 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:25:38 compute-1 nova_compute[238822]: 2025-09-30 18:25:38.509 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:25:37 up  4:02,  0 user,  load average: 0.47, 0.45, 0.78\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_c634e1c17ed54907969576a0eb8eff50': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:25:38 compute-1 nova_compute[238822]: 2025-09-30 18:25:38.567 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing inventories for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 18:25:38 compute-1 nova_compute[238822]: 2025-09-30 18:25:38.624 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating ProviderTree inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 18:25:38 compute-1 nova_compute[238822]: 2025-09-30 18:25:38.625 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 18:25:38 compute-1 nova_compute[238822]: 2025-09-30 18:25:38.638 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing aggregate associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 18:25:38 compute-1 nova_compute[238822]: 2025-09-30 18:25:38.661 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing trait associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE41,HW_CPU_X86_ABM,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STATUS_DISABLED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURIT
Y_TPM_TIS,HW_ARCH_X86_64,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 18:25:38 compute-1 nova_compute[238822]: 2025-09-30 18:25:38.692 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:25:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:25:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:38.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:25:39 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/376485844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:25:39 compute-1 nova_compute[238822]: 2025-09-30 18:25:39.156 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:25:39 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2733581921' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:25:39 compute-1 ceph-mon[75484]: pgmap v1410: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 16 KiB/s wr, 1 op/s
Sep 30 18:25:39 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/376485844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:25:39 compute-1 nova_compute[238822]: 2025-09-30 18:25:39.166 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:25:39 compute-1 ovn_controller[135204]: 2025-09-30T18:25:39Z|00141|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 18:25:39 compute-1 unix_chkpwd[283193]: password check failed for user (root)
Sep 30 18:25:39 compute-1 sshd-session[283168]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58  user=root
Sep 30 18:25:39 compute-1 nova_compute[238822]: 2025-09-30 18:25:39.677 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:25:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:40 compute-1 nova_compute[238822]: 2025-09-30 18:25:40.189 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:25:40 compute-1 nova_compute[238822]: 2025-09-30 18:25:40.189 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.792s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:25:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:40.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:40 compute-1 ceph-mon[75484]: pgmap v1411: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 3.3 KiB/s wr, 0 op/s
Sep 30 18:25:40 compute-1 nova_compute[238822]: 2025-09-30 18:25:40.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:40.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:41 compute-1 sshd-session[283168]: Failed password for root from 84.51.43.58 port 44380 ssh2
Sep 30 18:25:41 compute-1 podman[283196]: 2025-09-30 18:25:41.549608889 +0000 UTC m=+0.094808771 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 18:25:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/874261876' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:25:42 compute-1 nova_compute[238822]: 2025-09-30 18:25:42.190 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:25:42 compute-1 nova_compute[238822]: 2025-09-30 18:25:42.190 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:25:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:42.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:42 compute-1 sshd-session[283168]: Received disconnect from 84.51.43.58 port 44380:11: Bye Bye [preauth]
Sep 30 18:25:42 compute-1 sshd-session[283168]: Disconnected from authenticating user root 84.51.43.58 port 44380 [preauth]
Sep 30 18:25:42 compute-1 nova_compute[238822]: 2025-09-30 18:25:42.699 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:25:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:42.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:42 compute-1 ceph-mon[75484]: pgmap v1412: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:25:43 compute-1 nova_compute[238822]: 2025-09-30 18:25:43.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:43 compute-1 nova_compute[238822]: 2025-09-30 18:25:43.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:25:43 compute-1 nova_compute[238822]: 2025-09-30 18:25:43.708 2 DEBUG nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Check if temp file /var/lib/nova/instances/tmp3yp66xia exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 18:25:43 compute-1 nova_compute[238822]: 2025-09-30 18:25:43.714 2 DEBUG nova.compute.manager [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3yp66xia',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='05075776-ca3e-4416-bdd4-558a62d1cf69',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 18:25:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:25:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:44 compute-1 nova_compute[238822]: 2025-09-30 18:25:44.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:25:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:44.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:44 compute-1 ceph-mon[75484]: pgmap v1413: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 2.0 KiB/s wr, 0 op/s
Sep 30 18:25:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:44.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:45 compute-1 nova_compute[238822]: 2025-09-30 18:25:45.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:25:45 compute-1 nova_compute[238822]: 2025-09-30 18:25:45.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:46.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:46 compute-1 ceph-mon[75484]: pgmap v1414: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 2.0 KiB/s wr, 0 op/s
Sep 30 18:25:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:46.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:48 compute-1 nova_compute[238822]: 2025-09-30 18:25:48.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:48.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:48 compute-1 unix_chkpwd[283224]: password check failed for user (root)
Sep 30 18:25:48 compute-1 sshd-session[283220]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:25:48 compute-1 ceph-mon[75484]: pgmap v1415: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 10 KiB/s wr, 1 op/s
Sep 30 18:25:48 compute-1 nova_compute[238822]: 2025-09-30 18:25:48.715 2 DEBUG nova.compute.manager [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Preparing to wait for external event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:25:48 compute-1 nova_compute[238822]: 2025-09-30 18:25:48.716 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:25:48 compute-1 nova_compute[238822]: 2025-09-30 18:25:48.716 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:25:48 compute-1 nova_compute[238822]: 2025-09-30 18:25:48.716 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:25:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:25:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:48.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:49 compute-1 openstack_network_exporter[251957]: ERROR   18:25:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:25:49 compute-1 openstack_network_exporter[251957]: ERROR   18:25:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:25:49 compute-1 openstack_network_exporter[251957]: ERROR   18:25:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:25:49 compute-1 openstack_network_exporter[251957]: ERROR   18:25:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:25:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:25:49 compute-1 openstack_network_exporter[251957]: ERROR   18:25:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:25:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:25:49 compute-1 podman[283226]: 2025-09-30 18:25:49.538727469 +0000 UTC m=+0.081538454 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4)
Sep 30 18:25:49 compute-1 podman[283227]: 2025-09-30 18:25:49.56285647 +0000 UTC m=+0.089852178 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41)
Sep 30 18:25:49 compute-1 podman[283228]: 2025-09-30 18:25:49.576783366 +0000 UTC m=+0.099766965 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true)
Sep 30 18:25:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:49 compute-1 sshd-session[283220]: Failed password for root from 192.210.160.141 port 60794 ssh2
Sep 30 18:25:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:50.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:50 compute-1 ceph-mon[75484]: pgmap v1416: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 9.1 KiB/s wr, 1 op/s
Sep 30 18:25:50 compute-1 nova_compute[238822]: 2025-09-30 18:25:50.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:50.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:51 compute-1 sshd-session[283220]: Connection closed by authenticating user root 192.210.160.141 port 60794 [preauth]
Sep 30 18:25:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.003000081s ======
Sep 30 18:25:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:52.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000081s
Sep 30 18:25:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:25:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:52.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:53 compute-1 nova_compute[238822]: 2025-09-30 18:25:53.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:53 compute-1 ceph-mon[75484]: pgmap v1417: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 9.1 KiB/s wr, 1 op/s
Sep 30 18:25:53 compute-1 nova_compute[238822]: 2025-09-30 18:25:53.509 2 DEBUG nova.compute.manager [req-984a469c-3544-4840-af5b-c9024e634bee req-bd652d0c-287f-4368-be62-d5312d2ec31f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:25:53 compute-1 nova_compute[238822]: 2025-09-30 18:25:53.509 2 DEBUG oslo_concurrency.lockutils [req-984a469c-3544-4840-af5b-c9024e634bee req-bd652d0c-287f-4368-be62-d5312d2ec31f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:25:53 compute-1 nova_compute[238822]: 2025-09-30 18:25:53.510 2 DEBUG oslo_concurrency.lockutils [req-984a469c-3544-4840-af5b-c9024e634bee req-bd652d0c-287f-4368-be62-d5312d2ec31f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:25:53 compute-1 nova_compute[238822]: 2025-09-30 18:25:53.510 2 DEBUG oslo_concurrency.lockutils [req-984a469c-3544-4840-af5b-c9024e634bee req-bd652d0c-287f-4368-be62-d5312d2ec31f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:25:53 compute-1 nova_compute[238822]: 2025-09-30 18:25:53.510 2 DEBUG nova.compute.manager [req-984a469c-3544-4840-af5b-c9024e634bee req-bd652d0c-287f-4368-be62-d5312d2ec31f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] No event matching network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 in dict_keys([('network-vif-plugged', '295c346d-8de9-4a50-883e-9a7e1ccdccc7')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 18:25:53 compute-1 nova_compute[238822]: 2025-09-30 18:25:53.511 2 DEBUG nova.compute.manager [req-984a469c-3544-4840-af5b-c9024e634bee req-bd652d0c-287f-4368-be62-d5312d2ec31f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:25:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:25:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:54 compute-1 sudo[283288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:25:54 compute-1 sudo[283288]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:25:54 compute-1 sudo[283288]: pam_unix(sudo:session): session closed for user root
Sep 30 18:25:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:54.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:54.333 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:25:54 compute-1 nova_compute[238822]: 2025-09-30 18:25:54.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:54.335 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:25:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:54.336 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:25:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:54.382 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:25:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:54.383 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:25:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:25:54.383 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:25:54 compute-1 ceph-mon[75484]: pgmap v1418: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 9.1 KiB/s wr, 1 op/s
Sep 30 18:25:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:54.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:55 compute-1 nova_compute[238822]: 2025-09-30 18:25:55.560 2 DEBUG nova.compute.manager [req-49c5e0b1-e05c-4d2d-80fa-206ec27cb4ef req-1aa5c398-4eaa-4ae4-8d90-47fb06428867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:25:55 compute-1 nova_compute[238822]: 2025-09-30 18:25:55.561 2 DEBUG oslo_concurrency.lockutils [req-49c5e0b1-e05c-4d2d-80fa-206ec27cb4ef req-1aa5c398-4eaa-4ae4-8d90-47fb06428867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:25:55 compute-1 nova_compute[238822]: 2025-09-30 18:25:55.562 2 DEBUG oslo_concurrency.lockutils [req-49c5e0b1-e05c-4d2d-80fa-206ec27cb4ef req-1aa5c398-4eaa-4ae4-8d90-47fb06428867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:25:55 compute-1 nova_compute[238822]: 2025-09-30 18:25:55.562 2 DEBUG oslo_concurrency.lockutils [req-49c5e0b1-e05c-4d2d-80fa-206ec27cb4ef req-1aa5c398-4eaa-4ae4-8d90-47fb06428867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:25:55 compute-1 nova_compute[238822]: 2025-09-30 18:25:55.563 2 DEBUG nova.compute.manager [req-49c5e0b1-e05c-4d2d-80fa-206ec27cb4ef req-1aa5c398-4eaa-4ae4-8d90-47fb06428867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Processing event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:25:55 compute-1 nova_compute[238822]: 2025-09-30 18:25:55.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:56.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:56 compute-1 ceph-mon[75484]: pgmap v1419: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 8.1 KiB/s wr, 1 op/s
Sep 30 18:25:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:56.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:57 compute-1 nova_compute[238822]: 2025-09-30 18:25:57.273 2 INFO nova.compute.manager [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Took 8.56 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Sep 30 18:25:57 compute-1 nova_compute[238822]: 2025-09-30 18:25:57.274 2 DEBUG nova.compute.manager [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:25:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1146159520' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:25:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1146159520' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:25:57 compute-1 nova_compute[238822]: 2025-09-30 18:25:57.621 2 DEBUG nova.compute.manager [req-54a48281-b3e3-437f-a605-ea9cdc0bbb5f req-b195a0a7-64ad-4a32-bb0a-dec4fcad684f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-changed-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:25:57 compute-1 nova_compute[238822]: 2025-09-30 18:25:57.621 2 DEBUG nova.compute.manager [req-54a48281-b3e3-437f-a605-ea9cdc0bbb5f req-b195a0a7-64ad-4a32-bb0a-dec4fcad684f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Refreshing instance network info cache due to event network-changed-295c346d-8de9-4a50-883e-9a7e1ccdccc7. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:25:57 compute-1 nova_compute[238822]: 2025-09-30 18:25:57.622 2 DEBUG oslo_concurrency.lockutils [req-54a48281-b3e3-437f-a605-ea9cdc0bbb5f req-b195a0a7-64ad-4a32-bb0a-dec4fcad684f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-05075776-ca3e-4416-bdd4-558a62d1cf69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:25:57 compute-1 nova_compute[238822]: 2025-09-30 18:25:57.622 2 DEBUG oslo_concurrency.lockutils [req-54a48281-b3e3-437f-a605-ea9cdc0bbb5f req-b195a0a7-64ad-4a32-bb0a-dec4fcad684f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-05075776-ca3e-4416-bdd4-558a62d1cf69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:25:57 compute-1 nova_compute[238822]: 2025-09-30 18:25:57.623 2 DEBUG nova.network.neutron [req-54a48281-b3e3-437f-a605-ea9cdc0bbb5f req-b195a0a7-64ad-4a32-bb0a-dec4fcad684f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Refreshing network info cache for port 295c346d-8de9-4a50-883e-9a7e1ccdccc7 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:25:57 compute-1 nova_compute[238822]: 2025-09-30 18:25:57.782 2 DEBUG nova.compute.manager [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3yp66xia',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='05075776-ca3e-4416-bdd4-558a62d1cf69',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(568ac083-778b-4284-9840-3c346f76ab5f),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 18:25:57 compute-1 nova_compute[238822]: 2025-09-30 18:25:57.787 2 DEBUG nova.objects.instance [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lazy-loading 'migration_context' on Instance uuid 05075776-ca3e-4416-bdd4-558a62d1cf69 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:25:57 compute-1 nova_compute[238822]: 2025-09-30 18:25:57.788 2 DEBUG nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 18:25:57 compute-1 nova_compute[238822]: 2025-09-30 18:25:57.790 2 DEBUG nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 18:25:57 compute-1 nova_compute[238822]: 2025-09-30 18:25:57.790 2 DEBUG nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 18:25:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.129 2 WARNING neutronclient.v2_0.client [req-54a48281-b3e3-437f-a605-ea9cdc0bbb5f req-b195a0a7-64ad-4a32-bb0a-dec4fcad684f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:25:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:25:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:25:58.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.293 2 DEBUG nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.294 2 DEBUG nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.302 2 DEBUG nova.virt.libvirt.vif [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:24:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-763414646',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-763414646',id=17,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:25:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-u3g6u6z0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:25:11Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=05075776-ca3e-4416-bdd4-558a62d1cf69,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.302 2 DEBUG nova.network.os_vif_util [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.303 2 DEBUG nova.network.os_vif_util [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:82:fd,bridge_name='br-int',has_traffic_filtering=True,id=295c346d-8de9-4a50-883e-9a7e1ccdccc7,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap295c346d-8d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.304 2 DEBUG nova.virt.libvirt.migration [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <mac address="fa:16:3e:ec:82:fd"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <model type="virtio"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <mtu size="1442"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <target dev="tap295c346d-8d"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]: </interface>
Sep 30 18:25:58 compute-1 nova_compute[238822]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.305 2 DEBUG nova.virt.libvirt.migration [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <name>instance-00000011</name>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <uuid>05075776-ca3e-4416-bdd4-558a62d1cf69</uuid>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteStrategies-server-763414646</nova:name>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:25:03</nova:creationTime>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:25:58 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:25:58 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:user uuid="623ef4a55c9e4fc28bb65e49246b5008">tempest-TestExecuteStrategies-1883747907-project-admin</nova:user>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:project uuid="c634e1c17ed54907969576a0eb8eff50">tempest-TestExecuteStrategies-1883747907</nova:project>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:port uuid="295c346d-8de9-4a50-883e-9a7e1ccdccc7">
Sep 30 18:25:58 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <memory unit="KiB">131072</memory>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <vcpu placement="static">1</vcpu>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <resource>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <partition>/machine</partition>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </resource>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <system>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="serial">05075776-ca3e-4416-bdd4-558a62d1cf69</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="uuid">05075776-ca3e-4416-bdd4-558a62d1cf69</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </system>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <os>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </os>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <features>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <vmcoreinfo state="on"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </features>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <cpu mode="host-model" check="partial">
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <on_poweroff>destroy</on_poweroff>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <on_reboot>restart</on_reboot>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <on_crash>destroy</on_crash>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/05075776-ca3e-4416-bdd4-558a62d1cf69_disk">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </source>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/05075776-ca3e-4416-bdd4-558a62d1cf69_disk.config">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </source>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <readonly/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="1" port="0x10"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="2" port="0x11"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="3" port="0x12"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="4" port="0x13"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="5" port="0x14"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="6" port="0x15"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="7" port="0x16"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="8" port="0x17"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="9" port="0x18"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="10" port="0x19"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="11" port="0x1a"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="12" port="0x1b"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="13" port="0x1c"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="14" port="0x1d"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="15" port="0x1e"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="16" port="0x1f"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="17" port="0x20"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="18" port="0x21"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="19" port="0x22"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="20" port="0x23"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="21" port="0x24"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="22" port="0x25"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="23" port="0x26"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="24" port="0x27"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="25" port="0x28"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-pci-bridge"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="sata" index="0">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:ec:82:fd"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target dev="tap295c346d-8d"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/console.log" append="off"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target type="isa-serial" port="0">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <model name="isa-serial"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </target>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <console type="pty">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/console.log" append="off"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target type="serial" port="0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </console>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="usb" bus="0" port="1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </input>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <input type="mouse" bus="ps2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <listen type="address" address="::"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </graphics>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <video>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </video>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]: </domain>
Sep 30 18:25:58 compute-1 nova_compute[238822]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.307 2 DEBUG nova.virt.libvirt.migration [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <name>instance-00000011</name>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <uuid>05075776-ca3e-4416-bdd4-558a62d1cf69</uuid>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteStrategies-server-763414646</nova:name>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:25:03</nova:creationTime>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:25:58 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:25:58 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:user uuid="623ef4a55c9e4fc28bb65e49246b5008">tempest-TestExecuteStrategies-1883747907-project-admin</nova:user>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:project uuid="c634e1c17ed54907969576a0eb8eff50">tempest-TestExecuteStrategies-1883747907</nova:project>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:port uuid="295c346d-8de9-4a50-883e-9a7e1ccdccc7">
Sep 30 18:25:58 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <memory unit="KiB">131072</memory>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <vcpu placement="static">1</vcpu>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <resource>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <partition>/machine</partition>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </resource>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <system>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="serial">05075776-ca3e-4416-bdd4-558a62d1cf69</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="uuid">05075776-ca3e-4416-bdd4-558a62d1cf69</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </system>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <os>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </os>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <features>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <vmcoreinfo state="on"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </features>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <cpu mode="host-model" check="partial">
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <on_poweroff>destroy</on_poweroff>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <on_reboot>restart</on_reboot>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <on_crash>destroy</on_crash>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/05075776-ca3e-4416-bdd4-558a62d1cf69_disk">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </source>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/05075776-ca3e-4416-bdd4-558a62d1cf69_disk.config">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </source>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <readonly/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="1" port="0x10"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="2" port="0x11"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="3" port="0x12"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="4" port="0x13"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="5" port="0x14"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="6" port="0x15"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="7" port="0x16"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="8" port="0x17"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="9" port="0x18"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="10" port="0x19"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="11" port="0x1a"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="12" port="0x1b"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="13" port="0x1c"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="14" port="0x1d"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="15" port="0x1e"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="16" port="0x1f"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="17" port="0x20"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="18" port="0x21"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="19" port="0x22"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="20" port="0x23"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="21" port="0x24"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="22" port="0x25"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="23" port="0x26"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="24" port="0x27"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="25" port="0x28"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-pci-bridge"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="sata" index="0">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:ec:82:fd"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target dev="tap295c346d-8d"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/console.log" append="off"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target type="isa-serial" port="0">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <model name="isa-serial"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </target>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <console type="pty">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/console.log" append="off"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target type="serial" port="0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </console>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="usb" bus="0" port="1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </input>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <input type="mouse" bus="ps2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <listen type="address" address="::"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </graphics>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <video>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </video>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]: </domain>
Sep 30 18:25:58 compute-1 nova_compute[238822]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.308 2 DEBUG nova.virt.libvirt.migration [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <name>instance-00000011</name>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <uuid>05075776-ca3e-4416-bdd4-558a62d1cf69</uuid>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteStrategies-server-763414646</nova:name>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:25:03</nova:creationTime>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:25:58 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:25:58 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:user uuid="623ef4a55c9e4fc28bb65e49246b5008">tempest-TestExecuteStrategies-1883747907-project-admin</nova:user>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:project uuid="c634e1c17ed54907969576a0eb8eff50">tempest-TestExecuteStrategies-1883747907</nova:project>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <nova:port uuid="295c346d-8de9-4a50-883e-9a7e1ccdccc7">
Sep 30 18:25:58 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <memory unit="KiB">131072</memory>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <vcpu placement="static">1</vcpu>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <resource>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <partition>/machine</partition>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </resource>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <system>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="serial">05075776-ca3e-4416-bdd4-558a62d1cf69</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="uuid">05075776-ca3e-4416-bdd4-558a62d1cf69</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </system>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <os>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </os>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <features>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <vmcoreinfo state="on"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </features>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <cpu mode="host-model" check="partial">
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <on_poweroff>destroy</on_poweroff>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <on_reboot>restart</on_reboot>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <on_crash>destroy</on_crash>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/05075776-ca3e-4416-bdd4-558a62d1cf69_disk">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </source>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/05075776-ca3e-4416-bdd4-558a62d1cf69_disk.config">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </source>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <readonly/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="1" port="0x10"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="2" port="0x11"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="3" port="0x12"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="4" port="0x13"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="5" port="0x14"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="6" port="0x15"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="7" port="0x16"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="8" port="0x17"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="9" port="0x18"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="10" port="0x19"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="11" port="0x1a"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="12" port="0x1b"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="13" port="0x1c"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="14" port="0x1d"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="15" port="0x1e"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="16" port="0x1f"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="17" port="0x20"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="18" port="0x21"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="19" port="0x22"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="20" port="0x23"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="21" port="0x24"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="22" port="0x25"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="23" port="0x26"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="24" port="0x27"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target chassis="25" port="0x28"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model name="pcie-pci-bridge"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <controller type="sata" index="0">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <interface type="ethernet"><mac address="fa:16:3e:ec:82:fd"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap295c346d-8d"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </interface><serial type="pty">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/console.log" append="off"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target type="isa-serial" port="0">
Sep 30 18:25:58 compute-1 nova_compute[238822]:         <model name="isa-serial"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       </target>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <console type="pty">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69/console.log" append="off"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <target type="serial" port="0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </console>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="usb" bus="0" port="1"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </input>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <input type="mouse" bus="ps2"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <listen type="address" address="::"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </graphics>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <video>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </video>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:25:58 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:25:58 compute-1 nova_compute[238822]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 18:25:58 compute-1 nova_compute[238822]: </domain>
Sep 30 18:25:58 compute-1 nova_compute[238822]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.308 2 DEBUG nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 18:25:58 compute-1 ceph-mon[75484]: pgmap v1420: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 10 KiB/s wr, 1 op/s
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.663 2 WARNING neutronclient.v2_0.client [req-54a48281-b3e3-437f-a605-ea9cdc0bbb5f req-b195a0a7-64ad-4a32-bb0a-dec4fcad684f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:25:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.797 2 DEBUG nova.virt.libvirt.migration [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Current None elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.798 2 INFO nova.virt.libvirt.migration [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.821 2 DEBUG nova.network.neutron [req-54a48281-b3e3-437f-a605-ea9cdc0bbb5f req-b195a0a7-64ad-4a32-bb0a-dec4fcad684f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Updated VIF entry in instance network info cache for port 295c346d-8de9-4a50-883e-9a7e1ccdccc7. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 18:25:58 compute-1 nova_compute[238822]: 2025-09-30 18:25:58.821 2 DEBUG nova.network.neutron [req-54a48281-b3e3-437f-a605-ea9cdc0bbb5f req-b195a0a7-64ad-4a32-bb0a-dec4fcad684f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Updating instance_info_cache with network_info: [{"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:25:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:25:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:25:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:25:58.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:25:59 compute-1 nova_compute[238822]: 2025-09-30 18:25:59.327 2 DEBUG oslo_concurrency.lockutils [req-54a48281-b3e3-437f-a605-ea9cdc0bbb5f req-b195a0a7-64ad-4a32-bb0a-dec4fcad684f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-05075776-ca3e-4416-bdd4-558a62d1cf69" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:25:59 compute-1 sshd-session[283319]: Invalid user testadmin from 216.10.242.161 port 47366
Sep 30 18:25:59 compute-1 sshd-session[283319]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:25:59 compute-1 sshd-session[283319]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:25:59 compute-1 nova_compute[238822]: 2025-09-30 18:25:59.821 2 INFO nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 18:25:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:25:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:25:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:25:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:00 compute-1 kernel: tap295c346d-8d (unregistering): left promiscuous mode
Sep 30 18:26:00 compute-1 NetworkManager[45549]: <info>  [1759256760.0971] device (tap295c346d-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 ovn_controller[135204]: 2025-09-30T18:26:00Z|00142|binding|INFO|Releasing lport 295c346d-8de9-4a50-883e-9a7e1ccdccc7 from this chassis (sb_readonly=0)
Sep 30 18:26:00 compute-1 ovn_controller[135204]: 2025-09-30T18:26:00Z|00143|binding|INFO|Setting lport 295c346d-8de9-4a50-883e-9a7e1ccdccc7 down in Southbound
Sep 30 18:26:00 compute-1 ovn_controller[135204]: 2025-09-30T18:26:00Z|00144|binding|INFO|Removing iface tap295c346d-8d ovn-installed in OVS
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.126 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:82:fd 10.100.0.9'], port_security=['fa:16:3e:ec:82:fd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0398922-aff5-46ba-afa7-58d09e28293c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '05075776-ca3e-4416-bdd4-558a62d1cf69', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '10', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=295c346d-8de9-4a50-883e-9a7e1ccdccc7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.128 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 295c346d-8de9-4a50-883e-9a7e1ccdccc7 in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.131 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6901f664-336b-42d2-bbf7-58951befc8d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.137 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f9394ca1-a9e3-4b5b-b0a7-326227828e4a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.138 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 namespace which is not needed anymore
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000011.scope: Deactivated successfully.
Sep 30 18:26:00 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000011.scope: Consumed 15.883s CPU time.
Sep 30 18:26:00 compute-1 systemd-machined[195911]: Machine qemu-12-instance-00000011 terminated.
Sep 30 18:26:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:00.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:00 compute-1 virtqemud[239124]: Unable to get XATTR trusted.libvirt.security.ref_selinux on 05075776-ca3e-4416-bdd4-558a62d1cf69_disk: No such file or directory
Sep 30 18:26:00 compute-1 virtqemud[239124]: Unable to get XATTR trusted.libvirt.security.ref_dac on 05075776-ca3e-4416-bdd4-558a62d1cf69_disk: No such file or directory
Sep 30 18:26:00 compute-1 kernel: tap295c346d-8d: entered promiscuous mode
Sep 30 18:26:00 compute-1 NetworkManager[45549]: <info>  [1759256760.2555] manager: (tap295c346d-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Sep 30 18:26:00 compute-1 systemd-udevd[283330]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:26:00 compute-1 ovn_controller[135204]: 2025-09-30T18:26:00Z|00145|binding|INFO|Claiming lport 295c346d-8de9-4a50-883e-9a7e1ccdccc7 for this chassis.
Sep 30 18:26:00 compute-1 kernel: tap295c346d-8d (unregistering): left promiscuous mode
Sep 30 18:26:00 compute-1 ovn_controller[135204]: 2025-09-30T18:26:00Z|00146|binding|INFO|295c346d-8de9-4a50-883e-9a7e1ccdccc7: Claiming fa:16:3e:ec:82:fd 10.100.0.9
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.266 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:82:fd 10.100.0.9'], port_security=['fa:16:3e:ec:82:fd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0398922-aff5-46ba-afa7-58d09e28293c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '05075776-ca3e-4416-bdd4-558a62d1cf69', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '10', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=295c346d-8de9-4a50-883e-9a7e1ccdccc7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.299 2 DEBUG nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.300 2 DEBUG nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.300 2 DEBUG nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 18:26:00 compute-1 ovn_controller[135204]: 2025-09-30T18:26:00Z|00147|binding|INFO|Setting lport 295c346d-8de9-4a50-883e-9a7e1ccdccc7 ovn-installed in OVS
Sep 30 18:26:00 compute-1 ovn_controller[135204]: 2025-09-30T18:26:00Z|00148|binding|INFO|Setting lport 295c346d-8de9-4a50-883e-9a7e1ccdccc7 up in Southbound
Sep 30 18:26:00 compute-1 ovn_controller[135204]: 2025-09-30T18:26:00Z|00149|binding|INFO|Releasing lport 295c346d-8de9-4a50-883e-9a7e1ccdccc7 from this chassis (sb_readonly=1)
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 ovn_controller[135204]: 2025-09-30T18:26:00Z|00150|if_status|INFO|Not setting lport 295c346d-8de9-4a50-883e-9a7e1ccdccc7 down as sb is readonly
Sep 30 18:26:00 compute-1 ovn_controller[135204]: 2025-09-30T18:26:00Z|00151|binding|INFO|Removing iface tap295c346d-8d ovn-installed in OVS
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 ovn_controller[135204]: 2025-09-30T18:26:00Z|00152|binding|INFO|Releasing lport 295c346d-8de9-4a50-883e-9a7e1ccdccc7 from this chassis (sb_readonly=0)
Sep 30 18:26:00 compute-1 ovn_controller[135204]: 2025-09-30T18:26:00Z|00153|binding|INFO|Setting lport 295c346d-8de9-4a50-883e-9a7e1ccdccc7 down in Southbound
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.315 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:82:fd 10.100.0.9'], port_security=['fa:16:3e:ec:82:fd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0398922-aff5-46ba-afa7-58d09e28293c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '05075776-ca3e-4416-bdd4-558a62d1cf69', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '10', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=295c346d-8de9-4a50-883e-9a7e1ccdccc7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.329 2 DEBUG nova.virt.libvirt.guest [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '05075776-ca3e-4416-bdd4-558a62d1cf69' (instance-00000011) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.330 2 INFO nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Migration operation has completed
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.330 2 INFO nova.compute.manager [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] _post_live_migration() is started..
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[282818]: [NOTICE]   (282841) : haproxy version is 3.0.5-8e879a5
Sep 30 18:26:00 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[282818]: [NOTICE]   (282841) : path to executable is /usr/sbin/haproxy
Sep 30 18:26:00 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[282818]: [WARNING]  (282841) : Exiting Master process...
Sep 30 18:26:00 compute-1 podman[283351]: 2025-09-30 18:26:00.336790199 +0000 UTC m=+0.067023321 container kill aa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:26:00 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[282818]: [ALERT]    (282841) : Current worker (282847) exited with code 143 (Terminated)
Sep 30 18:26:00 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[282818]: [WARNING]  (282841) : All workers exited. Exiting... (0)
Sep 30 18:26:00 compute-1 systemd[1]: libpod-aa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560.scope: Deactivated successfully.
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.350 2 WARNING neutronclient.v2_0.client [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.351 2 WARNING neutronclient.v2_0.client [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:26:00 compute-1 podman[283369]: 2025-09-30 18:26:00.408510766 +0000 UTC m=+0.042824278 container died aa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 18:26:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-2885721e14c4034d090c6f96a472a689002bca060a056b1a3c984b602defe5d5-merged.mount: Deactivated successfully.
Sep 30 18:26:00 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560-userdata-shm.mount: Deactivated successfully.
Sep 30 18:26:00 compute-1 podman[283369]: 2025-09-30 18:26:00.46049341 +0000 UTC m=+0.094806852 container cleanup aa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:26:00 compute-1 systemd[1]: libpod-conmon-aa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560.scope: Deactivated successfully.
Sep 30 18:26:00 compute-1 podman[283376]: 2025-09-30 18:26:00.486799061 +0000 UTC m=+0.096445316 container remove aa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 18:26:00 compute-1 ceph-mon[75484]: pgmap v1421: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 2.4 KiB/s wr, 0 op/s
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.496 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5afb1ff1-d324-4a24-a0f0-758450331e3c]: (4, ("Tue Sep 30 06:26:00 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 (aa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560)\naa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560\nTue Sep 30 06:26:00 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 (aa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560)\naa8f40cbe3d6a138adf2ef84fce845218c21019336a873538fb62aae5c798560\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.497 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4556068d-affb-4b89-bcc7-216f80f3d47c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.498 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.499 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[cf84c199-71ae-4f6d-b159-498d3ec54f65]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.501 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 kernel: tap6901f664-30: left promiscuous mode
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.514 2 DEBUG nova.compute.manager [req-8a060be9-b447-4b4b-b1f7-3b0ad0ea0029 req-ae5ad05b-90cd-4467-a426-9b7e8eab608d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.514 2 DEBUG oslo_concurrency.lockutils [req-8a060be9-b447-4b4b-b1f7-3b0ad0ea0029 req-ae5ad05b-90cd-4467-a426-9b7e8eab608d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.515 2 DEBUG oslo_concurrency.lockutils [req-8a060be9-b447-4b4b-b1f7-3b0ad0ea0029 req-ae5ad05b-90cd-4467-a426-9b7e8eab608d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.515 2 DEBUG oslo_concurrency.lockutils [req-8a060be9-b447-4b4b-b1f7-3b0ad0ea0029 req-ae5ad05b-90cd-4467-a426-9b7e8eab608d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.515 2 DEBUG nova.compute.manager [req-8a060be9-b447-4b4b-b1f7-3b0ad0ea0029 req-ae5ad05b-90cd-4467-a426-9b7e8eab608d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] No waiting events found dispatching network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.516 2 DEBUG nova.compute.manager [req-8a060be9-b447-4b4b-b1f7-3b0ad0ea0029 req-ae5ad05b-90cd-4467-a426-9b7e8eab608d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.537 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a3891973-448a-4e6b-a773-97b10c231f0d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.574 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[27857a9a-c4f5-4394-ba1a-0f054e530332]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.575 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8e50a60a-eb53-4635-8f14-38de21f838fa]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.603 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[301860e5-979b-43ed-9811-86b5bfc644ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1455072, 'reachable_time': 29700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283404, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.606 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.607 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[b2e3cb5c-0f5b-4f62-8f81-0f0f9e132a7b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:26:00 compute-1 systemd[1]: run-netns-ovnmeta\x2d6901f664\x2d336b\x2d42d2\x2dbbf7\x2d58951befc8d1.mount: Deactivated successfully.
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.607 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 295c346d-8de9-4a50-883e-9a7e1ccdccc7 in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.609 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6901f664-336b-42d2-bbf7-58951befc8d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.610 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[20b5e872-ded7-49d7-9976-88964fcc2227]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.611 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 295c346d-8de9-4a50-883e-9a7e1ccdccc7 in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.613 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6901f664-336b-42d2-bbf7-58951befc8d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:26:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:00.613 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[55a8bf49-6acb-4de1-afcf-9c368531b290]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.732 2 DEBUG nova.compute.manager [req-9c44038f-bda0-46d6-bb5a-db4e7d618145 req-7eb3cd6f-1dbf-4012-9371-16af6ce0e06e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.732 2 DEBUG oslo_concurrency.lockutils [req-9c44038f-bda0-46d6-bb5a-db4e7d618145 req-7eb3cd6f-1dbf-4012-9371-16af6ce0e06e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.733 2 DEBUG oslo_concurrency.lockutils [req-9c44038f-bda0-46d6-bb5a-db4e7d618145 req-7eb3cd6f-1dbf-4012-9371-16af6ce0e06e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.733 2 DEBUG oslo_concurrency.lockutils [req-9c44038f-bda0-46d6-bb5a-db4e7d618145 req-7eb3cd6f-1dbf-4012-9371-16af6ce0e06e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.733 2 DEBUG nova.compute.manager [req-9c44038f-bda0-46d6-bb5a-db4e7d618145 req-7eb3cd6f-1dbf-4012-9371-16af6ce0e06e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] No waiting events found dispatching network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.734 2 DEBUG nova.compute.manager [req-9c44038f-bda0-46d6-bb5a-db4e7d618145 req-7eb3cd6f-1dbf-4012-9371-16af6ce0e06e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.864 2 DEBUG nova.network.neutron [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Activated binding for port 295c346d-8de9-4a50-883e-9a7e1ccdccc7 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.865 2 DEBUG nova.compute.manager [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.867 2 DEBUG nova.virt.libvirt.vif [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:24:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-763414646',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-763414646',id=17,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:25:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-u3g6u6z0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:25:39Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=05075776-ca3e-4416-bdd4-558a62d1cf69,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.867 2 DEBUG nova.network.os_vif_util [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "address": "fa:16:3e:ec:82:fd", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap295c346d-8d", "ovs_interfaceid": "295c346d-8de9-4a50-883e-9a7e1ccdccc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:26:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.868 2 DEBUG nova.network.os_vif_util [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:82:fd,bridge_name='br-int',has_traffic_filtering=True,id=295c346d-8de9-4a50-883e-9a7e1ccdccc7,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap295c346d-8d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.869 2 DEBUG os_vif [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:82:fd,bridge_name='br-int',has_traffic_filtering=True,id=295c346d-8de9-4a50-883e-9a7e1ccdccc7,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap295c346d-8d') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.872 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap295c346d-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.877 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b4208cf0-74bf-4a81-a443-aad29da88d64) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.882 2 INFO os_vif [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:82:fd,bridge_name='br-int',has_traffic_filtering=True,id=295c346d-8de9-4a50-883e-9a7e1ccdccc7,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap295c346d-8d')
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.883 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.883 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.884 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.884 2 DEBUG nova.compute.manager [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.885 2 INFO nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Deleting instance files /var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69_del
Sep 30 18:26:00 compute-1 nova_compute[238822]: 2025-09-30 18:26:00.885 2 INFO nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Deletion of /var/lib/nova/instances/05075776-ca3e-4416-bdd4-558a62d1cf69_del complete
Sep 30 18:26:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:00.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:01 compute-1 sshd-session[283319]: Failed password for invalid user testadmin from 216.10.242.161 port 47366 ssh2
Sep 30 18:26:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:02.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:02 compute-1 ceph-mon[75484]: pgmap v1422: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.565 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.566 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.566 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.566 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.567 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] No waiting events found dispatching network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.567 2 WARNING nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received unexpected event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 for instance with vm_state active and task_state migrating.
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.567 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.568 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.568 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.568 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.569 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] No waiting events found dispatching network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.569 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.569 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.570 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.570 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.570 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.571 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] No waiting events found dispatching network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.571 2 WARNING nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received unexpected event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 for instance with vm_state active and task_state migrating.
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.571 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.571 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.572 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.572 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.572 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] No waiting events found dispatching network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.572 2 WARNING nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received unexpected event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 for instance with vm_state active and task_state migrating.
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.573 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.573 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.573 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.574 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.574 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] No waiting events found dispatching network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.574 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.575 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.575 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.575 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.576 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.576 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] No waiting events found dispatching network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.576 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-unplugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.577 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.577 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.577 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.577 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.578 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] No waiting events found dispatching network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.578 2 WARNING nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received unexpected event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 for instance with vm_state active and task_state migrating.
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.578 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.578 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.579 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.579 2 DEBUG oslo_concurrency.lockutils [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.579 2 DEBUG nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] No waiting events found dispatching network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:26:02 compute-1 nova_compute[238822]: 2025-09-30 18:26:02.580 2 WARNING nova.compute.manager [req-74db98f6-4362-467e-bf0c-9a623babdf6a req-59785fdb-3bb2-4024-86a3-b1a5852a2b1e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Received unexpected event network-vif-plugged-295c346d-8de9-4a50-883e-9a7e1ccdccc7 for instance with vm_state active and task_state migrating.
Sep 30 18:26:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:02.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:03 compute-1 nova_compute[238822]: 2025-09-30 18:26:03.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:26:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:04 compute-1 sshd-session[283319]: Received disconnect from 216.10.242.161 port 47366:11: Bye Bye [preauth]
Sep 30 18:26:04 compute-1 sshd-session[283319]: Disconnected from invalid user testadmin 216.10.242.161 port 47366 [preauth]
Sep 30 18:26:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:04.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:04 compute-1 ceph-mon[75484]: pgmap v1423: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.6 KiB/s rd, 2.3 KiB/s wr, 5 op/s
Sep 30 18:26:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:04.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:05 compute-1 podman[249638]: time="2025-09-30T18:26:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:26:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:26:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:26:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:26:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8355 "" "Go-http-client/1.1"
Sep 30 18:26:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:05 compute-1 nova_compute[238822]: 2025-09-30 18:26:05.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:06.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:06 compute-1 ceph-mon[75484]: pgmap v1424: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.5 KiB/s rd, 2.3 KiB/s wr, 5 op/s
Sep 30 18:26:06 compute-1 podman[283412]: 2025-09-30 18:26:06.555796434 +0000 UTC m=+0.085716185 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:26:06 compute-1 podman[283411]: 2025-09-30 18:26:06.594702515 +0000 UTC m=+0.128155222 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:26:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:06.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:26:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:08 compute-1 nova_compute[238822]: 2025-09-30 18:26:08.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:08.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:08 compute-1 ceph-mon[75484]: pgmap v1425: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.5 KiB/s rd, 2.3 KiB/s wr, 5 op/s
Sep 30 18:26:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:26:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:08.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:09 compute-1 nova_compute[238822]: 2025-09-30 18:26:09.423 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:09 compute-1 nova_compute[238822]: 2025-09-30 18:26:09.424 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:09 compute-1 nova_compute[238822]: 2025-09-30 18:26:09.424 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "05075776-ca3e-4416-bdd4-558a62d1cf69-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:09 compute-1 nova_compute[238822]: 2025-09-30 18:26:09.940 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:09 compute-1 nova_compute[238822]: 2025-09-30 18:26:09.941 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:09 compute-1 nova_compute[238822]: 2025-09-30 18:26:09.941 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:09 compute-1 nova_compute[238822]: 2025-09-30 18:26:09.941 2 DEBUG nova.compute.resource_tracker [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:26:09 compute-1 nova_compute[238822]: 2025-09-30 18:26:09.942 2 DEBUG oslo_concurrency.processutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:26:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:10.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:26:10 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1603123180' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:26:10 compute-1 nova_compute[238822]: 2025-09-30 18:26:10.441 2 DEBUG oslo_concurrency.processutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:26:10 compute-1 unix_chkpwd[283492]: password check failed for user (root)
Sep 30 18:26:10 compute-1 sshd-session[283466]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:26:10 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1603123180' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:26:10 compute-1 ceph-mon[75484]: pgmap v1426: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.5 KiB/s rd, 0 B/s wr, 5 op/s
Sep 30 18:26:10 compute-1 nova_compute[238822]: 2025-09-30 18:26:10.725 2 WARNING nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:26:10 compute-1 nova_compute[238822]: 2025-09-30 18:26:10.728 2 DEBUG oslo_concurrency.processutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:26:10 compute-1 nova_compute[238822]: 2025-09-30 18:26:10.767 2 DEBUG oslo_concurrency.processutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:26:10 compute-1 nova_compute[238822]: 2025-09-30 18:26:10.768 2 DEBUG nova.compute.resource_tracker [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4698MB free_disk=39.90113830566406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:26:10 compute-1 nova_compute[238822]: 2025-09-30 18:26:10.769 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:10 compute-1 nova_compute[238822]: 2025-09-30 18:26:10.769 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:10 compute-1 nova_compute[238822]: 2025-09-30 18:26:10.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:10.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:11 compute-1 nova_compute[238822]: 2025-09-30 18:26:11.800 2 DEBUG nova.compute.resource_tracker [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Migration for instance 05075776-ca3e-4416-bdd4-558a62d1cf69 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 18:26:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:12.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:12 compute-1 nova_compute[238822]: 2025-09-30 18:26:12.311 2 DEBUG nova.compute.resource_tracker [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 18:26:12 compute-1 nova_compute[238822]: 2025-09-30 18:26:12.343 2 DEBUG nova.compute.resource_tracker [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Migration 568ac083-778b-4284-9840-3c346f76ab5f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 18:26:12 compute-1 nova_compute[238822]: 2025-09-30 18:26:12.344 2 DEBUG nova.compute.resource_tracker [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:26:12 compute-1 nova_compute[238822]: 2025-09-30 18:26:12.344 2 DEBUG nova.compute.resource_tracker [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:26:10 up  4:03,  0 user,  load average: 0.34, 0.43, 0.76\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:26:12 compute-1 nova_compute[238822]: 2025-09-30 18:26:12.385 2 DEBUG oslo_concurrency.processutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:26:12 compute-1 ceph-mon[75484]: pgmap v1427: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.5 KiB/s rd, 0 B/s wr, 5 op/s
Sep 30 18:26:12 compute-1 podman[283497]: 2025-09-30 18:26:12.572538077 +0000 UTC m=+0.102507779 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible)
Sep 30 18:26:12 compute-1 sshd-session[283466]: Failed password for root from 175.126.165.170 port 46278 ssh2
Sep 30 18:26:12 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:26:12 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/459333809' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:26:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:12 compute-1 nova_compute[238822]: 2025-09-30 18:26:12.875 2 DEBUG oslo_concurrency.processutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:26:12 compute-1 nova_compute[238822]: 2025-09-30 18:26:12.884 2 DEBUG nova.compute.provider_tree [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:26:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:12.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:13 compute-1 nova_compute[238822]: 2025-09-30 18:26:13.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:13 compute-1 nova_compute[238822]: 2025-09-30 18:26:13.399 2 DEBUG nova.scheduler.client.report [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:26:13 compute-1 sshd-session[283466]: Received disconnect from 175.126.165.170 port 46278:11: Bye Bye [preauth]
Sep 30 18:26:13 compute-1 sshd-session[283466]: Disconnected from authenticating user root 175.126.165.170 port 46278 [preauth]
Sep 30 18:26:13 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/459333809' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:26:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:26:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:13 compute-1 nova_compute[238822]: 2025-09-30 18:26:13.911 2 DEBUG nova.compute.resource_tracker [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:26:13 compute-1 nova_compute[238822]: 2025-09-30 18:26:13.912 2 DEBUG oslo_concurrency.lockutils [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.142s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:13 compute-1 nova_compute[238822]: 2025-09-30 18:26:13.932 2 INFO nova.compute.manager [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Sep 30 18:26:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:14.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:14 compute-1 sudo[283541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:26:14 compute-1 sudo[283541]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:26:14 compute-1 sudo[283541]: pam_unix(sudo:session): session closed for user root
Sep 30 18:26:14 compute-1 ceph-mon[75484]: pgmap v1428: 353 pgs: 353 active+clean; 159 MiB data, 316 MiB used, 40 GiB / 40 GiB avail; 16 KiB/s rd, 8.5 KiB/s wr, 22 op/s
Sep 30 18:26:14 compute-1 unix_chkpwd[283567]: password check failed for user (root)
Sep 30 18:26:14 compute-1 sshd-session[283536]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:26:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:14.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:15 compute-1 nova_compute[238822]: 2025-09-30 18:26:15.016 2 INFO nova.scheduler.client.report [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Deleted allocation for migration 568ac083-778b-4284-9840-3c346f76ab5f
Sep 30 18:26:15 compute-1 nova_compute[238822]: 2025-09-30 18:26:15.017 2 DEBUG nova.virt.libvirt.driver [None req-0b86d113-34eb-4c36-aeb5-1552c729aee1 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 05075776-ca3e-4416-bdd4-558a62d1cf69] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 18:26:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:15 compute-1 nova_compute[238822]: 2025-09-30 18:26:15.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:16.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:16 compute-1 sudo[283569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:26:16 compute-1 sudo[283569]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:26:16 compute-1 sudo[283569]: pam_unix(sudo:session): session closed for user root
Sep 30 18:26:16 compute-1 ceph-mon[75484]: pgmap v1429: 353 pgs: 353 active+clean; 159 MiB data, 316 MiB used, 40 GiB / 40 GiB avail; 11 KiB/s rd, 8.5 KiB/s wr, 17 op/s
Sep 30 18:26:16 compute-1 sshd-session[283536]: Failed password for root from 192.210.160.141 port 56654 ssh2
Sep 30 18:26:16 compute-1 sudo[283594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:26:16 compute-1 sudo[283594]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:26:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:16.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:17 compute-1 sudo[283594]: pam_unix(sudo:session): session closed for user root
Sep 30 18:26:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:26:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:26:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:26:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:26:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:26:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:26:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:26:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:18 compute-1 sshd-session[283536]: Connection closed by authenticating user root 192.210.160.141 port 56654 [preauth]
Sep 30 18:26:18 compute-1 nova_compute[238822]: 2025-09-30 18:26:18.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:18.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:18 compute-1 ceph-mon[75484]: pgmap v1430: 353 pgs: 353 active+clean; 121 MiB data, 292 MiB used, 40 GiB / 40 GiB avail; 15 KiB/s rd, 9.2 KiB/s wr, 24 op/s
Sep 30 18:26:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:26:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:18.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:19 compute-1 openstack_network_exporter[251957]: ERROR   18:26:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:26:19 compute-1 openstack_network_exporter[251957]: ERROR   18:26:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:26:19 compute-1 openstack_network_exporter[251957]: ERROR   18:26:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:26:19 compute-1 openstack_network_exporter[251957]: ERROR   18:26:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:26:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:26:19 compute-1 openstack_network_exporter[251957]: ERROR   18:26:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:26:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:26:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:20.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:20 compute-1 ceph-mon[75484]: pgmap v1431: 353 pgs: 353 active+clean; 121 MiB data, 292 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 9.2 KiB/s wr, 29 op/s
Sep 30 18:26:20 compute-1 podman[283655]: 2025-09-30 18:26:20.579119393 +0000 UTC m=+0.114258507 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Sep 30 18:26:20 compute-1 podman[283657]: 2025-09-30 18:26:20.616200035 +0000 UTC m=+0.134415722 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible)
Sep 30 18:26:20 compute-1 podman[283656]: 2025-09-30 18:26:20.617528781 +0000 UTC m=+0.139651623 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, container_name=openstack_network_exporter, version=9.6, 
release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, config_id=edpm, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Sep 30 18:26:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:20 compute-1 nova_compute[238822]: 2025-09-30 18:26:20.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:20.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:22.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:26:22 compute-1 sudo[283715]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:26:22 compute-1 sudo[283715]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:26:22 compute-1 sudo[283715]: pam_unix(sudo:session): session closed for user root
Sep 30 18:26:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:22.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:23 compute-1 nova_compute[238822]: 2025-09-30 18:26:23.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:23 compute-1 ceph-mon[75484]: pgmap v1432: 353 pgs: 353 active+clean; 121 MiB data, 292 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 9.2 KiB/s wr, 29 op/s
Sep 30 18:26:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:26:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:26:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:26:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:24.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:24 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1156298293' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:26:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:24.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:25 compute-1 ceph-mon[75484]: pgmap v1433: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 10 KiB/s wr, 57 op/s
Sep 30 18:26:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:25 compute-1 nova_compute[238822]: 2025-09-30 18:26:25.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:26.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:26 compute-1 unix_chkpwd[283746]: password check failed for user (root)
Sep 30 18:26:26 compute-1 sshd-session[283743]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110  user=root
Sep 30 18:26:26 compute-1 ceph-mon[75484]: pgmap v1434: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 27 KiB/s rd, 1.9 KiB/s wr, 40 op/s
Sep 30 18:26:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:26.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:27 compute-1 nova_compute[238822]: 2025-09-30 18:26:27.684 2 DEBUG nova.compute.manager [None req-baf4147b-a5ae-4518-ae9d-beebfbd430bb e33f9dc9fbb84319b00517567fe4b47e 4e2dde567e5c4b1c9802c64cfc281b6d - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Sep 30 18:26:27 compute-1 nova_compute[238822]: 2025-09-30 18:26:27.737 2 DEBUG nova.compute.provider_tree [None req-baf4147b-a5ae-4518-ae9d-beebfbd430bb e33f9dc9fbb84319b00517567fe4b47e 4e2dde567e5c4b1c9802c64cfc281b6d - - default default] Updating resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a generation from 20 to 23 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 18:26:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:28 compute-1 nova_compute[238822]: 2025-09-30 18:26:28.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:28 compute-1 sshd-session[283743]: Failed password for root from 14.225.167.110 port 41700 ssh2
Sep 30 18:26:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:28.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:28 compute-1 ceph-mon[75484]: pgmap v1435: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 27 KiB/s rd, 1.9 KiB/s wr, 40 op/s
Sep 30 18:26:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:26:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:28.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:29 compute-1 sshd-session[283743]: Received disconnect from 14.225.167.110 port 41700:11: Bye Bye [preauth]
Sep 30 18:26:29 compute-1 sshd-session[283743]: Disconnected from authenticating user root 14.225.167.110 port 41700 [preauth]
Sep 30 18:26:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:30.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:30 compute-1 nova_compute[238822]: 2025-09-30 18:26:30.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:31.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:31 compute-1 ceph-mon[75484]: pgmap v1436: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 1.2 KiB/s wr, 33 op/s
Sep 30 18:26:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:32.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:33.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:33 compute-1 nova_compute[238822]: 2025-09-30 18:26:33.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:26:33 compute-1 nova_compute[238822]: 2025-09-30 18:26:33.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:26:33 compute-1 nova_compute[238822]: 2025-09-30 18:26:33.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:33 compute-1 ceph-mon[75484]: pgmap v1437: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:26:33 compute-1 unix_chkpwd[283756]: password check failed for user (root)
Sep 30 18:26:33 compute-1 sshd-session[283754]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201  user=root
Sep 30 18:26:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:26:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:34 compute-1 nova_compute[238822]: 2025-09-30 18:26:34.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:26:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:34.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:34 compute-1 sudo[283758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:26:34 compute-1 sudo[283758]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:26:34 compute-1 sudo[283758]: pam_unix(sudo:session): session closed for user root
Sep 30 18:26:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:35.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:35 compute-1 ceph-mon[75484]: pgmap v1438: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:26:35 compute-1 podman[249638]: time="2025-09-30T18:26:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:26:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:26:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:26:35 compute-1 sshd-session[283754]: Failed password for root from 8.243.64.201 port 58028 ssh2
Sep 30 18:26:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:26:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8352 "" "Go-http-client/1.1"
Sep 30 18:26:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:35 compute-1 nova_compute[238822]: 2025-09-30 18:26:35.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:36.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:36 compute-1 sshd-session[283754]: Received disconnect from 8.243.64.201 port 58028:11: Bye Bye [preauth]
Sep 30 18:26:36 compute-1 sshd-session[283754]: Disconnected from authenticating user root 8.243.64.201 port 58028 [preauth]
Sep 30 18:26:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4202001670' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:26:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4202001670' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:26:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:37.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:37 compute-1 nova_compute[238822]: 2025-09-30 18:26:37.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:26:37 compute-1 podman[283787]: 2025-09-30 18:26:37.5659425 +0000 UTC m=+0.096451716 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:26:37 compute-1 ceph-mon[75484]: pgmap v1439: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:26:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:26:37 compute-1 nova_compute[238822]: 2025-09-30 18:26:37.580 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:37 compute-1 nova_compute[238822]: 2025-09-30 18:26:37.580 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:37 compute-1 nova_compute[238822]: 2025-09-30 18:26:37.581 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:37 compute-1 nova_compute[238822]: 2025-09-30 18:26:37.581 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:26:37 compute-1 nova_compute[238822]: 2025-09-30 18:26:37.581 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:26:37 compute-1 podman[283786]: 2025-09-30 18:26:37.650131874 +0000 UTC m=+0.186468437 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 18:26:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:38 compute-1 nova_compute[238822]: 2025-09-30 18:26:38.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:26:38 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3175932376' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:26:38 compute-1 nova_compute[238822]: 2025-09-30 18:26:38.145 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.564s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:26:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:38.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:38 compute-1 nova_compute[238822]: 2025-09-30 18:26:38.374 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:26:38 compute-1 nova_compute[238822]: 2025-09-30 18:26:38.376 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:26:38 compute-1 nova_compute[238822]: 2025-09-30 18:26:38.411 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:26:38 compute-1 nova_compute[238822]: 2025-09-30 18:26:38.412 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4746MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:26:38 compute-1 nova_compute[238822]: 2025-09-30 18:26:38.413 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:38 compute-1 nova_compute[238822]: 2025-09-30 18:26:38.413 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:38 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3175932376' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:26:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:26:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:39.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:39 compute-1 nova_compute[238822]: 2025-09-30 18:26:39.468 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:26:39 compute-1 nova_compute[238822]: 2025-09-30 18:26:39.469 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:26:38 up  4:04,  0 user,  load average: 0.26, 0.40, 0.74\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:26:39 compute-1 nova_compute[238822]: 2025-09-30 18:26:39.482 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:26:39 compute-1 ceph-mon[75484]: pgmap v1440: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:26:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:26:39 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3534214684' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:26:39 compute-1 nova_compute[238822]: 2025-09-30 18:26:39.997 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:26:40 compute-1 nova_compute[238822]: 2025-09-30 18:26:40.003 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:26:40 compute-1 unix_chkpwd[283883]: password check failed for user (root)
Sep 30 18:26:40 compute-1 sshd-session[283857]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:26:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:40.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:40 compute-1 nova_compute[238822]: 2025-09-30 18:26:40.511 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:26:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3534214684' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:26:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1560769864' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:26:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:41 compute-1 nova_compute[238822]: 2025-09-30 18:26:41.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:41.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:41 compute-1 nova_compute[238822]: 2025-09-30 18:26:41.022 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:26:41 compute-1 nova_compute[238822]: 2025-09-30 18:26:41.023 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.610s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:41 compute-1 ceph-mon[75484]: pgmap v1441: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:26:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:42 compute-1 sshd-session[283884]: Invalid user minecraft from 103.153.190.105 port 44802
Sep 30 18:26:42 compute-1 sshd-session[283884]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:26:42 compute-1 sshd-session[283884]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:26:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:42.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:42 compute-1 sshd-session[283857]: Failed password for root from 192.210.160.141 port 45936 ssh2
Sep 30 18:26:42 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3568289352' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:26:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:43.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:43 compute-1 nova_compute[238822]: 2025-09-30 18:26:43.024 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:26:43 compute-1 nova_compute[238822]: 2025-09-30 18:26:43.024 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:26:43 compute-1 nova_compute[238822]: 2025-09-30 18:26:43.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:43 compute-1 sshd-session[283857]: Connection closed by authenticating user root 192.210.160.141 port 45936 [preauth]
Sep 30 18:26:43 compute-1 podman[283889]: 2025-09-30 18:26:43.553254117 +0000 UTC m=+0.093533107 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:26:43 compute-1 ceph-mon[75484]: pgmap v1442: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:26:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:26:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:44.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:45 compute-1 sshd-session[283884]: Failed password for invalid user minecraft from 103.153.190.105 port 44802 ssh2
Sep 30 18:26:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:45.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:45 compute-1 nova_compute[238822]: 2025-09-30 18:26:45.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:26:45 compute-1 nova_compute[238822]: 2025-09-30 18:26:45.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:26:45 compute-1 nova_compute[238822]: 2025-09-30 18:26:45.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:26:45 compute-1 ceph-mon[75484]: pgmap v1443: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 0 op/s
Sep 30 18:26:45 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/941839852' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:26:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:46 compute-1 nova_compute[238822]: 2025-09-30 18:26:46.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:46.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:46 compute-1 sshd-session[283884]: Received disconnect from 103.153.190.105 port 44802:11: Bye Bye [preauth]
Sep 30 18:26:46 compute-1 sshd-session[283884]: Disconnected from invalid user minecraft 103.153.190.105 port 44802 [preauth]
Sep 30 18:26:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:47.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:47 compute-1 ceph-mon[75484]: pgmap v1444: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:26:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:48 compute-1 nova_compute[238822]: 2025-09-30 18:26:48.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:48.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:26:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:49.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:49 compute-1 openstack_network_exporter[251957]: ERROR   18:26:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:26:49 compute-1 openstack_network_exporter[251957]: ERROR   18:26:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:26:49 compute-1 openstack_network_exporter[251957]: ERROR   18:26:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:26:49 compute-1 openstack_network_exporter[251957]: ERROR   18:26:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:26:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:26:49 compute-1 openstack_network_exporter[251957]: ERROR   18:26:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:26:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:26:49 compute-1 ceph-mon[75484]: pgmap v1445: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:26:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:50.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:51.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:51 compute-1 nova_compute[238822]: 2025-09-30 18:26:51.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:51 compute-1 podman[283919]: 2025-09-30 18:26:51.556240005 +0000 UTC m=+0.089973471 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 18:26:51 compute-1 podman[283918]: 2025-09-30 18:26:51.563339117 +0000 UTC m=+0.095014587 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, version=9.6, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Sep 30 18:26:51 compute-1 podman[283917]: 2025-09-30 18:26:51.569697399 +0000 UTC m=+0.110429684 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:26:51 compute-1 ceph-mon[75484]: pgmap v1446: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:26:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:52.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:26:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:53.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:53 compute-1 nova_compute[238822]: 2025-09-30 18:26:53.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:26:53 compute-1 ceph-mon[75484]: pgmap v1447: 353 pgs: 353 active+clean; 41 MiB data, 243 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:26:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2570281823' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:26:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/114652845' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:26:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:54.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:54.384 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:26:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:54.385 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:26:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:26:54.385 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:26:54 compute-1 sudo[283978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:26:54 compute-1 sudo[283978]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:26:54 compute-1 sudo[283978]: pam_unix(sudo:session): session closed for user root
Sep 30 18:26:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:55.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:55 compute-1 ceph-mon[75484]: pgmap v1448: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:26:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:56 compute-1 nova_compute[238822]: 2025-09-30 18:26:56.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:56.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:57.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:57 compute-1 ceph-mon[75484]: pgmap v1449: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:26:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3463281651' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:26:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3463281651' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:26:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:58 compute-1 nova_compute[238822]: 2025-09-30 18:26:58.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:26:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:26:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:26:58.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:26:58 compute-1 sshd-session[284006]: Invalid user npm from 84.51.43.58 port 50216
Sep 30 18:26:58 compute-1 sshd-session[284006]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:26:58 compute-1 sshd-session[284006]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:26:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:26:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:26:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:26:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:26:59.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:26:59 compute-1 ceph-mon[75484]: pgmap v1450: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 420 KiB/s rd, 1.8 MiB/s wr, 42 op/s
Sep 30 18:26:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:26:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:26:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:26:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:00.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:01 compute-1 sshd-session[284006]: Failed password for invalid user npm from 84.51.43.58 port 50216 ssh2
Sep 30 18:27:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:01.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:01 compute-1 nova_compute[238822]: 2025-09-30 18:27:01.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:01 compute-1 ceph-mon[75484]: pgmap v1451: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 787 KiB/s rd, 1.8 MiB/s wr, 62 op/s
Sep 30 18:27:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:02.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:02 compute-1 sshd-session[284006]: Received disconnect from 84.51.43.58 port 50216:11: Bye Bye [preauth]
Sep 30 18:27:02 compute-1 sshd-session[284006]: Disconnected from invalid user npm 84.51.43.58 port 50216 [preauth]
Sep 30 18:27:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:03.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:03 compute-1 nova_compute[238822]: 2025-09-30 18:27:03.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:27:03 compute-1 ceph-mon[75484]: pgmap v1452: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 787 KiB/s rd, 1.8 MiB/s wr, 62 op/s
Sep 30 18:27:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:27:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:04.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:27:04 compute-1 nova_compute[238822]: 2025-09-30 18:27:04.697 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "277581fe-2194-4331-bf7b-c2604b65125e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:27:04 compute-1 nova_compute[238822]: 2025-09-30 18:27:04.698 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:27:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:05.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:05 compute-1 nova_compute[238822]: 2025-09-30 18:27:05.208 2 DEBUG nova.compute.manager [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:27:05 compute-1 unix_chkpwd[284020]: password check failed for user (root)
Sep 30 18:27:05 compute-1 sshd-session[284014]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:27:05 compute-1 podman[249638]: time="2025-09-30T18:27:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:27:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:27:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:27:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:27:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8346 "" "Go-http-client/1.1"
Sep 30 18:27:05 compute-1 nova_compute[238822]: 2025-09-30 18:27:05.794 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:27:05 compute-1 nova_compute[238822]: 2025-09-30 18:27:05.794 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:27:05 compute-1 nova_compute[238822]: 2025-09-30 18:27:05.803 2 DEBUG nova.virt.hardware [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:27:05 compute-1 nova_compute[238822]: 2025-09-30 18:27:05.804 2 INFO nova.compute.claims [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:27:05 compute-1 sshd-session[284017]: Invalid user ftpuser from 216.10.242.161 port 40766
Sep 30 18:27:05 compute-1 sshd-session[284017]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:27:05 compute-1 sshd-session[284017]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:27:05 compute-1 ceph-mon[75484]: pgmap v1453: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Sep 30 18:27:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:06 compute-1 nova_compute[238822]: 2025-09-30 18:27:06.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:06.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:06 compute-1 nova_compute[238822]: 2025-09-30 18:27:06.864 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:27:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:06.886093) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256826886136, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 2071, "num_deletes": 251, "total_data_size": 4931688, "memory_usage": 5011952, "flush_reason": "Manual Compaction"}
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256826912328, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 3204166, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39385, "largest_seqno": 41451, "table_properties": {"data_size": 3196021, "index_size": 4895, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 17410, "raw_average_key_size": 20, "raw_value_size": 3179510, "raw_average_value_size": 3679, "num_data_blocks": 215, "num_entries": 864, "num_filter_entries": 864, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759256646, "oldest_key_time": 1759256646, "file_creation_time": 1759256826, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 26334 microseconds, and 12567 cpu microseconds.
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:06.912423) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 3204166 bytes OK
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:06.912450) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:06.917844) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:06.917859) EVENT_LOG_v1 {"time_micros": 1759256826917854, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:06.917882) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 4922436, prev total WAL file size 4922436, number of live WAL files 2.
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:06.919150) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(3129KB)], [75(12MB)]
Sep 30 18:27:06 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256826919263, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15820108, "oldest_snapshot_seqno": -1}
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6555 keys, 13826132 bytes, temperature: kUnknown
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256827005340, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 13826132, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13782338, "index_size": 26274, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16453, "raw_key_size": 169220, "raw_average_key_size": 25, "raw_value_size": 13664517, "raw_average_value_size": 2084, "num_data_blocks": 1052, "num_entries": 6555, "num_filter_entries": 6555, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759256826, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:07.005700) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 13826132 bytes
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:07.007229) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 183.6 rd, 160.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 12.0 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(9.3) write-amplify(4.3) OK, records in: 7071, records dropped: 516 output_compression: NoCompression
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:07.007258) EVENT_LOG_v1 {"time_micros": 1759256827007245, "job": 46, "event": "compaction_finished", "compaction_time_micros": 86170, "compaction_time_cpu_micros": 48987, "output_level": 6, "num_output_files": 1, "total_output_size": 13826132, "num_input_records": 7071, "num_output_records": 6555, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256827008405, "job": 46, "event": "table_file_deletion", "file_number": 77}
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256827012379, "job": 46, "event": "table_file_deletion", "file_number": 75}
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:06.919041) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:07.012515) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:07.012523) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:07.012525) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:07.012527) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:27:07 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:27:07.012529) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:27:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:07.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:27:07 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3567038494' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:27:07 compute-1 nova_compute[238822]: 2025-09-30 18:27:07.333 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:27:07 compute-1 nova_compute[238822]: 2025-09-30 18:27:07.340 2 DEBUG nova.compute.provider_tree [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:27:07 compute-1 nova_compute[238822]: 2025-09-30 18:27:07.851 2 DEBUG nova.scheduler.client.report [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:27:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:07 compute-1 ceph-mon[75484]: pgmap v1454: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:27:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3567038494' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:27:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:27:08 compute-1 sshd-session[284014]: Failed password for root from 192.210.160.141 port 44882 ssh2
Sep 30 18:27:08 compute-1 nova_compute[238822]: 2025-09-30 18:27:08.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:27:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:08.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:27:08 compute-1 nova_compute[238822]: 2025-09-30 18:27:08.363 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.569s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:27:08 compute-1 nova_compute[238822]: 2025-09-30 18:27:08.364 2 DEBUG nova.compute.manager [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:27:08 compute-1 sshd-session[284017]: Failed password for invalid user ftpuser from 216.10.242.161 port 40766 ssh2
Sep 30 18:27:08 compute-1 sshd-session[284014]: Connection closed by authenticating user root 192.210.160.141 port 44882 [preauth]
Sep 30 18:27:08 compute-1 podman[284047]: 2025-09-30 18:27:08.553592149 +0000 UTC m=+0.089068147 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:27:08 compute-1 podman[284046]: 2025-09-30 18:27:08.599768746 +0000 UTC m=+0.137735371 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.license=GPLv2)
Sep 30 18:27:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:27:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:08 compute-1 nova_compute[238822]: 2025-09-30 18:27:08.892 2 DEBUG nova.compute.manager [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:27:08 compute-1 nova_compute[238822]: 2025-09-30 18:27:08.892 2 DEBUG nova.network.neutron [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:27:08 compute-1 nova_compute[238822]: 2025-09-30 18:27:08.893 2 WARNING neutronclient.v2_0.client [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:27:08 compute-1 nova_compute[238822]: 2025-09-30 18:27:08.893 2 WARNING neutronclient.v2_0.client [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:27:08 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Sep 30 18:27:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:09.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:09 compute-1 sshd-session[284017]: Received disconnect from 216.10.242.161 port 40766:11: Bye Bye [preauth]
Sep 30 18:27:09 compute-1 sshd-session[284017]: Disconnected from invalid user ftpuser 216.10.242.161 port 40766 [preauth]
Sep 30 18:27:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:09.267 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:27:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:09.268 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:27:09 compute-1 nova_compute[238822]: 2025-09-30 18:27:09.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:09 compute-1 nova_compute[238822]: 2025-09-30 18:27:09.398 2 DEBUG nova.network.neutron [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Successfully created port: 03023543-2a7a-4ad4-b500-cf06571e617c _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:27:09 compute-1 nova_compute[238822]: 2025-09-30 18:27:09.402 2 INFO nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:27:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:09 compute-1 ceph-mon[75484]: pgmap v1455: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:27:09 compute-1 nova_compute[238822]: 2025-09-30 18:27:09.912 2 DEBUG nova.compute.manager [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:27:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:10.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:10 compute-1 nova_compute[238822]: 2025-09-30 18:27:10.931 2 DEBUG nova.compute.manager [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:27:10 compute-1 nova_compute[238822]: 2025-09-30 18:27:10.932 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:27:10 compute-1 nova_compute[238822]: 2025-09-30 18:27:10.933 2 INFO nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Creating image(s)
Sep 30 18:27:10 compute-1 nova_compute[238822]: 2025-09-30 18:27:10.963 2 DEBUG nova.storage.rbd_utils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 277581fe-2194-4331-bf7b-c2604b65125e_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.004 2 DEBUG nova.storage.rbd_utils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 277581fe-2194-4331-bf7b-c2604b65125e_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.033 2 DEBUG nova.storage.rbd_utils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 277581fe-2194-4331-bf7b-c2604b65125e_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.037 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:27:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:11.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.133 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.134 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.135 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.136 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.172 2 DEBUG nova.storage.rbd_utils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 277581fe-2194-4331-bf7b-c2604b65125e_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.177 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 277581fe-2194-4331-bf7b-c2604b65125e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.541 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 277581fe-2194-4331-bf7b-c2604b65125e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.363s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.637 2 DEBUG nova.storage.rbd_utils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] resizing rbd image 277581fe-2194-4331-bf7b-c2604b65125e_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.763 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.764 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Ensure instance console log exists: /var/lib/nova/instances/277581fe-2194-4331-bf7b-c2604b65125e/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.765 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.765 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:27:11 compute-1 nova_compute[238822]: 2025-09-30 18:27:11.766 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:27:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:11 compute-1 ceph-mon[75484]: pgmap v1456: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.5 MiB/s rd, 18 KiB/s wr, 63 op/s
Sep 30 18:27:12 compute-1 nova_compute[238822]: 2025-09-30 18:27:12.148 2 DEBUG nova.network.neutron [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Successfully updated port: 03023543-2a7a-4ad4-b500-cf06571e617c _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:27:12 compute-1 nova_compute[238822]: 2025-09-30 18:27:12.241 2 DEBUG nova.compute.manager [req-40290699-764b-49a9-b3a6-bee567d1b26b req-ba87f10d-0e7c-4a32-9ad2-0d53e160adeb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Received event network-changed-03023543-2a7a-4ad4-b500-cf06571e617c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:27:12 compute-1 nova_compute[238822]: 2025-09-30 18:27:12.241 2 DEBUG nova.compute.manager [req-40290699-764b-49a9-b3a6-bee567d1b26b req-ba87f10d-0e7c-4a32-9ad2-0d53e160adeb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Refreshing instance network info cache due to event network-changed-03023543-2a7a-4ad4-b500-cf06571e617c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:27:12 compute-1 nova_compute[238822]: 2025-09-30 18:27:12.242 2 DEBUG oslo_concurrency.lockutils [req-40290699-764b-49a9-b3a6-bee567d1b26b req-ba87f10d-0e7c-4a32-9ad2-0d53e160adeb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-277581fe-2194-4331-bf7b-c2604b65125e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:27:12 compute-1 nova_compute[238822]: 2025-09-30 18:27:12.242 2 DEBUG oslo_concurrency.lockutils [req-40290699-764b-49a9-b3a6-bee567d1b26b req-ba87f10d-0e7c-4a32-9ad2-0d53e160adeb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-277581fe-2194-4331-bf7b-c2604b65125e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:27:12 compute-1 nova_compute[238822]: 2025-09-30 18:27:12.242 2 DEBUG nova.network.neutron [req-40290699-764b-49a9-b3a6-bee567d1b26b req-ba87f10d-0e7c-4a32-9ad2-0d53e160adeb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Refreshing network info cache for port 03023543-2a7a-4ad4-b500-cf06571e617c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:27:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:12.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:12 compute-1 nova_compute[238822]: 2025-09-30 18:27:12.659 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "refresh_cache-277581fe-2194-4331-bf7b-c2604b65125e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:27:12 compute-1 nova_compute[238822]: 2025-09-30 18:27:12.750 2 WARNING neutronclient.v2_0.client [req-40290699-764b-49a9-b3a6-bee567d1b26b req-ba87f10d-0e7c-4a32-9ad2-0d53e160adeb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:27:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:13.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:13 compute-1 nova_compute[238822]: 2025-09-30 18:27:13.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:13 compute-1 nova_compute[238822]: 2025-09-30 18:27:13.355 2 DEBUG nova.network.neutron [req-40290699-764b-49a9-b3a6-bee567d1b26b req-ba87f10d-0e7c-4a32-9ad2-0d53e160adeb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:27:13 compute-1 nova_compute[238822]: 2025-09-30 18:27:13.550 2 DEBUG nova.network.neutron [req-40290699-764b-49a9-b3a6-bee567d1b26b req-ba87f10d-0e7c-4a32-9ad2-0d53e160adeb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:27:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:27:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:13 compute-1 ceph-mon[75484]: pgmap v1457: 353 pgs: 353 active+clean; 88 MiB data, 264 MiB used, 40 GiB / 40 GiB avail; 1.2 MiB/s rd, 4.5 KiB/s wr, 43 op/s
Sep 30 18:27:14 compute-1 nova_compute[238822]: 2025-09-30 18:27:14.062 2 DEBUG oslo_concurrency.lockutils [req-40290699-764b-49a9-b3a6-bee567d1b26b req-ba87f10d-0e7c-4a32-9ad2-0d53e160adeb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-277581fe-2194-4331-bf7b-c2604b65125e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:27:14 compute-1 nova_compute[238822]: 2025-09-30 18:27:14.062 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquired lock "refresh_cache-277581fe-2194-4331-bf7b-c2604b65125e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:27:14 compute-1 nova_compute[238822]: 2025-09-30 18:27:14.063 2 DEBUG nova.network.neutron [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:27:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:14.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:14 compute-1 podman[284269]: 2025-09-30 18:27:14.422368927 +0000 UTC m=+0.064357809 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 18:27:14 compute-1 sudo[284290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:27:14 compute-1 sudo[284290]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:27:14 compute-1 sudo[284290]: pam_unix(sudo:session): session closed for user root
Sep 30 18:27:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:15.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:15 compute-1 nova_compute[238822]: 2025-09-30 18:27:15.410 2 DEBUG nova.network.neutron [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:27:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:15 compute-1 ceph-mon[75484]: pgmap v1458: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 1.5 MiB/s rd, 3.9 MiB/s wr, 128 op/s
Sep 30 18:27:16 compute-1 nova_compute[238822]: 2025-09-30 18:27:16.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:16.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:16 compute-1 nova_compute[238822]: 2025-09-30 18:27:16.380 2 WARNING neutronclient.v2_0.client [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:27:16 compute-1 nova_compute[238822]: 2025-09-30 18:27:16.653 2 DEBUG nova.network.neutron [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Updating instance_info_cache with network_info: [{"id": "03023543-2a7a-4ad4-b500-cf06571e617c", "address": "fa:16:3e:45:e1:4f", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03023543-2a", "ovs_interfaceid": "03023543-2a7a-4ad4-b500-cf06571e617c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:27:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:17.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.164 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Releasing lock "refresh_cache-277581fe-2194-4331-bf7b-c2604b65125e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.164 2 DEBUG nova.compute.manager [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Instance network_info: |[{"id": "03023543-2a7a-4ad4-b500-cf06571e617c", "address": "fa:16:3e:45:e1:4f", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03023543-2a", "ovs_interfaceid": "03023543-2a7a-4ad4-b500-cf06571e617c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.166 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Start _get_guest_xml network_info=[{"id": "03023543-2a7a-4ad4-b500-cf06571e617c", "address": "fa:16:3e:45:e1:4f", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03023543-2a", "ovs_interfaceid": "03023543-2a7a-4ad4-b500-cf06571e617c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.170 2 WARNING nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.171 2 DEBUG nova.virt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-322504074', uuid='277581fe-2194-4331-bf7b-c2604b65125e'), owner=OwnerMeta(userid='623ef4a55c9e4fc28bb65e49246b5008', username='tempest-TestExecuteStrategies-1883747907-project-admin', projectid='c634e1c17ed54907969576a0eb8eff50', projectname='tempest-TestExecuteStrategies-1883747907'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "03023543-2a7a-4ad4-b500-cf06571e617c", "address": "fa:16:3e:45:e1:4f", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03023543-2a", "ovs_interfaceid": "03023543-2a7a-4ad4-b500-cf06571e617c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759256837.1712348) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.175 2 DEBUG nova.virt.libvirt.host [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.176 2 DEBUG nova.virt.libvirt.host [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.180 2 DEBUG nova.virt.libvirt.host [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.180 2 DEBUG nova.virt.libvirt.host [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.181 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.181 2 DEBUG nova.virt.hardware [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.182 2 DEBUG nova.virt.hardware [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.183 2 DEBUG nova.virt.hardware [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.183 2 DEBUG nova.virt.hardware [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.184 2 DEBUG nova.virt.hardware [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.184 2 DEBUG nova.virt.hardware [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.185 2 DEBUG nova.virt.hardware [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.185 2 DEBUG nova.virt.hardware [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.186 2 DEBUG nova.virt.hardware [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.186 2 DEBUG nova.virt.hardware [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.186 2 DEBUG nova.virt.hardware [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.191 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:27:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:17.269 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:17 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:27:17 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/719707145' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.690 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.728 2 DEBUG nova.storage.rbd_utils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 277581fe-2194-4331-bf7b-c2604b65125e_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:27:17 compute-1 nova_compute[238822]: 2025-09-30 18:27:17.733 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:27:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:17 compute-1 ceph-mon[75484]: pgmap v1459: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 343 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Sep 30 18:27:17 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/719707145' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:27:18 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1840254682' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.202 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.204 2 DEBUG nova.virt.libvirt.vif [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:27:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-322504074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-322504074',id=19,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-0a1uu87b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:27:09Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=277581fe-2194-4331-bf7b-c2604b65125e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "03023543-2a7a-4ad4-b500-cf06571e617c", "address": "fa:16:3e:45:e1:4f", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03023543-2a", "ovs_interfaceid": "03023543-2a7a-4ad4-b500-cf06571e617c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.205 2 DEBUG nova.network.os_vif_util [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "03023543-2a7a-4ad4-b500-cf06571e617c", "address": "fa:16:3e:45:e1:4f", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03023543-2a", "ovs_interfaceid": "03023543-2a7a-4ad4-b500-cf06571e617c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.206 2 DEBUG nova.network.os_vif_util [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:e1:4f,bridge_name='br-int',has_traffic_filtering=True,id=03023543-2a7a-4ad4-b500-cf06571e617c,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03023543-2a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.208 2 DEBUG nova.objects.instance [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lazy-loading 'pci_devices' on Instance uuid 277581fe-2194-4331-bf7b-c2604b65125e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:27:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:18.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.720 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:27:18 compute-1 nova_compute[238822]:   <uuid>277581fe-2194-4331-bf7b-c2604b65125e</uuid>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   <name>instance-00000013</name>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteStrategies-server-322504074</nova:name>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:27:17</nova:creationTime>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:27:18 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:27:18 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:user uuid="623ef4a55c9e4fc28bb65e49246b5008">tempest-TestExecuteStrategies-1883747907-project-admin</nova:user>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:project uuid="c634e1c17ed54907969576a0eb8eff50">tempest-TestExecuteStrategies-1883747907</nova:project>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <nova:port uuid="03023543-2a7a-4ad4-b500-cf06571e617c">
Sep 30 18:27:18 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <system>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <entry name="serial">277581fe-2194-4331-bf7b-c2604b65125e</entry>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <entry name="uuid">277581fe-2194-4331-bf7b-c2604b65125e</entry>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     </system>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   <os>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   </os>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   <features>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   </features>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/277581fe-2194-4331-bf7b-c2604b65125e_disk">
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       </source>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/277581fe-2194-4331-bf7b-c2604b65125e_disk.config">
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       </source>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:27:18 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:45:e1:4f"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <target dev="tap03023543-2a"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/277581fe-2194-4331-bf7b-c2604b65125e/console.log" append="off"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <video>
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     </video>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:27:18 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:27:18 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:27:18 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:27:18 compute-1 nova_compute[238822]: </domain>
Sep 30 18:27:18 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.722 2 DEBUG nova.compute.manager [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Preparing to wait for external event network-vif-plugged-03023543-2a7a-4ad4-b500-cf06571e617c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.722 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "277581fe-2194-4331-bf7b-c2604b65125e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.723 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.723 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.724 2 DEBUG nova.virt.libvirt.vif [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:27:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-322504074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-322504074',id=19,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-0a1uu87b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:27:09Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=277581fe-2194-4331-bf7b-c2604b65125e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "03023543-2a7a-4ad4-b500-cf06571e617c", "address": "fa:16:3e:45:e1:4f", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03023543-2a", "ovs_interfaceid": "03023543-2a7a-4ad4-b500-cf06571e617c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.725 2 DEBUG nova.network.os_vif_util [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "03023543-2a7a-4ad4-b500-cf06571e617c", "address": "fa:16:3e:45:e1:4f", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03023543-2a", "ovs_interfaceid": "03023543-2a7a-4ad4-b500-cf06571e617c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.725 2 DEBUG nova.network.os_vif_util [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:e1:4f,bridge_name='br-int',has_traffic_filtering=True,id=03023543-2a7a-4ad4-b500-cf06571e617c,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03023543-2a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.725 2 DEBUG os_vif [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:e1:4f,bridge_name='br-int',has_traffic_filtering=True,id=03023543-2a7a-4ad4-b500-cf06571e617c,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03023543-2a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.726 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.726 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.727 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '3e1a37be-0c64-5a82-8060-b516b03014d8', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.738 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap03023543-2a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.739 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap03023543-2a, col_values=(('qos', UUID('7c0218ba-22bc-4d8e-b023-1182c7a5ed96')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.739 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap03023543-2a, col_values=(('external_ids', {'iface-id': '03023543-2a7a-4ad4-b500-cf06571e617c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:e1:4f', 'vm-uuid': '277581fe-2194-4331-bf7b-c2604b65125e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:18 compute-1 NetworkManager[45549]: <info>  [1759256838.7438] manager: (tap03023543-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:27:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:18 compute-1 nova_compute[238822]: 2025-09-30 18:27:18.756 2 INFO os_vif [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:e1:4f,bridge_name='br-int',has_traffic_filtering=True,id=03023543-2a7a-4ad4-b500-cf06571e617c,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03023543-2a')
Sep 30 18:27:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:18 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1840254682' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:27:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:19.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:19 compute-1 openstack_network_exporter[251957]: ERROR   18:27:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:27:19 compute-1 openstack_network_exporter[251957]: ERROR   18:27:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:27:19 compute-1 openstack_network_exporter[251957]: ERROR   18:27:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:27:19 compute-1 openstack_network_exporter[251957]: ERROR   18:27:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:27:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:27:19 compute-1 openstack_network_exporter[251957]: ERROR   18:27:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:27:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:27:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:19 compute-1 ceph-mon[75484]: pgmap v1460: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 343 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Sep 30 18:27:20 compute-1 nova_compute[238822]: 2025-09-30 18:27:20.318 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:27:20 compute-1 nova_compute[238822]: 2025-09-30 18:27:20.319 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:27:20 compute-1 nova_compute[238822]: 2025-09-30 18:27:20.319 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] No VIF found with MAC fa:16:3e:45:e1:4f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:27:20 compute-1 nova_compute[238822]: 2025-09-30 18:27:20.320 2 INFO nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Using config drive
Sep 30 18:27:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:20.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:20 compute-1 nova_compute[238822]: 2025-09-30 18:27:20.356 2 DEBUG nova.storage.rbd_utils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 277581fe-2194-4331-bf7b-c2604b65125e_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:27:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:20 compute-1 nova_compute[238822]: 2025-09-30 18:27:20.876 2 WARNING neutronclient.v2_0.client [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:27:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:21.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:21 compute-1 nova_compute[238822]: 2025-09-30 18:27:21.404 2 INFO nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Creating config drive at /var/lib/nova/instances/277581fe-2194-4331-bf7b-c2604b65125e/disk.config
Sep 30 18:27:21 compute-1 nova_compute[238822]: 2025-09-30 18:27:21.414 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/277581fe-2194-4331-bf7b-c2604b65125e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpcpz51739 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:27:21 compute-1 nova_compute[238822]: 2025-09-30 18:27:21.561 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/277581fe-2194-4331-bf7b-c2604b65125e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpcpz51739" returned: 0 in 0.147s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:27:21 compute-1 nova_compute[238822]: 2025-09-30 18:27:21.607 2 DEBUG nova.storage.rbd_utils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 277581fe-2194-4331-bf7b-c2604b65125e_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:27:21 compute-1 nova_compute[238822]: 2025-09-30 18:27:21.612 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/277581fe-2194-4331-bf7b-c2604b65125e/disk.config 277581fe-2194-4331-bf7b-c2604b65125e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:27:21 compute-1 nova_compute[238822]: 2025-09-30 18:27:21.830 2 DEBUG oslo_concurrency.processutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/277581fe-2194-4331-bf7b-c2604b65125e/disk.config 277581fe-2194-4331-bf7b-c2604b65125e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.218s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:27:21 compute-1 nova_compute[238822]: 2025-09-30 18:27:21.831 2 INFO nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Deleting local config drive /var/lib/nova/instances/277581fe-2194-4331-bf7b-c2604b65125e/disk.config because it was imported into RBD.
Sep 30 18:27:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:21 compute-1 systemd[1]: Starting libvirt secret daemon...
Sep 30 18:27:21 compute-1 systemd[1]: Started libvirt secret daemon.
Sep 30 18:27:21 compute-1 podman[284448]: 2025-09-30 18:27:21.990453598 +0000 UTC m=+0.094383460 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 18:27:21 compute-1 podman[284446]: 2025-09-30 18:27:21.990991672 +0000 UTC m=+0.098714477 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 18:27:21 compute-1 kernel: tap03023543-2a: entered promiscuous mode
Sep 30 18:27:22 compute-1 NetworkManager[45549]: <info>  [1759256842.0021] manager: (tap03023543-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Sep 30 18:27:22 compute-1 ovn_controller[135204]: 2025-09-30T18:27:22Z|00154|binding|INFO|Claiming lport 03023543-2a7a-4ad4-b500-cf06571e617c for this chassis.
Sep 30 18:27:22 compute-1 ovn_controller[135204]: 2025-09-30T18:27:22Z|00155|binding|INFO|03023543-2a7a-4ad4-b500-cf06571e617c: Claiming fa:16:3e:45:e1:4f 10.100.0.12
Sep 30 18:27:22 compute-1 nova_compute[238822]: 2025-09-30 18:27:22.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:22 compute-1 podman[284447]: 2025-09-30 18:27:22.019720168 +0000 UTC m=+0.123421354 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 18:27:22 compute-1 ceph-mon[75484]: pgmap v1461: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 343 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.024 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:e1:4f 10.100.0.12'], port_security=['fa:16:3e:45:e1:4f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '277581fe-2194-4331-bf7b-c2604b65125e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '4', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=03023543-2a7a-4ad4-b500-cf06571e617c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.025 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 03023543-2a7a-4ad4-b500-cf06571e617c in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 bound to our chassis
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.027 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:27:22 compute-1 ovn_controller[135204]: 2025-09-30T18:27:22Z|00156|binding|INFO|Setting lport 03023543-2a7a-4ad4-b500-cf06571e617c ovn-installed in OVS
Sep 30 18:27:22 compute-1 ovn_controller[135204]: 2025-09-30T18:27:22Z|00157|binding|INFO|Setting lport 03023543-2a7a-4ad4-b500-cf06571e617c up in Southbound
Sep 30 18:27:22 compute-1 nova_compute[238822]: 2025-09-30 18:27:22.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.043 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[39ea321c-9da5-4c9c-b08d-6cc980600593]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.044 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6901f664-31 in ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.046 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6901f664-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.047 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d323299b-34d6-4ecf-b7bf-edc7bc61c464]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.047 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d8f3cc67-ccf2-414f-a054-b56b94930db5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 systemd-udevd[284530]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.062 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[ec3e361a-10d3-4177-8a4d-8f9f1b40cfc1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 NetworkManager[45549]: <info>  [1759256842.0733] device (tap03023543-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:27:22 compute-1 NetworkManager[45549]: <info>  [1759256842.0748] device (tap03023543-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:27:22 compute-1 systemd-machined[195911]: New machine qemu-13-instance-00000013.
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.078 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2af56f00-506d-488c-9019-5db2bd7be445]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 systemd[1]: Started Virtual Machine qemu-13-instance-00000013.
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.114 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[fced27ad-ec4d-4ac3-baf8-c5ac90981f38]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 NetworkManager[45549]: <info>  [1759256842.1209] manager: (tap6901f664-30): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.121 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[08a22ee5-5379-452e-8947-38b70462a809]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.158 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[d99c8750-2335-424d-98e1-fb6c1f33dbfa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.161 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[cb47298a-ba12-46e3-a28a-0c8ff52e21bb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 NetworkManager[45549]: <info>  [1759256842.1874] device (tap6901f664-30): carrier: link connected
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.191 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[52510027-c41c-4219-b75f-b1ee544a37b1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.206 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5f6ec5bb-4ff2-4300-892c-00eac3de071e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1468376, 'reachable_time': 19099, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284564, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.220 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7cbd5100-f00f-4397-ad77-5c90ba8f001a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:412a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1468376, 'tstamp': 1468376}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 284565, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.235 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a9049b1f-a6b4-479e-9375-c1642970306e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1468376, 'reachable_time': 19099, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 284566, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.259 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae0685b-61c1-4ffc-97ad-a11602babf50]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.310 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ebfea7b3-9db8-445f-b8cc-7c8ff65df631]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.311 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.311 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.312 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:22 compute-1 nova_compute[238822]: 2025-09-30 18:27:22.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:22 compute-1 kernel: tap6901f664-30: entered promiscuous mode
Sep 30 18:27:22 compute-1 NetworkManager[45549]: <info>  [1759256842.3164] manager: (tap6901f664-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Sep 30 18:27:22 compute-1 nova_compute[238822]: 2025-09-30 18:27:22.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.319 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:22 compute-1 nova_compute[238822]: 2025-09-30 18:27:22.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:22 compute-1 ovn_controller[135204]: 2025-09-30T18:27:22Z|00158|binding|INFO|Releasing lport 5b6cbf18-1826-41d0-920f-e9db4f1a1832 from this chassis (sb_readonly=0)
Sep 30 18:27:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:22.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:22 compute-1 nova_compute[238822]: 2025-09-30 18:27:22.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:22 compute-1 nova_compute[238822]: 2025-09-30 18:27:22.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.350 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[70564e5a-a2e5-444d-94b6-8b52c1991e5c]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.351 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.351 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.352 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 6901f664-336b-42d2-bbf7-58951befc8d1 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.352 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.352 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5b1d0f5a-6c3c-4833-86ce-5179efa3a9b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.353 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.353 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[59453e43-5420-4404-aeb8-07ee81a0d787]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.354 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:27:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:22.355 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'env', 'PROCESS_TAG=haproxy-6901f664-336b-42d2-bbf7-58951befc8d1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6901f664-336b-42d2-bbf7-58951befc8d1.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:27:22 compute-1 nova_compute[238822]: 2025-09-30 18:27:22.529 2 DEBUG nova.compute.manager [req-4c0939ef-3af8-475a-8175-9de884b0cf7c req-e00af0a9-1f79-4c51-8bd6-dc320f5b6f6f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Received event network-vif-plugged-03023543-2a7a-4ad4-b500-cf06571e617c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:27:22 compute-1 nova_compute[238822]: 2025-09-30 18:27:22.530 2 DEBUG oslo_concurrency.lockutils [req-4c0939ef-3af8-475a-8175-9de884b0cf7c req-e00af0a9-1f79-4c51-8bd6-dc320f5b6f6f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "277581fe-2194-4331-bf7b-c2604b65125e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:27:22 compute-1 nova_compute[238822]: 2025-09-30 18:27:22.530 2 DEBUG oslo_concurrency.lockutils [req-4c0939ef-3af8-475a-8175-9de884b0cf7c req-e00af0a9-1f79-4c51-8bd6-dc320f5b6f6f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:27:22 compute-1 nova_compute[238822]: 2025-09-30 18:27:22.531 2 DEBUG oslo_concurrency.lockutils [req-4c0939ef-3af8-475a-8175-9de884b0cf7c req-e00af0a9-1f79-4c51-8bd6-dc320f5b6f6f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:27:22 compute-1 nova_compute[238822]: 2025-09-30 18:27:22.531 2 DEBUG nova.compute.manager [req-4c0939ef-3af8-475a-8175-9de884b0cf7c req-e00af0a9-1f79-4c51-8bd6-dc320f5b6f6f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Processing event network-vif-plugged-03023543-2a7a-4ad4-b500-cf06571e617c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:27:22 compute-1 unix_chkpwd[284581]: password check failed for user (root)
Sep 30 18:27:22 compute-1 sshd-session[284407]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:27:22 compute-1 podman[284599]: 2025-09-30 18:27:22.825099661 +0000 UTC m=+0.077321900 container create 53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Sep 30 18:27:22 compute-1 podman[284599]: 2025-09-30 18:27:22.786384995 +0000 UTC m=+0.038607254 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:27:22 compute-1 systemd[1]: Started libpod-conmon-53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b.scope.
Sep 30 18:27:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:22 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:27:22 compute-1 sudo[284613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:27:22 compute-1 sudo[284613]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:27:22 compute-1 sudo[284613]: pam_unix(sudo:session): session closed for user root
Sep 30 18:27:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9c256e033802b2d778797931e048cc39c7717dac5c85132c237ba18ed41446e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:27:22 compute-1 podman[284599]: 2025-09-30 18:27:22.964831835 +0000 UTC m=+0.217054064 container init 53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 18:27:22 compute-1 podman[284599]: 2025-09-30 18:27:22.97094439 +0000 UTC m=+0.223166599 container start 53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:27:22 compute-1 sshd-session[284655]: Invalid user gpadmin from 167.71.248.239 port 36254
Sep 30 18:27:22 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[284644]: [NOTICE]   (284694) : New worker (284706) forked
Sep 30 18:27:22 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[284644]: [NOTICE]   (284694) : Loading success.
Sep 30 18:27:23 compute-1 sshd-session[284655]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:27:23 compute-1 sshd-session[284655]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.71.248.239
Sep 30 18:27:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:27:23 compute-1 sudo[284673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:27:23 compute-1 sudo[284673]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:27:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:23.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:23 compute-1 nova_compute[238822]: 2025-09-30 18:27:23.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:23 compute-1 nova_compute[238822]: 2025-09-30 18:27:23.580 2 DEBUG nova.compute.manager [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:27:23 compute-1 nova_compute[238822]: 2025-09-30 18:27:23.589 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:27:23 compute-1 nova_compute[238822]: 2025-09-30 18:27:23.594 2 INFO nova.virt.libvirt.driver [-] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Instance spawned successfully.
Sep 30 18:27:23 compute-1 nova_compute[238822]: 2025-09-30 18:27:23.595 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:27:23 compute-1 sudo[284673]: pam_unix(sudo:session): session closed for user root
Sep 30 18:27:23 compute-1 nova_compute[238822]: 2025-09-30 18:27:23.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:27:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:24 compute-1 ceph-mon[75484]: pgmap v1462: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 325 KiB/s rd, 3.9 MiB/s wr, 85 op/s
Sep 30 18:27:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:27:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:27:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:27:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:27:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:27:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:27:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.111 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.112 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.112 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.113 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.113 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.114 2 DEBUG nova.virt.libvirt.driver [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:27:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:24.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.598 2 DEBUG nova.compute.manager [req-ef83b636-483c-4c12-9f04-e8d28cf99ce1 req-3089e980-e4d7-4a6a-b56e-40303cc5a81d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Received event network-vif-plugged-03023543-2a7a-4ad4-b500-cf06571e617c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.599 2 DEBUG oslo_concurrency.lockutils [req-ef83b636-483c-4c12-9f04-e8d28cf99ce1 req-3089e980-e4d7-4a6a-b56e-40303cc5a81d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "277581fe-2194-4331-bf7b-c2604b65125e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.599 2 DEBUG oslo_concurrency.lockutils [req-ef83b636-483c-4c12-9f04-e8d28cf99ce1 req-3089e980-e4d7-4a6a-b56e-40303cc5a81d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.599 2 DEBUG oslo_concurrency.lockutils [req-ef83b636-483c-4c12-9f04-e8d28cf99ce1 req-3089e980-e4d7-4a6a-b56e-40303cc5a81d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.600 2 DEBUG nova.compute.manager [req-ef83b636-483c-4c12-9f04-e8d28cf99ce1 req-3089e980-e4d7-4a6a-b56e-40303cc5a81d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] No waiting events found dispatching network-vif-plugged-03023543-2a7a-4ad4-b500-cf06571e617c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.600 2 WARNING nova.compute.manager [req-ef83b636-483c-4c12-9f04-e8d28cf99ce1 req-3089e980-e4d7-4a6a-b56e-40303cc5a81d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Received unexpected event network-vif-plugged-03023543-2a7a-4ad4-b500-cf06571e617c for instance with vm_state building and task_state spawning.
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.627 2 INFO nova.compute.manager [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Took 13.70 seconds to spawn the instance on the hypervisor.
Sep 30 18:27:24 compute-1 nova_compute[238822]: 2025-09-30 18:27:24.628 2 DEBUG nova.compute.manager [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:27:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:24 compute-1 sshd-session[284407]: Failed password for root from 175.126.165.170 port 47360 ssh2
Sep 30 18:27:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:25.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:25 compute-1 nova_compute[238822]: 2025-09-30 18:27:25.165 2 INFO nova.compute.manager [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Took 19.45 seconds to build instance.
Sep 30 18:27:25 compute-1 sshd-session[284655]: Failed password for invalid user gpadmin from 167.71.248.239 port 36254 ssh2
Sep 30 18:27:25 compute-1 sshd-session[284407]: Received disconnect from 175.126.165.170 port 47360:11: Bye Bye [preauth]
Sep 30 18:27:25 compute-1 sshd-session[284407]: Disconnected from authenticating user root 175.126.165.170 port 47360 [preauth]
Sep 30 18:27:25 compute-1 nova_compute[238822]: 2025-09-30 18:27:25.674 2 DEBUG oslo_concurrency.lockutils [None req-be434173-16d8-4b3e-89ea-89553f66826b 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.976s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:27:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:26 compute-1 ceph-mon[75484]: pgmap v1463: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 332 KiB/s rd, 3.9 MiB/s wr, 95 op/s
Sep 30 18:27:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:26.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:26 compute-1 sshd-session[284655]: Connection closed by invalid user gpadmin 167.71.248.239 port 36254 [preauth]
Sep 30 18:27:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:27.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:28 compute-1 ceph-mon[75484]: pgmap v1464: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 7.3 KiB/s rd, 26 KiB/s wr, 10 op/s
Sep 30 18:27:28 compute-1 nova_compute[238822]: 2025-09-30 18:27:28.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:27:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:28.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:27:28 compute-1 nova_compute[238822]: 2025-09-30 18:27:28.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:27:28 compute-1 sudo[284762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:27:28 compute-1 sudo[284762]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:27:28 compute-1 sudo[284762]: pam_unix(sudo:session): session closed for user root
Sep 30 18:27:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:29.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:29 compute-1 ceph-mon[75484]: pgmap v1465: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 1.1 MiB/s rd, 26 KiB/s wr, 47 op/s
Sep 30 18:27:29 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:27:29 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:27:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:30.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:30 compute-1 unix_chkpwd[284790]: password check failed for user (root)
Sep 30 18:27:30 compute-1 sshd-session[284761]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:27:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:31.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:31 compute-1 ceph-mon[75484]: pgmap v1466: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 25 KiB/s wr, 74 op/s
Sep 30 18:27:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:32.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:33 compute-1 sshd-session[284761]: Failed password for root from 192.210.160.141 port 39122 ssh2
Sep 30 18:27:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:33.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:33 compute-1 nova_compute[238822]: 2025-09-30 18:27:33.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:33 compute-1 ceph-mon[75484]: pgmap v1467: 353 pgs: 353 active+clean; 167 MiB data, 310 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Sep 30 18:27:33 compute-1 sshd-session[284761]: Connection closed by authenticating user root 192.210.160.141 port 39122 [preauth]
Sep 30 18:27:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:27:33 compute-1 nova_compute[238822]: 2025-09-30 18:27:33.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:34 compute-1 nova_compute[238822]: 2025-09-30 18:27:34.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:27:34 compute-1 nova_compute[238822]: 2025-09-30 18:27:34.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:27:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:34.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:34 compute-1 sudo[284795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:27:34 compute-1 sudo[284795]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:27:34 compute-1 sudo[284795]: pam_unix(sudo:session): session closed for user root
Sep 30 18:27:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:35.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:35 compute-1 ceph-mon[75484]: pgmap v1468: 353 pgs: 353 active+clean; 167 MiB data, 311 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:27:35 compute-1 podman[249638]: time="2025-09-30T18:27:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:27:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:27:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:27:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:27:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8809 "" "Go-http-client/1.1"
Sep 30 18:27:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:36 compute-1 nova_compute[238822]: 2025-09-30 18:27:36.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:27:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:36.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:36 compute-1 ovn_controller[135204]: 2025-09-30T18:27:36Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:e1:4f 10.100.0.12
Sep 30 18:27:36 compute-1 ovn_controller[135204]: 2025-09-30T18:27:36Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:e1:4f 10.100.0.12
Sep 30 18:27:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/800605053' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:27:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/800605053' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:27:36 compute-1 sshd-session[284821]: Invalid user pi from 185.156.73.233 port 27048
Sep 30 18:27:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:36 compute-1 sshd-session[284821]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:27:36 compute-1 sshd-session[284821]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233
Sep 30 18:27:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:37.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:37 compute-1 ceph-mon[75484]: pgmap v1469: 353 pgs: 353 active+clean; 167 MiB data, 311 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1023 B/s wr, 64 op/s
Sep 30 18:27:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:27:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:38 compute-1 nova_compute[238822]: 2025-09-30 18:27:38.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:38.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:27:38 compute-1 nova_compute[238822]: 2025-09-30 18:27:38.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:39 compute-1 nova_compute[238822]: 2025-09-30 18:27:39.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:27:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:39.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:39 compute-1 podman[284830]: 2025-09-30 18:27:39.557665265 +0000 UTC m=+0.095506810 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:27:39 compute-1 nova_compute[238822]: 2025-09-30 18:27:39.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:27:39 compute-1 nova_compute[238822]: 2025-09-30 18:27:39.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:27:39 compute-1 nova_compute[238822]: 2025-09-30 18:27:39.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:27:39 compute-1 nova_compute[238822]: 2025-09-30 18:27:39.574 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:27:39 compute-1 nova_compute[238822]: 2025-09-30 18:27:39.574 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:27:39 compute-1 podman[284829]: 2025-09-30 18:27:39.586498294 +0000 UTC m=+0.129344804 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Sep 30 18:27:39 compute-1 ceph-mon[75484]: pgmap v1470: 353 pgs: 353 active+clean; 188 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 1.2 MiB/s wr, 107 op/s
Sep 30 18:27:39 compute-1 sshd-session[284821]: Failed password for invalid user pi from 185.156.73.233 port 27048 ssh2
Sep 30 18:27:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:40 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:27:40 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/957335910' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:27:40 compute-1 nova_compute[238822]: 2025-09-30 18:27:40.055 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:27:40 compute-1 sshd-session[284821]: Connection closed by invalid user pi 185.156.73.233 port 27048 [preauth]
Sep 30 18:27:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:27:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:40.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:27:40 compute-1 sshd-session[284827]: Invalid user seekcy from 14.225.167.110 port 53648
Sep 30 18:27:40 compute-1 sshd-session[284827]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:27:40 compute-1 sshd-session[284827]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:27:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/957335910' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:27:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:41.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:41 compute-1 nova_compute[238822]: 2025-09-30 18:27:41.127 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000013 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:27:41 compute-1 nova_compute[238822]: 2025-09-30 18:27:41.127 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000013 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:27:41 compute-1 nova_compute[238822]: 2025-09-30 18:27:41.374 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:27:41 compute-1 nova_compute[238822]: 2025-09-30 18:27:41.375 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:27:41 compute-1 nova_compute[238822]: 2025-09-30 18:27:41.400 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:27:41 compute-1 nova_compute[238822]: 2025-09-30 18:27:41.401 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4500MB free_disk=39.91184616088867GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:27:41 compute-1 nova_compute[238822]: 2025-09-30 18:27:41.402 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:27:41 compute-1 nova_compute[238822]: 2025-09-30 18:27:41.402 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:27:41 compute-1 ceph-mon[75484]: pgmap v1471: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 1.1 MiB/s rd, 2.1 MiB/s wr, 90 op/s
Sep 30 18:27:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2859010351' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:27:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:42 compute-1 nova_compute[238822]: 2025-09-30 18:27:42.095 2 DEBUG nova.virt.libvirt.driver [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Creating tmpfile /var/lib/nova/instances/tmpextsh8xs to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:27:42 compute-1 nova_compute[238822]: 2025-09-30 18:27:42.097 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:27:42 compute-1 nova_compute[238822]: 2025-09-30 18:27:42.173 2 DEBUG nova.compute.manager [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpextsh8xs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:27:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:42.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:42 compute-1 nova_compute[238822]: 2025-09-30 18:27:42.450 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 277581fe-2194-4331-bf7b-c2604b65125e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:27:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:42 compute-1 sshd-session[284827]: Failed password for invalid user seekcy from 14.225.167.110 port 53648 ssh2
Sep 30 18:27:42 compute-1 nova_compute[238822]: 2025-09-30 18:27:42.960 2 WARNING nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance d188c0fb-8668-4ab2-b174-49e0e20505ba has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 18:27:42 compute-1 nova_compute[238822]: 2025-09-30 18:27:42.960 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:27:42 compute-1 nova_compute[238822]: 2025-09-30 18:27:42.960 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:27:41 up  4:05,  0 user,  load average: 0.52, 0.42, 0.72\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_c634e1c17ed54907969576a0eb8eff50': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:27:42 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 18:27:42 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 18:27:43 compute-1 nova_compute[238822]: 2025-09-30 18:27:43.016 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:27:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:27:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:43.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:27:43 compute-1 nova_compute[238822]: 2025-09-30 18:27:43.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:27:43 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3835132069' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:27:43 compute-1 nova_compute[238822]: 2025-09-30 18:27:43.474 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:27:43 compute-1 nova_compute[238822]: 2025-09-30 18:27:43.481 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:27:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:27:43 compute-1 nova_compute[238822]: 2025-09-30 18:27:43.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:43 compute-1 ceph-mon[75484]: pgmap v1472: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Sep 30 18:27:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3835132069' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:27:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:43 compute-1 nova_compute[238822]: 2025-09-30 18:27:43.991 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:27:44 compute-1 nova_compute[238822]: 2025-09-30 18:27:44.215 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:27:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:44.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:44 compute-1 nova_compute[238822]: 2025-09-30 18:27:44.505 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:27:44 compute-1 nova_compute[238822]: 2025-09-30 18:27:44.506 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:27:44 compute-1 sshd-session[284827]: Received disconnect from 14.225.167.110 port 53648:11: Bye Bye [preauth]
Sep 30 18:27:44 compute-1 sshd-session[284827]: Disconnected from invalid user seekcy 14.225.167.110 port 53648 [preauth]
Sep 30 18:27:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4068927978' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:27:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:45.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:45 compute-1 podman[284931]: 2025-09-30 18:27:45.551004857 +0000 UTC m=+0.085826849 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 18:27:45 compute-1 ceph-mon[75484]: pgmap v1473: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:27:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:46.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:46 compute-1 nova_compute[238822]: 2025-09-30 18:27:46.502 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:27:46 compute-1 nova_compute[238822]: 2025-09-30 18:27:46.502 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:27:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:47 compute-1 nova_compute[238822]: 2025-09-30 18:27:47.013 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:27:47 compute-1 nova_compute[238822]: 2025-09-30 18:27:47.014 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:27:47 compute-1 nova_compute[238822]: 2025-09-30 18:27:47.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:27:47 compute-1 nova_compute[238822]: 2025-09-30 18:27:47.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:27:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:27:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:47.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:27:47 compute-1 ceph-mon[75484]: pgmap v1474: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:27:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:48 compute-1 unix_chkpwd[284955]: password check failed for user (root)
Sep 30 18:27:48 compute-1 sshd-session[284953]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201  user=root
Sep 30 18:27:48 compute-1 nova_compute[238822]: 2025-09-30 18:27:48.156 2 DEBUG nova.compute.manager [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpextsh8xs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d188c0fb-8668-4ab2-b174-49e0e20505ba',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:27:48 compute-1 nova_compute[238822]: 2025-09-30 18:27:48.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:48.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:27:48 compute-1 nova_compute[238822]: 2025-09-30 18:27:48.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:49.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:49 compute-1 nova_compute[238822]: 2025-09-30 18:27:49.176 2 DEBUG oslo_concurrency.lockutils [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-d188c0fb-8668-4ab2-b174-49e0e20505ba" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:27:49 compute-1 nova_compute[238822]: 2025-09-30 18:27:49.177 2 DEBUG oslo_concurrency.lockutils [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-d188c0fb-8668-4ab2-b174-49e0e20505ba" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:27:49 compute-1 nova_compute[238822]: 2025-09-30 18:27:49.177 2 DEBUG nova.network.neutron [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:27:49 compute-1 openstack_network_exporter[251957]: ERROR   18:27:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:27:49 compute-1 openstack_network_exporter[251957]: ERROR   18:27:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:27:49 compute-1 openstack_network_exporter[251957]: ERROR   18:27:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:27:49 compute-1 openstack_network_exporter[251957]: ERROR   18:27:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:27:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:27:49 compute-1 openstack_network_exporter[251957]: ERROR   18:27:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:27:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:27:49 compute-1 nova_compute[238822]: 2025-09-30 18:27:49.686 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:27:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:50 compute-1 ceph-mon[75484]: pgmap v1475: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:27:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:50.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:50 compute-1 nova_compute[238822]: 2025-09-30 18:27:50.717 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:27:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:50 compute-1 sshd-session[284953]: Failed password for root from 8.243.64.201 port 40088 ssh2
Sep 30 18:27:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:51.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:51 compute-1 nova_compute[238822]: 2025-09-30 18:27:51.682 2 DEBUG nova.network.neutron [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Updating instance_info_cache with network_info: [{"id": "ac8e3da5-cb09-4223-89d5-318d077ea35e", "address": "fa:16:3e:bb:e0:b1", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8e3da5-cb", "ovs_interfaceid": "ac8e3da5-cb09-4223-89d5-318d077ea35e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:27:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:52 compute-1 ovn_controller[135204]: 2025-09-30T18:27:52Z|00159|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.192 2 DEBUG oslo_concurrency.lockutils [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-d188c0fb-8668-4ab2-b174-49e0e20505ba" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.216 2 DEBUG nova.virt.libvirt.driver [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpextsh8xs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d188c0fb-8668-4ab2-b174-49e0e20505ba',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.217 2 DEBUG nova.virt.libvirt.driver [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Creating instance directory: /var/lib/nova/instances/d188c0fb-8668-4ab2-b174-49e0e20505ba pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.217 2 DEBUG nova.virt.libvirt.driver [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Ensure instance console log exists: /var/lib/nova/instances/d188c0fb-8668-4ab2-b174-49e0e20505ba/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.218 2 DEBUG nova.virt.libvirt.driver [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.218 2 DEBUG nova.virt.libvirt.vif [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:26:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1528882058',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1528882058',id=18,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:26:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-1gsulz3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:26:58Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=d188c0fb-8668-4ab2-b174-49e0e20505ba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ac8e3da5-cb09-4223-89d5-318d077ea35e", "address": "fa:16:3e:bb:e0:b1", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapac8e3da5-cb", "ovs_interfaceid": "ac8e3da5-cb09-4223-89d5-318d077ea35e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.219 2 DEBUG nova.network.os_vif_util [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "ac8e3da5-cb09-4223-89d5-318d077ea35e", "address": "fa:16:3e:bb:e0:b1", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapac8e3da5-cb", "ovs_interfaceid": "ac8e3da5-cb09-4223-89d5-318d077ea35e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.220 2 DEBUG nova.network.os_vif_util [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e0:b1,bridge_name='br-int',has_traffic_filtering=True,id=ac8e3da5-cb09-4223-89d5-318d077ea35e,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac8e3da5-cb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.220 2 DEBUG os_vif [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e0:b1,bridge_name='br-int',has_traffic_filtering=True,id=ac8e3da5-cb09-4223-89d5-318d077ea35e,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac8e3da5-cb') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.222 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.222 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.223 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '59bbf51e-6ca6-5dd3-bbfe-36699f40d458', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.230 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac8e3da5-cb, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.230 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapac8e3da5-cb, col_values=(('qos', UUID('04d1d318-a1c8-4d38-a3f0-1d6f84c98c73')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.231 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapac8e3da5-cb, col_values=(('external_ids', {'iface-id': 'ac8e3da5-cb09-4223-89d5-318d077ea35e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:e0:b1', 'vm-uuid': 'd188c0fb-8668-4ab2-b174-49e0e20505ba'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:52 compute-1 NetworkManager[45549]: <info>  [1759256872.2334] manager: (tapac8e3da5-cb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.246 2 INFO os_vif [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:e0:b1,bridge_name='br-int',has_traffic_filtering=True,id=ac8e3da5-cb09-4223-89d5-318d077ea35e,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac8e3da5-cb')
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.246 2 DEBUG nova.virt.libvirt.driver [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.247 2 DEBUG nova.compute.manager [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpextsh8xs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d188c0fb-8668-4ab2-b174-49e0e20505ba',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.248 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:27:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:52.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:52 compute-1 ceph-mon[75484]: pgmap v1476: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 138 KiB/s rd, 947 KiB/s wr, 21 op/s
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.412 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:27:52 compute-1 podman[284963]: 2025-09-30 18:27:52.557536582 +0000 UTC m=+0.083128526 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, tcib_build_tag=watcher_latest, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 18:27:52 compute-1 podman[284964]: 2025-09-30 18:27:52.567403448 +0000 UTC m=+0.084408530 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.buildah.version=1.33.7, name=ubi9-minimal)
Sep 30 18:27:52 compute-1 podman[284965]: 2025-09-30 18:27:52.576788382 +0000 UTC m=+0.087893055 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:27:52 compute-1 nova_compute[238822]: 2025-09-30 18:27:52.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:52.697 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:27:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:52.699 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:27:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:53 compute-1 nova_compute[238822]: 2025-09-30 18:27:53.030 2 DEBUG nova.network.neutron [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Port ac8e3da5-cb09-4223-89d5-318d077ea35e updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:27:53 compute-1 nova_compute[238822]: 2025-09-30 18:27:53.044 2 DEBUG nova.compute.manager [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpextsh8xs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d188c0fb-8668-4ab2-b174-49e0e20505ba',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:27:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:53.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:53 compute-1 nova_compute[238822]: 2025-09-30 18:27:53.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:53 compute-1 sshd-session[284953]: Received disconnect from 8.243.64.201 port 40088:11: Bye Bye [preauth]
Sep 30 18:27:53 compute-1 sshd-session[284953]: Disconnected from authenticating user root 8.243.64.201 port 40088 [preauth]
Sep 30 18:27:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:27:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:27:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:54.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:54.386 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:27:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:54.386 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:27:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:54.387 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:27:54 compute-1 ceph-mon[75484]: pgmap v1477: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 15 KiB/s wr, 0 op/s
Sep 30 18:27:54 compute-1 sudo[285028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:27:54 compute-1 sudo[285028]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:27:54 compute-1 sudo[285028]: pam_unix(sudo:session): session closed for user root
Sep 30 18:27:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:55.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:55 compute-1 systemd[1]: Starting libvirt proxy daemon...
Sep 30 18:27:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:55 compute-1 systemd[1]: Started libvirt proxy daemon.
Sep 30 18:27:56 compute-1 kernel: tapac8e3da5-cb: entered promiscuous mode
Sep 30 18:27:56 compute-1 NetworkManager[45549]: <info>  [1759256876.0687] manager: (tapac8e3da5-cb): new Tun device (/org/freedesktop/NetworkManager/Devices/66)
Sep 30 18:27:56 compute-1 nova_compute[238822]: 2025-09-30 18:27:56.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:56 compute-1 ovn_controller[135204]: 2025-09-30T18:27:56Z|00160|binding|INFO|Claiming lport ac8e3da5-cb09-4223-89d5-318d077ea35e for this additional chassis.
Sep 30 18:27:56 compute-1 ovn_controller[135204]: 2025-09-30T18:27:56Z|00161|binding|INFO|ac8e3da5-cb09-4223-89d5-318d077ea35e: Claiming fa:16:3e:bb:e0:b1 10.100.0.14
Sep 30 18:27:56 compute-1 ceph-mon[75484]: pgmap v1478: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.079 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:e0:b1 10.100.0.14'], port_security=['fa:16:3e:bb:e0:b1 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'd188c0fb-8668-4ab2-b174-49e0e20505ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '10', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ac8e3da5-cb09-4223-89d5-318d077ea35e) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.082 144543 INFO neutron.agent.ovn.metadata.agent [-] Port ac8e3da5-cb09-4223-89d5-318d077ea35e in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.083 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:27:56 compute-1 ovn_controller[135204]: 2025-09-30T18:27:56Z|00162|binding|INFO|Setting lport ac8e3da5-cb09-4223-89d5-318d077ea35e ovn-installed in OVS
Sep 30 18:27:56 compute-1 nova_compute[238822]: 2025-09-30 18:27:56.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:56 compute-1 nova_compute[238822]: 2025-09-30 18:27:56.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.106 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0b2b36-6d7d-42e2-892a-e040b88fc6ac]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:56 compute-1 systemd-udevd[285087]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:27:56 compute-1 systemd-machined[195911]: New machine qemu-14-instance-00000012.
Sep 30 18:27:56 compute-1 NetworkManager[45549]: <info>  [1759256876.1431] device (tapac8e3da5-cb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:27:56 compute-1 NetworkManager[45549]: <info>  [1759256876.1444] device (tapac8e3da5-cb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:27:56 compute-1 systemd[1]: Started Virtual Machine qemu-14-instance-00000012.
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.162 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[e12b183a-270d-44dd-8d5c-b14f1e3ec3c0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.166 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[96a680e8-6d92-4eb3-afd6-079fb37b81bc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.210 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[3fb8473a-2b2a-4b38-84c6-6e21710076bd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.238 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[cc3339f8-57f9-4879-930e-8e7a0344985f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1468376, 'reachable_time': 19099, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285098, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.259 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3fb063a2-9e69-4b8b-9072-178e2b8c2084]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1468386, 'tstamp': 1468386}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285100, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1468388, 'tstamp': 1468388}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285100, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.261 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:56 compute-1 nova_compute[238822]: 2025-09-30 18:27:56.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:56 compute-1 nova_compute[238822]: 2025-09-30 18:27:56.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.265 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.266 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.266 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.266 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:27:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:56.268 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ab573d84-e95e-4d5d-9424-13789a806655]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:27:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:56.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:56 compute-1 sshd-session[285026]: Invalid user deploy from 192.210.160.141 port 58098
Sep 30 18:27:56 compute-1 sshd-session[285026]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:27:56 compute-1 sshd-session[285026]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:27:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:57.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:57 compute-1 nova_compute[238822]: 2025-09-30 18:27:57.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:58 compute-1 ceph-mon[75484]: pgmap v1479: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 0 op/s
Sep 30 18:27:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4203785252' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:27:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4203785252' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:27:58 compute-1 sshd-session[285026]: Failed password for invalid user deploy from 192.210.160.141 port 58098 ssh2
Sep 30 18:27:58 compute-1 nova_compute[238822]: 2025-09-30 18:27:58.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:27:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:27:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:27:58.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:27:58 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:27:58.702 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:27:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:27:59 compute-1 ovn_controller[135204]: 2025-09-30T18:27:59Z|00163|binding|INFO|Claiming lport ac8e3da5-cb09-4223-89d5-318d077ea35e for this chassis.
Sep 30 18:27:59 compute-1 ovn_controller[135204]: 2025-09-30T18:27:59Z|00164|binding|INFO|ac8e3da5-cb09-4223-89d5-318d077ea35e: Claiming fa:16:3e:bb:e0:b1 10.100.0.14
Sep 30 18:27:59 compute-1 ovn_controller[135204]: 2025-09-30T18:27:59Z|00165|binding|INFO|Setting lport ac8e3da5-cb09-4223-89d5-318d077ea35e up in Southbound
Sep 30 18:27:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:27:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:27:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:27:59.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:27:59 compute-1 sshd-session[285026]: Connection closed by invalid user deploy 192.210.160.141 port 58098 [preauth]
Sep 30 18:27:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:27:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 3000.0 total, 600.0 interval
                                           Cumulative writes: 8270 writes, 41K keys, 8270 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.03 MB/s
                                           Cumulative WAL: 8270 writes, 8270 syncs, 1.00 writes per sync, written: 0.09 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1516 writes, 7316 keys, 1516 commit groups, 1.0 writes per commit group, ingest: 15.85 MB, 0.03 MB/s
                                           Interval WAL: 1516 writes, 1516 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0    134.5      0.41              0.22        23    0.018       0      0       0.0       0.0
                                             L6      1/0   13.19 MB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   4.5    179.6    153.6      1.60              0.82        22    0.073    123K    12K       0.0       0.0
                                            Sum      1/0   13.19 MB   0.0      0.3     0.1      0.2       0.3      0.1       0.0   5.5    143.1    149.7      2.01              1.04        45    0.045    123K    12K       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.2    162.3    165.1      0.41              0.24        10    0.041     34K   3004       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   0.0    179.6    153.6      1.60              0.82        22    0.073    123K    12K       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0    135.2      0.40              0.22        22    0.018       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 3000.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.053, interval 0.009
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.29 GB write, 0.10 MB/s write, 0.28 GB read, 0.10 MB/s read, 2.0 seconds
                                           Interval compaction: 0.07 GB write, 0.11 MB/s write, 0.07 GB read, 0.11 MB/s read, 0.4 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55f2aa20b350#2 capacity: 304.00 MB usage: 28.67 MB table_size: 0 occupancy: 18446744073709551615 collections: 6 last_copies: 0 last_secs: 0.000212 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1622,27.77 MB,9.13512%) FilterBlock(45,344.86 KB,0.110782%) IndexBlock(45,577.52 KB,0.18552%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Sep 30 18:27:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:27:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:27:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:27:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:00 compute-1 ceph-mon[75484]: pgmap v1480: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 2.2 KiB/s rd, 9.2 KiB/s wr, 3 op/s
Sep 30 18:28:00 compute-1 nova_compute[238822]: 2025-09-30 18:28:00.199 2 INFO nova.compute.manager [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Post operation of migration started
Sep 30 18:28:00 compute-1 nova_compute[238822]: 2025-09-30 18:28:00.201 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:00.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:00 compute-1 nova_compute[238822]: 2025-09-30 18:28:00.430 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:00 compute-1 nova_compute[238822]: 2025-09-30 18:28:00.431 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:00 compute-1 nova_compute[238822]: 2025-09-30 18:28:00.529 2 DEBUG oslo_concurrency.lockutils [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-d188c0fb-8668-4ab2-b174-49e0e20505ba" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:28:00 compute-1 nova_compute[238822]: 2025-09-30 18:28:00.530 2 DEBUG oslo_concurrency.lockutils [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-d188c0fb-8668-4ab2-b174-49e0e20505ba" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:28:00 compute-1 nova_compute[238822]: 2025-09-30 18:28:00.530 2 DEBUG nova.network.neutron [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:28:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:01 compute-1 nova_compute[238822]: 2025-09-30 18:28:01.043 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:01.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:01 compute-1 nova_compute[238822]: 2025-09-30 18:28:01.709 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:01 compute-1 nova_compute[238822]: 2025-09-30 18:28:01.847 2 DEBUG nova.network.neutron [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Updating instance_info_cache with network_info: [{"id": "ac8e3da5-cb09-4223-89d5-318d077ea35e", "address": "fa:16:3e:bb:e0:b1", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8e3da5-cb", "ovs_interfaceid": "ac8e3da5-cb09-4223-89d5-318d077ea35e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:28:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:02 compute-1 ceph-mon[75484]: pgmap v1481: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.5 KiB/s rd, 9.2 KiB/s wr, 6 op/s
Sep 30 18:28:02 compute-1 nova_compute[238822]: 2025-09-30 18:28:02.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:02 compute-1 nova_compute[238822]: 2025-09-30 18:28:02.354 2 DEBUG oslo_concurrency.lockutils [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-d188c0fb-8668-4ab2-b174-49e0e20505ba" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:28:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:02.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:02 compute-1 nova_compute[238822]: 2025-09-30 18:28:02.883 2 DEBUG oslo_concurrency.lockutils [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:02 compute-1 nova_compute[238822]: 2025-09-30 18:28:02.884 2 DEBUG oslo_concurrency.lockutils [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:02 compute-1 nova_compute[238822]: 2025-09-30 18:28:02.884 2 DEBUG oslo_concurrency.lockutils [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:02 compute-1 nova_compute[238822]: 2025-09-30 18:28:02.891 2 INFO nova.virt.libvirt.driver [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:28:02 compute-1 virtqemud[239124]: Domain id=14 name='instance-00000012' uuid=d188c0fb-8668-4ab2-b174-49e0e20505ba is tainted: custom-monitor
Sep 30 18:28:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:03.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:03 compute-1 nova_compute[238822]: 2025-09-30 18:28:03.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:03 compute-1 nova_compute[238822]: 2025-09-30 18:28:03.899 2 INFO nova.virt.libvirt.driver [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:28:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:28:04 compute-1 ceph-mon[75484]: pgmap v1482: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.5 KiB/s rd, 9.2 KiB/s wr, 6 op/s
Sep 30 18:28:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:04.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:04 compute-1 nova_compute[238822]: 2025-09-30 18:28:04.908 2 INFO nova.virt.libvirt.driver [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:28:04 compute-1 nova_compute[238822]: 2025-09-30 18:28:04.913 2 DEBUG nova.compute.manager [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:28:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:05.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:05 compute-1 nova_compute[238822]: 2025-09-30 18:28:05.427 2 DEBUG nova.objects.instance [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:28:05 compute-1 podman[249638]: time="2025-09-30T18:28:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:28:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:28:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:28:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:28:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8804 "" "Go-http-client/1.1"
Sep 30 18:28:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:06 compute-1 ceph-mon[75484]: pgmap v1483: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.6 KiB/s rd, 9.2 KiB/s wr, 6 op/s
Sep 30 18:28:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:06.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:06 compute-1 nova_compute[238822]: 2025-09-30 18:28:06.444 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:07.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:07 compute-1 nova_compute[238822]: 2025-09-30 18:28:07.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Sep 30 18:28:07 compute-1 nova_compute[238822]: 2025-09-30 18:28:07.345 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:07 compute-1 nova_compute[238822]: 2025-09-30 18:28:07.346 2 WARNING neutronclient.v2_0.client [None req-5277a133-1ae2-45fd-93c2-84ac17b4e412 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:08 compute-1 ceph-mon[75484]: pgmap v1484: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.5 KiB/s rd, 9.2 KiB/s wr, 6 op/s
Sep 30 18:28:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:28:08 compute-1 nova_compute[238822]: 2025-09-30 18:28:08.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:08.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:28:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:09.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1208046519' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:28:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:10 compute-1 ceph-mon[75484]: pgmap v1485: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 4.5 KiB/s rd, 10 KiB/s wr, 7 op/s
Sep 30 18:28:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:10.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:10 compute-1 podman[285159]: 2025-09-30 18:28:10.550945668 +0000 UTC m=+0.081623946 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:28:10 compute-1 podman[285158]: 2025-09-30 18:28:10.582280574 +0000 UTC m=+0.125897691 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 18:28:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:11.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:12 compute-1 nova_compute[238822]: 2025-09-30 18:28:12.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:12 compute-1 ceph-mon[75484]: pgmap v1486: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 2.4 KiB/s rd, 2.3 KiB/s wr, 3 op/s
Sep 30 18:28:12 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1707156152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:28:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:12.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:13.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:13 compute-1 sshd-session[285209]: Invalid user ftpclient from 216.10.242.161 port 55750
Sep 30 18:28:13 compute-1 sshd-session[285209]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:28:13 compute-1 sshd-session[285209]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:28:13 compute-1 nova_compute[238822]: 2025-09-30 18:28:13.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:13 compute-1 nova_compute[238822]: 2025-09-30 18:28:13.366 2 DEBUG oslo_concurrency.lockutils [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "277581fe-2194-4331-bf7b-c2604b65125e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:13 compute-1 nova_compute[238822]: 2025-09-30 18:28:13.366 2 DEBUG oslo_concurrency.lockutils [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:13 compute-1 nova_compute[238822]: 2025-09-30 18:28:13.367 2 DEBUG oslo_concurrency.lockutils [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "277581fe-2194-4331-bf7b-c2604b65125e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:13 compute-1 nova_compute[238822]: 2025-09-30 18:28:13.367 2 DEBUG oslo_concurrency.lockutils [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:13 compute-1 nova_compute[238822]: 2025-09-30 18:28:13.367 2 DEBUG oslo_concurrency.lockutils [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:13 compute-1 nova_compute[238822]: 2025-09-30 18:28:13.382 2 INFO nova.compute.manager [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Terminating instance
Sep 30 18:28:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:13 compute-1 nova_compute[238822]: 2025-09-30 18:28:13.901 2 DEBUG nova.compute.manager [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:28:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:28:13 compute-1 kernel: tap03023543-2a (unregistering): left promiscuous mode
Sep 30 18:28:13 compute-1 NetworkManager[45549]: <info>  [1759256893.9831] device (tap03023543-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:28:13 compute-1 ovn_controller[135204]: 2025-09-30T18:28:13Z|00166|binding|INFO|Releasing lport 03023543-2a7a-4ad4-b500-cf06571e617c from this chassis (sb_readonly=0)
Sep 30 18:28:13 compute-1 ovn_controller[135204]: 2025-09-30T18:28:13Z|00167|binding|INFO|Setting lport 03023543-2a7a-4ad4-b500-cf06571e617c down in Southbound
Sep 30 18:28:13 compute-1 nova_compute[238822]: 2025-09-30 18:28:13.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:13 compute-1 ovn_controller[135204]: 2025-09-30T18:28:13Z|00168|binding|INFO|Removing iface tap03023543-2a ovn-installed in OVS
Sep 30 18:28:13 compute-1 nova_compute[238822]: 2025-09-30 18:28:13.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.003 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:e1:4f 10.100.0.12'], port_security=['fa:16:3e:45:e1:4f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '277581fe-2194-4331-bf7b-c2604b65125e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '5', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=03023543-2a7a-4ad4-b500-cf06571e617c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.005 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 03023543-2a7a-4ad4-b500-cf06571e617c in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.007 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.031 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3c3d0f50-693c-40f9-af59-6ec3906002c6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:14 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000013.scope: Deactivated successfully.
Sep 30 18:28:14 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000013.scope: Consumed 15.352s CPU time.
Sep 30 18:28:14 compute-1 systemd-machined[195911]: Machine qemu-13-instance-00000013 terminated.
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.081 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[7ed918d1-adbf-4732-87c7-ed3f965b837a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.083 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[db16ad11-b6a2-4075-93ab-e31ff0aba2da]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.124 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[9d4bd69e-a0aa-496b-b51f-cbe2b37f5baf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.143 2 INFO nova.virt.libvirt.driver [-] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Instance destroyed successfully.
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.143 2 DEBUG nova.objects.instance [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lazy-loading 'resources' on Instance uuid 277581fe-2194-4331-bf7b-c2604b65125e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.151 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[17904420-9197-4199-8253-f9e6fbfa4b59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1468376, 'reachable_time': 19099, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285232, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.176 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[65d3e9a8-fd1d-4ff8-8787-721fc40545dc]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1468386, 'tstamp': 1468386}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285238, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1468388, 'tstamp': 1468388}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285238, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.179 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.187 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.188 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.188 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.188 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:28:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:14.190 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[405677d1-7a46-4867-98f5-b58857c09482]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:14 compute-1 ceph-mon[75484]: pgmap v1487: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:28:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:14.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.589 2 DEBUG nova.compute.manager [req-aa7ba006-b575-4125-a152-78e03274904b req-9c5fc14a-16a5-4801-b1f6-220c88f8b931 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Received event network-vif-unplugged-03023543-2a7a-4ad4-b500-cf06571e617c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.590 2 DEBUG oslo_concurrency.lockutils [req-aa7ba006-b575-4125-a152-78e03274904b req-9c5fc14a-16a5-4801-b1f6-220c88f8b931 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "277581fe-2194-4331-bf7b-c2604b65125e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.590 2 DEBUG oslo_concurrency.lockutils [req-aa7ba006-b575-4125-a152-78e03274904b req-9c5fc14a-16a5-4801-b1f6-220c88f8b931 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.590 2 DEBUG oslo_concurrency.lockutils [req-aa7ba006-b575-4125-a152-78e03274904b req-9c5fc14a-16a5-4801-b1f6-220c88f8b931 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.591 2 DEBUG nova.compute.manager [req-aa7ba006-b575-4125-a152-78e03274904b req-9c5fc14a-16a5-4801-b1f6-220c88f8b931 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] No waiting events found dispatching network-vif-unplugged-03023543-2a7a-4ad4-b500-cf06571e617c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.591 2 DEBUG nova.compute.manager [req-aa7ba006-b575-4125-a152-78e03274904b req-9c5fc14a-16a5-4801-b1f6-220c88f8b931 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Received event network-vif-unplugged-03023543-2a7a-4ad4-b500-cf06571e617c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.650 2 DEBUG nova.virt.libvirt.vif [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:27:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-322504074',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-322504074',id=19,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:27:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-0a1uu87b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:27:24Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=277581fe-2194-4331-bf7b-c2604b65125e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "03023543-2a7a-4ad4-b500-cf06571e617c", "address": "fa:16:3e:45:e1:4f", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03023543-2a", "ovs_interfaceid": "03023543-2a7a-4ad4-b500-cf06571e617c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.650 2 DEBUG nova.network.os_vif_util [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "03023543-2a7a-4ad4-b500-cf06571e617c", "address": "fa:16:3e:45:e1:4f", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03023543-2a", "ovs_interfaceid": "03023543-2a7a-4ad4-b500-cf06571e617c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.651 2 DEBUG nova.network.os_vif_util [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:e1:4f,bridge_name='br-int',has_traffic_filtering=True,id=03023543-2a7a-4ad4-b500-cf06571e617c,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03023543-2a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.652 2 DEBUG os_vif [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:e1:4f,bridge_name='br-int',has_traffic_filtering=True,id=03023543-2a7a-4ad4-b500-cf06571e617c,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03023543-2a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.655 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap03023543-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.660 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=7c0218ba-22bc-4d8e-b023-1182c7a5ed96) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:14 compute-1 nova_compute[238822]: 2025-09-30 18:28:14.666 2 INFO os_vif [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:e1:4f,bridge_name='br-int',has_traffic_filtering=True,id=03023543-2a7a-4ad4-b500-cf06571e617c,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03023543-2a')
Sep 30 18:28:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:14 compute-1 sudo[285259]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:28:14 compute-1 sudo[285259]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:28:14 compute-1 sudo[285259]: pam_unix(sudo:session): session closed for user root
Sep 30 18:28:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:15.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:15 compute-1 sshd-session[285209]: Failed password for invalid user ftpclient from 216.10.242.161 port 55750 ssh2
Sep 30 18:28:15 compute-1 nova_compute[238822]: 2025-09-30 18:28:15.406 2 INFO nova.virt.libvirt.driver [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Deleting instance files /var/lib/nova/instances/277581fe-2194-4331-bf7b-c2604b65125e_del
Sep 30 18:28:15 compute-1 nova_compute[238822]: 2025-09-30 18:28:15.407 2 INFO nova.virt.libvirt.driver [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Deletion of /var/lib/nova/instances/277581fe-2194-4331-bf7b-c2604b65125e_del complete
Sep 30 18:28:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:15 compute-1 nova_compute[238822]: 2025-09-30 18:28:15.932 2 INFO nova.compute.manager [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Took 2.03 seconds to destroy the instance on the hypervisor.
Sep 30 18:28:15 compute-1 nova_compute[238822]: 2025-09-30 18:28:15.932 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:28:15 compute-1 nova_compute[238822]: 2025-09-30 18:28:15.933 2 DEBUG nova.compute.manager [-] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:28:15 compute-1 nova_compute[238822]: 2025-09-30 18:28:15.934 2 DEBUG nova.network.neutron [-] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:28:15 compute-1 nova_compute[238822]: 2025-09-30 18:28:15.934 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:15.939797) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256895940163, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 919, "num_deletes": 252, "total_data_size": 1848985, "memory_usage": 1869952, "flush_reason": "Manual Compaction"}
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256895950734, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 787897, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41456, "largest_seqno": 42370, "table_properties": {"data_size": 784448, "index_size": 1229, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9568, "raw_average_key_size": 20, "raw_value_size": 776960, "raw_average_value_size": 1692, "num_data_blocks": 54, "num_entries": 459, "num_filter_entries": 459, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759256827, "oldest_key_time": 1759256827, "file_creation_time": 1759256895, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 10764 microseconds, and 5764 cpu microseconds.
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:15.950802) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 787897 bytes OK
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:15.950836) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:15.953713) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:15.953741) EVENT_LOG_v1 {"time_micros": 1759256895953731, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:15.953769) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 1844304, prev total WAL file size 1844304, number of live WAL files 2.
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:15.955081) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323532' seq:72057594037927935, type:22 .. '6D6772737461740031353035' seq:0, type:0; will stop at (end)
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(769KB)], [78(13MB)]
Sep 30 18:28:15 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256895955139, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 14614029, "oldest_snapshot_seqno": -1}
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6524 keys, 11169338 bytes, temperature: kUnknown
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256896039855, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 11169338, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11129825, "index_size": 22123, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16325, "raw_key_size": 168790, "raw_average_key_size": 25, "raw_value_size": 11016508, "raw_average_value_size": 1688, "num_data_blocks": 880, "num_entries": 6524, "num_filter_entries": 6524, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759256895, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:16.040327) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 11169338 bytes
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:16.043603) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.1 rd, 131.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 13.2 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(32.7) write-amplify(14.2) OK, records in: 7014, records dropped: 490 output_compression: NoCompression
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:16.043834) EVENT_LOG_v1 {"time_micros": 1759256896043822, "job": 48, "event": "compaction_finished", "compaction_time_micros": 84937, "compaction_time_cpu_micros": 48550, "output_level": 6, "num_output_files": 1, "total_output_size": 11169338, "num_input_records": 7014, "num_output_records": 6524, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256896044592, "job": 48, "event": "table_file_deletion", "file_number": 80}
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759256896048017, "job": 48, "event": "table_file_deletion", "file_number": 78}
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:15.954949) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:16.048258) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:16.048269) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:16.048274) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:16.048278) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:28:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:28:16.048283) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:28:16 compute-1 ceph-mon[75484]: pgmap v1488: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 170 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:28:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:16.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:16 compute-1 sshd-session[285209]: Received disconnect from 216.10.242.161 port 55750:11: Bye Bye [preauth]
Sep 30 18:28:16 compute-1 sshd-session[285209]: Disconnected from invalid user ftpclient 216.10.242.161 port 55750 [preauth]
Sep 30 18:28:16 compute-1 nova_compute[238822]: 2025-09-30 18:28:16.419 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:16 compute-1 podman[285286]: 2025-09-30 18:28:16.559933221 +0000 UTC m=+0.096672392 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Sep 30 18:28:16 compute-1 nova_compute[238822]: 2025-09-30 18:28:16.787 2 DEBUG nova.compute.manager [req-9c7406de-86ac-456e-b67d-5c03274afc7d req-31a488cd-d902-405e-bcad-0e501f5f4fae 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Received event network-vif-unplugged-03023543-2a7a-4ad4-b500-cf06571e617c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:28:16 compute-1 nova_compute[238822]: 2025-09-30 18:28:16.788 2 DEBUG oslo_concurrency.lockutils [req-9c7406de-86ac-456e-b67d-5c03274afc7d req-31a488cd-d902-405e-bcad-0e501f5f4fae 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "277581fe-2194-4331-bf7b-c2604b65125e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:16 compute-1 nova_compute[238822]: 2025-09-30 18:28:16.788 2 DEBUG oslo_concurrency.lockutils [req-9c7406de-86ac-456e-b67d-5c03274afc7d req-31a488cd-d902-405e-bcad-0e501f5f4fae 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:16 compute-1 nova_compute[238822]: 2025-09-30 18:28:16.789 2 DEBUG oslo_concurrency.lockutils [req-9c7406de-86ac-456e-b67d-5c03274afc7d req-31a488cd-d902-405e-bcad-0e501f5f4fae 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:16 compute-1 nova_compute[238822]: 2025-09-30 18:28:16.789 2 DEBUG nova.compute.manager [req-9c7406de-86ac-456e-b67d-5c03274afc7d req-31a488cd-d902-405e-bcad-0e501f5f4fae 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] No waiting events found dispatching network-vif-unplugged-03023543-2a7a-4ad4-b500-cf06571e617c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:28:16 compute-1 nova_compute[238822]: 2025-09-30 18:28:16.789 2 DEBUG nova.compute.manager [req-9c7406de-86ac-456e-b67d-5c03274afc7d req-31a488cd-d902-405e-bcad-0e501f5f4fae 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Received event network-vif-unplugged-03023543-2a7a-4ad4-b500-cf06571e617c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:28:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:17.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:17 compute-1 nova_compute[238822]: 2025-09-30 18:28:17.387 2 DEBUG nova.network.neutron [-] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:28:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:17 compute-1 nova_compute[238822]: 2025-09-30 18:28:17.898 2 INFO nova.compute.manager [-] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Took 1.96 seconds to deallocate network for instance.
Sep 30 18:28:18 compute-1 nova_compute[238822]: 2025-09-30 18:28:18.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:18.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:18 compute-1 ceph-mon[75484]: pgmap v1489: 353 pgs: 353 active+clean; 200 MiB data, 336 MiB used, 40 GiB / 40 GiB avail; 85 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:28:18 compute-1 nova_compute[238822]: 2025-09-30 18:28:18.431 2 DEBUG oslo_concurrency.lockutils [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:18 compute-1 nova_compute[238822]: 2025-09-30 18:28:18.432 2 DEBUG oslo_concurrency.lockutils [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [WARNING] 272/182818 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Sep 30 18:28:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv[86466]: [ALERT] 272/182818 (4) : backend 'backend' has no server available!
Sep 30 18:28:18 compute-1 nova_compute[238822]: 2025-09-30 18:28:18.491 2 DEBUG oslo_concurrency.processutils [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:28:18 compute-1 nova_compute[238822]: 2025-09-30 18:28:18.848 2 DEBUG nova.compute.manager [req-62a3b694-0cd3-4e32-9c75-3fd0a787a502 req-b41b8043-fdf3-4693-897a-67e9d1a468b9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 277581fe-2194-4331-bf7b-c2604b65125e] Received event network-vif-deleted-03023543-2a7a-4ad4-b500-cf06571e617c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:28:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:28:18 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/41704762' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:28:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:28:18 compute-1 nova_compute[238822]: 2025-09-30 18:28:18.956 2 DEBUG oslo_concurrency.processutils [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:28:18 compute-1 nova_compute[238822]: 2025-09-30 18:28:18.964 2 DEBUG nova.compute.provider_tree [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:28:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:19.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:19 compute-1 openstack_network_exporter[251957]: ERROR   18:28:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:28:19 compute-1 openstack_network_exporter[251957]: ERROR   18:28:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:28:19 compute-1 openstack_network_exporter[251957]: ERROR   18:28:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:28:19 compute-1 openstack_network_exporter[251957]: ERROR   18:28:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:28:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:28:19 compute-1 openstack_network_exporter[251957]: ERROR   18:28:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:28:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:28:19 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/41704762' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:28:19 compute-1 nova_compute[238822]: 2025-09-30 18:28:19.505 2 DEBUG nova.scheduler.client.report [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:28:19 compute-1 nova_compute[238822]: 2025-09-30 18:28:19.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:20 compute-1 nova_compute[238822]: 2025-09-30 18:28:20.020 2 DEBUG oslo_concurrency.lockutils [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.588s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:20 compute-1 nova_compute[238822]: 2025-09-30 18:28:20.044 2 INFO nova.scheduler.client.report [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Deleted allocations for instance 277581fe-2194-4331-bf7b-c2604b65125e
Sep 30 18:28:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:28:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:20.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:28:20 compute-1 ceph-mon[75484]: pgmap v1490: 353 pgs: 353 active+clean; 151 MiB data, 313 MiB used, 40 GiB / 40 GiB avail; 12 KiB/s rd, 3.5 KiB/s wr, 19 op/s
Sep 30 18:28:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:21 compute-1 nova_compute[238822]: 2025-09-30 18:28:21.077 2 DEBUG oslo_concurrency.lockutils [None req-b9e26ae4-6733-4a2d-86ce-ccd6c82428b2 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "277581fe-2194-4331-bf7b-c2604b65125e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.710s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:21.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:21 compute-1 ceph-mon[75484]: pgmap v1491: 353 pgs: 353 active+clean; 121 MiB data, 293 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 2.5 KiB/s wr, 28 op/s
Sep 30 18:28:21 compute-1 nova_compute[238822]: 2025-09-30 18:28:21.874 2 DEBUG oslo_concurrency.lockutils [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "d188c0fb-8668-4ab2-b174-49e0e20505ba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:21 compute-1 nova_compute[238822]: 2025-09-30 18:28:21.875 2 DEBUG oslo_concurrency.lockutils [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "d188c0fb-8668-4ab2-b174-49e0e20505ba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:21 compute-1 nova_compute[238822]: 2025-09-30 18:28:21.876 2 DEBUG oslo_concurrency.lockutils [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "d188c0fb-8668-4ab2-b174-49e0e20505ba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:21 compute-1 nova_compute[238822]: 2025-09-30 18:28:21.876 2 DEBUG oslo_concurrency.lockutils [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "d188c0fb-8668-4ab2-b174-49e0e20505ba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:21 compute-1 nova_compute[238822]: 2025-09-30 18:28:21.877 2 DEBUG oslo_concurrency.lockutils [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "d188c0fb-8668-4ab2-b174-49e0e20505ba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:21 compute-1 nova_compute[238822]: 2025-09-30 18:28:21.896 2 INFO nova.compute.manager [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Terminating instance
Sep 30 18:28:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:22.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:22 compute-1 nova_compute[238822]: 2025-09-30 18:28:22.420 2 DEBUG nova.compute.manager [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:28:22 compute-1 kernel: tapac8e3da5-cb (unregistering): left promiscuous mode
Sep 30 18:28:22 compute-1 NetworkManager[45549]: <info>  [1759256902.4814] device (tapac8e3da5-cb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:28:22 compute-1 ovn_controller[135204]: 2025-09-30T18:28:22Z|00169|binding|INFO|Releasing lport ac8e3da5-cb09-4223-89d5-318d077ea35e from this chassis (sb_readonly=0)
Sep 30 18:28:22 compute-1 ovn_controller[135204]: 2025-09-30T18:28:22Z|00170|binding|INFO|Setting lport ac8e3da5-cb09-4223-89d5-318d077ea35e down in Southbound
Sep 30 18:28:22 compute-1 ovn_controller[135204]: 2025-09-30T18:28:22Z|00171|binding|INFO|Removing iface tapac8e3da5-cb ovn-installed in OVS
Sep 30 18:28:22 compute-1 nova_compute[238822]: 2025-09-30 18:28:22.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:22 compute-1 nova_compute[238822]: 2025-09-30 18:28:22.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:22 compute-1 nova_compute[238822]: 2025-09-30 18:28:22.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:22.558 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:e0:b1 10.100.0.14'], port_security=['fa:16:3e:bb:e0:b1 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'd188c0fb-8668-4ab2-b174-49e0e20505ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '14', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=ac8e3da5-cb09-4223-89d5-318d077ea35e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:28:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:22.559 144543 INFO neutron.agent.ovn.metadata.agent [-] Port ac8e3da5-cb09-4223-89d5-318d077ea35e in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:28:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:22.561 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6901f664-336b-42d2-bbf7-58951befc8d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:28:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:22.562 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[16121f77-b9da-44b1-bb93-bc8792b9e7ea]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:22.562 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 namespace which is not needed anymore
Sep 30 18:28:22 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000012.scope: Deactivated successfully.
Sep 30 18:28:22 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000012.scope: Consumed 3.085s CPU time.
Sep 30 18:28:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:28:22 compute-1 systemd-machined[195911]: Machine qemu-14-instance-00000012 terminated.
Sep 30 18:28:22 compute-1 nova_compute[238822]: 2025-09-30 18:28:22.668 2 INFO nova.virt.libvirt.driver [-] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Instance destroyed successfully.
Sep 30 18:28:22 compute-1 nova_compute[238822]: 2025-09-30 18:28:22.669 2 DEBUG nova.objects.instance [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lazy-loading 'resources' on Instance uuid d188c0fb-8668-4ab2-b174-49e0e20505ba obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:28:22 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[284644]: [NOTICE]   (284694) : haproxy version is 3.0.5-8e879a5
Sep 30 18:28:22 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[284644]: [NOTICE]   (284694) : path to executable is /usr/sbin/haproxy
Sep 30 18:28:22 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[284644]: [WARNING]  (284694) : Exiting Master process...
Sep 30 18:28:22 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[284644]: [ALERT]    (284694) : Current worker (284706) exited with code 143 (Terminated)
Sep 30 18:28:22 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[284644]: [WARNING]  (284694) : All workers exited. Exiting... (0)
Sep 30 18:28:22 compute-1 podman[285401]: 2025-09-30 18:28:22.73508035 +0000 UTC m=+0.035598072 container kill 53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:28:22 compute-1 systemd[1]: libpod-53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b.scope: Deactivated successfully.
Sep 30 18:28:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:28:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 3000.1 total, 600.0 interval
                                           Cumulative writes: 16K writes, 62K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.02 MB/s
                                           Cumulative WAL: 16K writes, 4761 syncs, 3.37 writes per sync, written: 0.05 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 3022 writes, 11K keys, 3022 commit groups, 1.0 writes per commit group, ingest: 12.61 MB, 0.02 MB/s
                                           Interval WAL: 3022 writes, 1192 syncs, 2.54 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Sep 30 18:28:22 compute-1 podman[285355]: 2025-09-30 18:28:22.821756381 +0000 UTC m=+0.180989929 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, distribution-scope=public, release=1755695350)
Sep 30 18:28:22 compute-1 podman[285349]: 2025-09-30 18:28:22.823725334 +0000 UTC m=+0.189613262 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 18:28:22 compute-1 podman[285441]: 2025-09-30 18:28:22.828989217 +0000 UTC m=+0.069899089 container died 53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 18:28:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:22 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b-userdata-shm.mount: Deactivated successfully.
Sep 30 18:28:23 compute-1 systemd[1]: var-lib-containers-storage-overlay-b9c256e033802b2d778797931e048cc39c7717dac5c85132c237ba18ed41446e-merged.mount: Deactivated successfully.
Sep 30 18:28:23 compute-1 podman[285356]: 2025-09-30 18:28:23.083061219 +0000 UTC m=+0.423744056 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:28:23 compute-1 unix_chkpwd[285471]: password check failed for user (root)
Sep 30 18:28:23 compute-1 sshd-session[285334]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:28:23 compute-1 podman[285441]: 2025-09-30 18:28:23.162520725 +0000 UTC m=+0.403430537 container cleanup 53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 18:28:23 compute-1 systemd[1]: libpod-conmon-53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b.scope: Deactivated successfully.
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.175 2 DEBUG nova.virt.libvirt.vif [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:26:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1528882058',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1528882058',id=18,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:26:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-1gsulz3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:28:05Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=d188c0fb-8668-4ab2-b174-49e0e20505ba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ac8e3da5-cb09-4223-89d5-318d077ea35e", "address": "fa:16:3e:bb:e0:b1", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8e3da5-cb", "ovs_interfaceid": "ac8e3da5-cb09-4223-89d5-318d077ea35e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.176 2 DEBUG nova.network.os_vif_util [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "ac8e3da5-cb09-4223-89d5-318d077ea35e", "address": "fa:16:3e:bb:e0:b1", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac8e3da5-cb", "ovs_interfaceid": "ac8e3da5-cb09-4223-89d5-318d077ea35e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.177 2 DEBUG nova.network.os_vif_util [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bb:e0:b1,bridge_name='br-int',has_traffic_filtering=True,id=ac8e3da5-cb09-4223-89d5-318d077ea35e,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac8e3da5-cb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.178 2 DEBUG os_vif [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bb:e0:b1,bridge_name='br-int',has_traffic_filtering=True,id=ac8e3da5-cb09-4223-89d5-318d077ea35e,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac8e3da5-cb') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.181 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac8e3da5-cb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.190 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=04d1d318-a1c8-4d38-a3f0-1d6f84c98c73) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:23.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.197 2 INFO os_vif [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bb:e0:b1,bridge_name='br-int',has_traffic_filtering=True,id=ac8e3da5-cb09-4223-89d5-318d077ea35e,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac8e3da5-cb')
Sep 30 18:28:23 compute-1 podman[285456]: 2025-09-30 18:28:23.290031479 +0000 UTC m=+0.444065485 container remove 53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:23.301 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b77961e4-1edf-4fe5-851d-a41440c7c818]: (4, ("Tue Sep 30 06:28:22 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 (53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b)\n53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b\nTue Sep 30 06:28:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 (53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b)\n53ee1a1c25629c7ac48a97c3b4bf3967f9d3da57bf27eaac099602490ee41b3b\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:23.302 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e271216b-efb3-422f-82a1-4ceb5e848a58]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:23.303 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:28:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:23.303 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ddd8ca42-f781-4815-b534-4f3c26591220]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:23.303 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:23 compute-1 kernel: tap6901f664-30: left promiscuous mode
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:23.322 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[9527983f-ddba-408f-b225-fd3a7ddd5701]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:23.349 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[90ef74f8-c4a4-49c3-b15e-e6808342ae2e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:23.350 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ca25848a-ed3b-49cf-b6c2-1a3276b70bc8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.373 2 DEBUG nova.compute.manager [req-c35fdbdd-bc43-4c9c-8e5d-926d7f06c6bf req-a00c86ef-8e84-4f69-8758-d9096d682c0c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Received event network-vif-unplugged-ac8e3da5-cb09-4223-89d5-318d077ea35e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.374 2 DEBUG oslo_concurrency.lockutils [req-c35fdbdd-bc43-4c9c-8e5d-926d7f06c6bf req-a00c86ef-8e84-4f69-8758-d9096d682c0c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "d188c0fb-8668-4ab2-b174-49e0e20505ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.374 2 DEBUG oslo_concurrency.lockutils [req-c35fdbdd-bc43-4c9c-8e5d-926d7f06c6bf req-a00c86ef-8e84-4f69-8758-d9096d682c0c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "d188c0fb-8668-4ab2-b174-49e0e20505ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.374 2 DEBUG oslo_concurrency.lockutils [req-c35fdbdd-bc43-4c9c-8e5d-926d7f06c6bf req-a00c86ef-8e84-4f69-8758-d9096d682c0c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "d188c0fb-8668-4ab2-b174-49e0e20505ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.375 2 DEBUG nova.compute.manager [req-c35fdbdd-bc43-4c9c-8e5d-926d7f06c6bf req-a00c86ef-8e84-4f69-8758-d9096d682c0c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] No waiting events found dispatching network-vif-unplugged-ac8e3da5-cb09-4223-89d5-318d077ea35e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.375 2 DEBUG nova.compute.manager [req-c35fdbdd-bc43-4c9c-8e5d-926d7f06c6bf req-a00c86ef-8e84-4f69-8758-d9096d682c0c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Received event network-vif-unplugged-ac8e3da5-cb09-4223-89d5-318d077ea35e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:28:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:23.382 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[693856be-eac0-45a9-9a60-b45bdd555dd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1468368, 'reachable_time': 27518, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285492, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:23 compute-1 systemd[1]: run-netns-ovnmeta\x2d6901f664\x2d336b\x2d42d2\x2dbbf7\x2d58951befc8d1.mount: Deactivated successfully.
Sep 30 18:28:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:23.392 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:28:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:23.393 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[f6037c63-ac7d-4208-9c75-0ce1a490a0bb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:28:23 compute-1 ceph-mon[75484]: pgmap v1492: 353 pgs: 353 active+clean; 121 MiB data, 293 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:28:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.944 2 INFO nova.virt.libvirt.driver [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Deleting instance files /var/lib/nova/instances/d188c0fb-8668-4ab2-b174-49e0e20505ba_del
Sep 30 18:28:23 compute-1 nova_compute[238822]: 2025-09-30 18:28:23.946 2 INFO nova.virt.libvirt.driver [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Deletion of /var/lib/nova/instances/d188c0fb-8668-4ab2-b174-49e0e20505ba_del complete
Sep 30 18:28:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:28:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:24.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:24 compute-1 nova_compute[238822]: 2025-09-30 18:28:24.461 2 INFO nova.compute.manager [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Took 2.04 seconds to destroy the instance on the hypervisor.
Sep 30 18:28:24 compute-1 nova_compute[238822]: 2025-09-30 18:28:24.462 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:28:24 compute-1 nova_compute[238822]: 2025-09-30 18:28:24.462 2 DEBUG nova.compute.manager [-] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:28:24 compute-1 nova_compute[238822]: 2025-09-30 18:28:24.462 2 DEBUG nova.network.neutron [-] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:28:24 compute-1 nova_compute[238822]: 2025-09-30 18:28:24.463 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:24 compute-1 nova_compute[238822]: 2025-09-30 18:28:24.596 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:25.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:25 compute-1 sshd-session[285494]: Invalid user foundry from 84.51.43.58 port 57450
Sep 30 18:28:25 compute-1 sshd-session[285494]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:28:25 compute-1 sshd-session[285494]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:28:25 compute-1 nova_compute[238822]: 2025-09-30 18:28:25.443 2 DEBUG nova.compute.manager [req-7c14a034-e4ac-4e96-b692-d252e0352dab req-5f147565-5fdd-4188-920b-2900c4ebe50b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Received event network-vif-unplugged-ac8e3da5-cb09-4223-89d5-318d077ea35e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:28:25 compute-1 nova_compute[238822]: 2025-09-30 18:28:25.443 2 DEBUG oslo_concurrency.lockutils [req-7c14a034-e4ac-4e96-b692-d252e0352dab req-5f147565-5fdd-4188-920b-2900c4ebe50b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "d188c0fb-8668-4ab2-b174-49e0e20505ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:25 compute-1 nova_compute[238822]: 2025-09-30 18:28:25.443 2 DEBUG oslo_concurrency.lockutils [req-7c14a034-e4ac-4e96-b692-d252e0352dab req-5f147565-5fdd-4188-920b-2900c4ebe50b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "d188c0fb-8668-4ab2-b174-49e0e20505ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:25 compute-1 nova_compute[238822]: 2025-09-30 18:28:25.444 2 DEBUG oslo_concurrency.lockutils [req-7c14a034-e4ac-4e96-b692-d252e0352dab req-5f147565-5fdd-4188-920b-2900c4ebe50b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "d188c0fb-8668-4ab2-b174-49e0e20505ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:25 compute-1 nova_compute[238822]: 2025-09-30 18:28:25.444 2 DEBUG nova.compute.manager [req-7c14a034-e4ac-4e96-b692-d252e0352dab req-5f147565-5fdd-4188-920b-2900c4ebe50b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] No waiting events found dispatching network-vif-unplugged-ac8e3da5-cb09-4223-89d5-318d077ea35e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:28:25 compute-1 nova_compute[238822]: 2025-09-30 18:28:25.444 2 DEBUG nova.compute.manager [req-7c14a034-e4ac-4e96-b692-d252e0352dab req-5f147565-5fdd-4188-920b-2900c4ebe50b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Received event network-vif-unplugged-ac8e3da5-cb09-4223-89d5-318d077ea35e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:28:25 compute-1 ceph-mon[75484]: pgmap v1493: 353 pgs: 353 active+clean; 121 MiB data, 293 MiB used, 40 GiB / 40 GiB avail; 24 KiB/s rd, 1.2 KiB/s wr, 34 op/s
Sep 30 18:28:25 compute-1 nova_compute[238822]: 2025-09-30 18:28:25.684 2 DEBUG nova.compute.manager [req-1493e7a9-d7cc-4fa3-af94-5a183f93e0aa req-e923719b-6f5e-4b72-b21a-ab08c86873d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Received event network-vif-deleted-ac8e3da5-cb09-4223-89d5-318d077ea35e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:28:25 compute-1 nova_compute[238822]: 2025-09-30 18:28:25.685 2 INFO nova.compute.manager [req-1493e7a9-d7cc-4fa3-af94-5a183f93e0aa req-e923719b-6f5e-4b72-b21a-ab08c86873d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Neutron deleted interface ac8e3da5-cb09-4223-89d5-318d077ea35e; detaching it from the instance and deleting it from the info cache
Sep 30 18:28:25 compute-1 nova_compute[238822]: 2025-09-30 18:28:25.685 2 DEBUG nova.network.neutron [req-1493e7a9-d7cc-4fa3-af94-5a183f93e0aa req-e923719b-6f5e-4b72-b21a-ab08c86873d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:28:25 compute-1 sshd-session[285334]: Failed password for root from 192.210.160.141 port 34054 ssh2
Sep 30 18:28:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:26 compute-1 nova_compute[238822]: 2025-09-30 18:28:26.119 2 DEBUG nova.network.neutron [-] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:28:26 compute-1 nova_compute[238822]: 2025-09-30 18:28:26.194 2 DEBUG nova.compute.manager [req-1493e7a9-d7cc-4fa3-af94-5a183f93e0aa req-e923719b-6f5e-4b72-b21a-ab08c86873d0 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Detach interface failed, port_id=ac8e3da5-cb09-4223-89d5-318d077ea35e, reason: Instance d188c0fb-8668-4ab2-b174-49e0e20505ba could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:28:26 compute-1 sshd-session[285334]: Connection closed by authenticating user root 192.210.160.141 port 34054 [preauth]
Sep 30 18:28:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:26.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:26 compute-1 nova_compute[238822]: 2025-09-30 18:28:26.645 2 INFO nova.compute.manager [-] [instance: d188c0fb-8668-4ab2-b174-49e0e20505ba] Took 2.18 seconds to deallocate network for instance.
Sep 30 18:28:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:27 compute-1 nova_compute[238822]: 2025-09-30 18:28:27.171 2 DEBUG oslo_concurrency.lockutils [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:27 compute-1 nova_compute[238822]: 2025-09-30 18:28:27.172 2 DEBUG oslo_concurrency.lockutils [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:27 compute-1 nova_compute[238822]: 2025-09-30 18:28:27.179 2 DEBUG oslo_concurrency.lockutils [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:27.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:27 compute-1 nova_compute[238822]: 2025-09-30 18:28:27.240 2 INFO nova.scheduler.client.report [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Deleted allocations for instance d188c0fb-8668-4ab2-b174-49e0e20505ba
Sep 30 18:28:27 compute-1 sshd-session[285494]: Failed password for invalid user foundry from 84.51.43.58 port 57450 ssh2
Sep 30 18:28:27 compute-1 ceph-mon[75484]: pgmap v1494: 353 pgs: 353 active+clean; 121 MiB data, 293 MiB used, 40 GiB / 40 GiB avail; 24 KiB/s rd, 1.2 KiB/s wr, 34 op/s
Sep 30 18:28:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:28 compute-1 nova_compute[238822]: 2025-09-30 18:28:28.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:28 compute-1 nova_compute[238822]: 2025-09-30 18:28:28.279 2 DEBUG oslo_concurrency.lockutils [None req-358a65b3-71ed-45ed-afb9-9e61d86cf8ee 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "d188c0fb-8668-4ab2-b174-49e0e20505ba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.404s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:28 compute-1 nova_compute[238822]: 2025-09-30 18:28:28.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:28.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:28 compute-1 sudo[285504]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:28:28 compute-1 sudo[285504]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:28:28 compute-1 sudo[285504]: pam_unix(sudo:session): session closed for user root
Sep 30 18:28:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:28:29 compute-1 sudo[285529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Sep 30 18:28:29 compute-1 sudo[285529]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:28:29 compute-1 sshd-session[285494]: Received disconnect from 84.51.43.58 port 57450:11: Bye Bye [preauth]
Sep 30 18:28:29 compute-1 sshd-session[285494]: Disconnected from invalid user foundry 84.51.43.58 port 57450 [preauth]
Sep 30 18:28:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:29.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:29 compute-1 sudo[285529]: pam_unix(sudo:session): session closed for user root
Sep 30 18:28:29 compute-1 sudo[285575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:28:29 compute-1 sudo[285575]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:28:29 compute-1 sudo[285575]: pam_unix(sudo:session): session closed for user root
Sep 30 18:28:29 compute-1 sudo[285600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:28:29 compute-1 sudo[285600]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:28:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:30 compute-1 sudo[285600]: pam_unix(sudo:session): session closed for user root
Sep 30 18:28:30 compute-1 ceph-mon[75484]: pgmap v1495: 353 pgs: 353 active+clean; 84 MiB data, 271 MiB used, 40 GiB / 40 GiB avail; 28 KiB/s rd, 1.7 KiB/s wr, 41 op/s
Sep 30 18:28:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:28:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:28:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:28:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:28:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:30.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:31.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:28:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:28:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:28:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:28:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:28:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:28:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:28:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:32 compute-1 ceph-mon[75484]: pgmap v1496: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 40 GiB / 40 GiB avail; 26 KiB/s rd, 1.4 KiB/s wr, 37 op/s
Sep 30 18:28:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:32.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:33 compute-1 nova_compute[238822]: 2025-09-30 18:28:33.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:33.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:33 compute-1 nova_compute[238822]: 2025-09-30 18:28:33.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:28:34 compute-1 ceph-mon[75484]: pgmap v1497: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.4 KiB/s wr, 28 op/s
Sep 30 18:28:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:34.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:35 compute-1 sudo[285662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:28:35 compute-1 sudo[285662]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:28:35 compute-1 sudo[285662]: pam_unix(sudo:session): session closed for user root
Sep 30 18:28:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:35.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:35 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1049937000' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:28:35 compute-1 sudo[285687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:28:35 compute-1 sudo[285687]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:28:35 compute-1 sudo[285687]: pam_unix(sudo:session): session closed for user root
Sep 30 18:28:35 compute-1 podman[249638]: time="2025-09-30T18:28:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:28:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:28:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:28:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:28:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8349 "" "Go-http-client/1.1"
Sep 30 18:28:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:36 compute-1 nova_compute[238822]: 2025-09-30 18:28:36.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:28:36 compute-1 nova_compute[238822]: 2025-09-30 18:28:36.059 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:28:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:36.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:36 compute-1 ceph-mon[75484]: pgmap v1498: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.4 KiB/s wr, 28 op/s
Sep 30 18:28:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:28:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:28:36 compute-1 unix_chkpwd[285715]: password check failed for user (root)
Sep 30 18:28:36 compute-1 sshd-session[285705]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170  user=root
Sep 30 18:28:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:37.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3565038591' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:28:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3565038591' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:28:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:28:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:38 compute-1 nova_compute[238822]: 2025-09-30 18:28:38.059 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:28:38 compute-1 nova_compute[238822]: 2025-09-30 18:28:38.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:38 compute-1 nova_compute[238822]: 2025-09-30 18:28:38.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:38.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:38 compute-1 ceph-mon[75484]: pgmap v1499: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 40 GiB / 40 GiB avail; 14 KiB/s rd, 1.4 KiB/s wr, 21 op/s
Sep 30 18:28:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:28:39 compute-1 nova_compute[238822]: 2025-09-30 18:28:39.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:28:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:39.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:39 compute-1 sshd-session[285705]: Failed password for root from 175.126.165.170 port 46880 ssh2
Sep 30 18:28:39 compute-1 sshd-session[285500]: ssh_dispatch_run_fatal: Connection from 110.42.70.108 port 40272: Connection timed out [preauth]
Sep 30 18:28:39 compute-1 ceph-mon[75484]: pgmap v1500: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 40 GiB / 40 GiB avail; 14 KiB/s rd, 1.4 KiB/s wr, 21 op/s
Sep 30 18:28:39 compute-1 nova_compute[238822]: 2025-09-30 18:28:39.583 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:39 compute-1 nova_compute[238822]: 2025-09-30 18:28:39.583 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:39 compute-1 nova_compute[238822]: 2025-09-30 18:28:39.584 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:39 compute-1 nova_compute[238822]: 2025-09-30 18:28:39.584 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:28:39 compute-1 nova_compute[238822]: 2025-09-30 18:28:39.584 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:28:39 compute-1 sshd-session[285705]: Received disconnect from 175.126.165.170 port 46880:11: Bye Bye [preauth]
Sep 30 18:28:39 compute-1 sshd-session[285705]: Disconnected from authenticating user root 175.126.165.170 port 46880 [preauth]
Sep 30 18:28:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:40 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:28:40 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/868335414' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:28:40 compute-1 nova_compute[238822]: 2025-09-30 18:28:40.029 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:28:40 compute-1 nova_compute[238822]: 2025-09-30 18:28:40.247 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:28:40 compute-1 nova_compute[238822]: 2025-09-30 18:28:40.248 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:28:40 compute-1 nova_compute[238822]: 2025-09-30 18:28:40.269 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:28:40 compute-1 nova_compute[238822]: 2025-09-30 18:28:40.270 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4743MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:28:40 compute-1 nova_compute[238822]: 2025-09-30 18:28:40.270 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:40 compute-1 nova_compute[238822]: 2025-09-30 18:28:40.271 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:40.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/868335414' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:28:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:41 compute-1 unix_chkpwd[285747]: password check failed for user (root)
Sep 30 18:28:41 compute-1 sshd-session[285744]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167  user=root
Sep 30 18:28:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:41.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:41 compute-1 nova_compute[238822]: 2025-09-30 18:28:41.330 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:28:41 compute-1 nova_compute[238822]: 2025-09-30 18:28:41.330 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:28:40 up  4:06,  0 user,  load average: 0.44, 0.41, 0.70\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:28:41 compute-1 nova_compute[238822]: 2025-09-30 18:28:41.355 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:28:41 compute-1 podman[285750]: 2025-09-30 18:28:41.566101456 +0000 UTC m=+0.098294968 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:28:41 compute-1 ceph-mon[75484]: pgmap v1501: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 40 GiB / 40 GiB avail; 10 KiB/s rd, 938 B/s wr, 15 op/s
Sep 30 18:28:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1228077198' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:28:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/743270422' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:28:41 compute-1 podman[285749]: 2025-09-30 18:28:41.619928971 +0000 UTC m=+0.154047165 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:28:41 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:28:41 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/360614735' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:28:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:41 compute-1 nova_compute[238822]: 2025-09-30 18:28:41.899 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:28:41 compute-1 nova_compute[238822]: 2025-09-30 18:28:41.905 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:28:42 compute-1 nova_compute[238822]: 2025-09-30 18:28:42.430 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:28:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:42.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:42 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/360614735' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:28:42 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3165206529' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:28:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:42 compute-1 nova_compute[238822]: 2025-09-30 18:28:42.944 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:28:42 compute-1 nova_compute[238822]: 2025-09-30 18:28:42.945 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.674s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:43 compute-1 nova_compute[238822]: 2025-09-30 18:28:43.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:43.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:43 compute-1 sshd-session[285744]: Failed password for root from 167.172.43.167 port 55392 ssh2
Sep 30 18:28:43 compute-1 nova_compute[238822]: 2025-09-30 18:28:43.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:43 compute-1 ceph-mon[75484]: pgmap v1502: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 40 GiB / 40 GiB avail; 597 B/s rd, 0 op/s
Sep 30 18:28:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:43 compute-1 sshd-session[285744]: Received disconnect from 167.172.43.167 port 55392:11: Bye Bye [preauth]
Sep 30 18:28:43 compute-1 sshd-session[285744]: Disconnected from authenticating user root 167.172.43.167 port 55392 [preauth]
Sep 30 18:28:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:28:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:44.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:44 compute-1 nova_compute[238822]: 2025-09-30 18:28:44.945 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:28:45 compute-1 nova_compute[238822]: 2025-09-30 18:28:45.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:28:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:45.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:45 compute-1 ceph-mon[75484]: pgmap v1503: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:28:45 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1401032979' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:28:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:46 compute-1 nova_compute[238822]: 2025-09-30 18:28:46.052 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:28:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:46.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:47.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:47 compute-1 podman[285827]: 2025-09-30 18:28:47.527981231 +0000 UTC m=+0.072225743 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 18:28:47 compute-1 ceph-mon[75484]: pgmap v1504: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:28:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:48 compute-1 nova_compute[238822]: 2025-09-30 18:28:48.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:28:48 compute-1 nova_compute[238822]: 2025-09-30 18:28:48.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:28:48 compute-1 nova_compute[238822]: 2025-09-30 18:28:48.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:48 compute-1 nova_compute[238822]: 2025-09-30 18:28:48.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:48.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:48 compute-1 unix_chkpwd[285848]: password check failed for user (root)
Sep 30 18:28:48 compute-1 sshd-session[285824]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:28:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:28:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:49.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:49 compute-1 openstack_network_exporter[251957]: ERROR   18:28:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:28:49 compute-1 openstack_network_exporter[251957]: ERROR   18:28:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:28:49 compute-1 openstack_network_exporter[251957]: ERROR   18:28:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:28:49 compute-1 openstack_network_exporter[251957]: ERROR   18:28:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:28:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:28:49 compute-1 openstack_network_exporter[251957]: ERROR   18:28:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:28:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:28:49 compute-1 ceph-mon[75484]: pgmap v1505: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 40 GiB / 40 GiB avail; 655 KiB/s rd, 1.8 MiB/s wr, 51 op/s
Sep 30 18:28:49 compute-1 nova_compute[238822]: 2025-09-30 18:28:49.808 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:49 compute-1 nova_compute[238822]: 2025-09-30 18:28:49.808 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:50 compute-1 nova_compute[238822]: 2025-09-30 18:28:50.316 2 DEBUG nova.compute.manager [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:28:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:50.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:50 compute-1 nova_compute[238822]: 2025-09-30 18:28:50.890 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:50 compute-1 nova_compute[238822]: 2025-09-30 18:28:50.891 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:50 compute-1 nova_compute[238822]: 2025-09-30 18:28:50.900 2 DEBUG nova.virt.hardware [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:28:50 compute-1 nova_compute[238822]: 2025-09-30 18:28:50.900 2 INFO nova.compute.claims [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:28:50 compute-1 sshd-session[285824]: Failed password for root from 192.210.160.141 port 48718 ssh2
Sep 30 18:28:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:51.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:51 compute-1 sshd-session[285824]: Connection closed by authenticating user root 192.210.160.141 port 48718 [preauth]
Sep 30 18:28:51 compute-1 ceph-mon[75484]: pgmap v1506: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Sep 30 18:28:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:52 compute-1 nova_compute[238822]: 2025-09-30 18:28:52.153 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:28:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:52.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:52 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:28:52 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/509009086' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:28:52 compute-1 nova_compute[238822]: 2025-09-30 18:28:52.569 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:28:52 compute-1 nova_compute[238822]: 2025-09-30 18:28:52.576 2 DEBUG nova.compute.provider_tree [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:28:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:28:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/509009086' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:28:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:53 compute-1 nova_compute[238822]: 2025-09-30 18:28:53.087 2 DEBUG nova.scheduler.client.report [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:28:53 compute-1 nova_compute[238822]: 2025-09-30 18:28:53.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:53.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:53 compute-1 nova_compute[238822]: 2025-09-30 18:28:53.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:53 compute-1 sshd-session[285854]: Invalid user flutter from 14.225.167.110 port 60420
Sep 30 18:28:53 compute-1 sshd-session[285854]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:28:53 compute-1 sshd-session[285854]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:28:53 compute-1 podman[285878]: 2025-09-30 18:28:53.558353026 +0000 UTC m=+0.098192615 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:28:53 compute-1 podman[285880]: 2025-09-30 18:28:53.565448698 +0000 UTC m=+0.092241274 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:28:53 compute-1 podman[285879]: 2025-09-30 18:28:53.574659927 +0000 UTC m=+0.105547944 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.33.7, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:28:53 compute-1 nova_compute[238822]: 2025-09-30 18:28:53.605 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.714s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:53 compute-1 nova_compute[238822]: 2025-09-30 18:28:53.606 2 DEBUG nova.compute.manager [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:28:53 compute-1 ceph-mon[75484]: pgmap v1507: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Sep 30 18:28:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:28:54 compute-1 nova_compute[238822]: 2025-09-30 18:28:54.129 2 DEBUG nova.compute.manager [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:28:54 compute-1 nova_compute[238822]: 2025-09-30 18:28:54.130 2 DEBUG nova.network.neutron [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:28:54 compute-1 nova_compute[238822]: 2025-09-30 18:28:54.131 2 WARNING neutronclient.v2_0.client [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:54 compute-1 nova_compute[238822]: 2025-09-30 18:28:54.131 2 WARNING neutronclient.v2_0.client [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:54.388 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:54.388 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:28:54.388 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:54.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:54 compute-1 nova_compute[238822]: 2025-09-30 18:28:54.644 2 INFO nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:28:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:54 compute-1 nova_compute[238822]: 2025-09-30 18:28:54.916 2 DEBUG nova.network.neutron [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Successfully created port: 23c72539-ad46-4b4c-9724-c1e61705efc1 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:28:55 compute-1 nova_compute[238822]: 2025-09-30 18:28:55.154 2 DEBUG nova.compute.manager [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:28:55 compute-1 sudo[285941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:28:55 compute-1 sudo[285941]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:28:55 compute-1 sudo[285941]: pam_unix(sudo:session): session closed for user root
Sep 30 18:28:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:55.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:55 compute-1 nova_compute[238822]: 2025-09-30 18:28:55.763 2 DEBUG nova.network.neutron [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Successfully updated port: 23c72539-ad46-4b4c-9724-c1e61705efc1 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:28:55 compute-1 ceph-mon[75484]: pgmap v1508: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Sep 30 18:28:55 compute-1 nova_compute[238822]: 2025-09-30 18:28:55.844 2 DEBUG nova.compute.manager [req-988d615a-c30f-4b31-af93-1b2ed775b6ca req-a5976f4b-8d86-4659-9b65-531ba272fbf7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Received event network-changed-23c72539-ad46-4b4c-9724-c1e61705efc1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:28:55 compute-1 nova_compute[238822]: 2025-09-30 18:28:55.845 2 DEBUG nova.compute.manager [req-988d615a-c30f-4b31-af93-1b2ed775b6ca req-a5976f4b-8d86-4659-9b65-531ba272fbf7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Refreshing instance network info cache due to event network-changed-23c72539-ad46-4b4c-9724-c1e61705efc1. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:28:55 compute-1 nova_compute[238822]: 2025-09-30 18:28:55.845 2 DEBUG oslo_concurrency.lockutils [req-988d615a-c30f-4b31-af93-1b2ed775b6ca req-a5976f4b-8d86-4659-9b65-531ba272fbf7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-01bf4ef9-56ec-4065-aa2b-416af7c5f636" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:28:55 compute-1 nova_compute[238822]: 2025-09-30 18:28:55.846 2 DEBUG oslo_concurrency.lockutils [req-988d615a-c30f-4b31-af93-1b2ed775b6ca req-a5976f4b-8d86-4659-9b65-531ba272fbf7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-01bf4ef9-56ec-4065-aa2b-416af7c5f636" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:28:55 compute-1 nova_compute[238822]: 2025-09-30 18:28:55.846 2 DEBUG nova.network.neutron [req-988d615a-c30f-4b31-af93-1b2ed775b6ca req-a5976f4b-8d86-4659-9b65-531ba272fbf7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Refreshing network info cache for port 23c72539-ad46-4b4c-9724-c1e61705efc1 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:28:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:56 compute-1 sshd-session[285854]: Failed password for invalid user flutter from 14.225.167.110 port 60420 ssh2
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.180 2 DEBUG nova.compute.manager [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.182 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.182 2 INFO nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Creating image(s)
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.227 2 DEBUG nova.storage.rbd_utils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.269 2 DEBUG nova.storage.rbd_utils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.315 2 DEBUG nova.storage.rbd_utils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.323 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.337 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "refresh_cache-01bf4ef9-56ec-4065-aa2b-416af7c5f636" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.358 2 WARNING neutronclient.v2_0.client [req-988d615a-c30f-4b31-af93-1b2ed775b6ca req-a5976f4b-8d86-4659-9b65-531ba272fbf7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.416 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.416 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.417 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.418 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:28:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:56.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.460 2 DEBUG nova.storage.rbd_utils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:28:56 compute-1 nova_compute[238822]: 2025-09-30 18:28:56.465 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:28:56 compute-1 sshd-session[285854]: Received disconnect from 14.225.167.110 port 60420:11: Bye Bye [preauth]
Sep 30 18:28:56 compute-1 sshd-session[285854]: Disconnected from invalid user flutter 14.225.167.110 port 60420 [preauth]
Sep 30 18:28:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:57 compute-1 nova_compute[238822]: 2025-09-30 18:28:57.040 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.575s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:28:57 compute-1 nova_compute[238822]: 2025-09-30 18:28:57.128 2 DEBUG nova.storage.rbd_utils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] resizing rbd image 01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:28:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:57.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:57 compute-1 nova_compute[238822]: 2025-09-30 18:28:57.295 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:28:57 compute-1 nova_compute[238822]: 2025-09-30 18:28:57.296 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Ensure instance console log exists: /var/lib/nova/instances/01bf4ef9-56ec-4065-aa2b-416af7c5f636/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:28:57 compute-1 nova_compute[238822]: 2025-09-30 18:28:57.296 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:28:57 compute-1 nova_compute[238822]: 2025-09-30 18:28:57.297 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:28:57 compute-1 nova_compute[238822]: 2025-09-30 18:28:57.297 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:28:57 compute-1 nova_compute[238822]: 2025-09-30 18:28:57.447 2 DEBUG nova.network.neutron [req-988d615a-c30f-4b31-af93-1b2ed775b6ca req-a5976f4b-8d86-4659-9b65-531ba272fbf7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:28:57 compute-1 ceph-mon[75484]: pgmap v1509: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:28:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/554884993' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:28:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/554884993' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:28:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:58 compute-1 nova_compute[238822]: 2025-09-30 18:28:58.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:58 compute-1 nova_compute[238822]: 2025-09-30 18:28:58.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:28:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:28:58.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:58 compute-1 nova_compute[238822]: 2025-09-30 18:28:58.496 2 DEBUG nova.network.neutron [req-988d615a-c30f-4b31-af93-1b2ed775b6ca req-a5976f4b-8d86-4659-9b65-531ba272fbf7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:28:58 compute-1 unix_chkpwd[286137]: password check failed for user (root)
Sep 30 18:28:58 compute-1 sshd-session[286135]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201  user=root
Sep 30 18:28:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:28:59 compute-1 nova_compute[238822]: 2025-09-30 18:28:59.039 2 DEBUG oslo_concurrency.lockutils [req-988d615a-c30f-4b31-af93-1b2ed775b6ca req-a5976f4b-8d86-4659-9b65-531ba272fbf7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-01bf4ef9-56ec-4065-aa2b-416af7c5f636" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:28:59 compute-1 nova_compute[238822]: 2025-09-30 18:28:59.040 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquired lock "refresh_cache-01bf4ef9-56ec-4065-aa2b-416af7c5f636" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:28:59 compute-1 nova_compute[238822]: 2025-09-30 18:28:59.040 2 DEBUG nova.network.neutron [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:28:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:28:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:28:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:28:59.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:28:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:28:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:28:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:28:59 compute-1 ceph-mon[75484]: pgmap v1510: 353 pgs: 353 active+clean; 133 MiB data, 288 MiB used, 40 GiB / 40 GiB avail; 2.0 MiB/s rd, 1.4 MiB/s wr, 113 op/s
Sep 30 18:29:00 compute-1 nova_compute[238822]: 2025-09-30 18:29:00.437 2 DEBUG nova.network.neutron [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:29:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:00.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:01 compute-1 sshd-session[286135]: Failed password for root from 8.243.64.201 port 41710 ssh2
Sep 30 18:29:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:01.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:01 compute-1 nova_compute[238822]: 2025-09-30 18:29:01.421 2 WARNING neutronclient.v2_0.client [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:01 compute-1 sshd-session[286135]: Received disconnect from 8.243.64.201 port 41710:11: Bye Bye [preauth]
Sep 30 18:29:01 compute-1 sshd-session[286135]: Disconnected from authenticating user root 8.243.64.201 port 41710 [preauth]
Sep 30 18:29:01 compute-1 nova_compute[238822]: 2025-09-30 18:29:01.643 2 DEBUG nova.network.neutron [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Updating instance_info_cache with network_info: [{"id": "23c72539-ad46-4b4c-9724-c1e61705efc1", "address": "fa:16:3e:d9:52:b2", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c72539-ad", "ovs_interfaceid": "23c72539-ad46-4b4c-9724-c1e61705efc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:29:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:02 compute-1 ceph-mon[75484]: pgmap v1511: 353 pgs: 353 active+clean; 167 MiB data, 318 MiB used, 40 GiB / 40 GiB avail; 1.5 MiB/s rd, 3.9 MiB/s wr, 135 op/s
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.155 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Releasing lock "refresh_cache-01bf4ef9-56ec-4065-aa2b-416af7c5f636" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.156 2 DEBUG nova.compute.manager [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Instance network_info: |[{"id": "23c72539-ad46-4b4c-9724-c1e61705efc1", "address": "fa:16:3e:d9:52:b2", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c72539-ad", "ovs_interfaceid": "23c72539-ad46-4b4c-9724-c1e61705efc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.160 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Start _get_guest_xml network_info=[{"id": "23c72539-ad46-4b4c-9724-c1e61705efc1", "address": "fa:16:3e:d9:52:b2", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c72539-ad", "ovs_interfaceid": "23c72539-ad46-4b4c-9724-c1e61705efc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.167 2 WARNING nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.169 2 DEBUG nova.virt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-1703329817', uuid='01bf4ef9-56ec-4065-aa2b-416af7c5f636'), owner=OwnerMeta(userid='623ef4a55c9e4fc28bb65e49246b5008', username='tempest-TestExecuteStrategies-1883747907-project-admin', projectid='c634e1c17ed54907969576a0eb8eff50', projectname='tempest-TestExecuteStrategies-1883747907'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "23c72539-ad46-4b4c-9724-c1e61705efc1", "address": "fa:16:3e:d9:52:b2", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c72539-ad", "ovs_interfaceid": "23c72539-ad46-4b4c-9724-c1e61705efc1", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759256942.169375) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.175 2 DEBUG nova.virt.libvirt.host [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.176 2 DEBUG nova.virt.libvirt.host [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.180 2 DEBUG nova.virt.libvirt.host [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.181 2 DEBUG nova.virt.libvirt.host [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.182 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.182 2 DEBUG nova.virt.hardware [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.183 2 DEBUG nova.virt.hardware [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.183 2 DEBUG nova.virt.hardware [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.184 2 DEBUG nova.virt.hardware [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.184 2 DEBUG nova.virt.hardware [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.185 2 DEBUG nova.virt.hardware [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.185 2 DEBUG nova.virt.hardware [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.186 2 DEBUG nova.virt.hardware [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.186 2 DEBUG nova.virt.hardware [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.186 2 DEBUG nova.virt.hardware [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.187 2 DEBUG nova.virt.hardware [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.192 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:29:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:02.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:29:02 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2129834140' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.651 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.694 2 DEBUG nova.storage.rbd_utils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:29:02 compute-1 nova_compute[238822]: 2025-09-30 18:29:02.700 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:29:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2129834140' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:29:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:29:03 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1810602024' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.147 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.150 2 DEBUG nova.virt.libvirt.vif [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:28:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1703329817',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1703329817',id=21,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-96hm8nqk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admi
n'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:28:55Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=01bf4ef9-56ec-4065-aa2b-416af7c5f636,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23c72539-ad46-4b4c-9724-c1e61705efc1", "address": "fa:16:3e:d9:52:b2", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c72539-ad", "ovs_interfaceid": "23c72539-ad46-4b4c-9724-c1e61705efc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.151 2 DEBUG nova.network.os_vif_util [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "23c72539-ad46-4b4c-9724-c1e61705efc1", "address": "fa:16:3e:d9:52:b2", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c72539-ad", "ovs_interfaceid": "23c72539-ad46-4b4c-9724-c1e61705efc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.152 2 DEBUG nova.network.os_vif_util [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:52:b2,bridge_name='br-int',has_traffic_filtering=True,id=23c72539-ad46-4b4c-9724-c1e61705efc1,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23c72539-ad') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.154 2 DEBUG nova.objects.instance [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lazy-loading 'pci_devices' on Instance uuid 01bf4ef9-56ec-4065-aa2b-416af7c5f636 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:03.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.664 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:29:03 compute-1 nova_compute[238822]:   <uuid>01bf4ef9-56ec-4065-aa2b-416af7c5f636</uuid>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   <name>instance-00000015</name>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteStrategies-server-1703329817</nova:name>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:29:02</nova:creationTime>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:29:03 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:29:03 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:user uuid="623ef4a55c9e4fc28bb65e49246b5008">tempest-TestExecuteStrategies-1883747907-project-admin</nova:user>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:project uuid="c634e1c17ed54907969576a0eb8eff50">tempest-TestExecuteStrategies-1883747907</nova:project>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <nova:port uuid="23c72539-ad46-4b4c-9724-c1e61705efc1">
Sep 30 18:29:03 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <system>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <entry name="serial">01bf4ef9-56ec-4065-aa2b-416af7c5f636</entry>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <entry name="uuid">01bf4ef9-56ec-4065-aa2b-416af7c5f636</entry>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     </system>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   <os>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   </os>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   <features>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   </features>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk">
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       </source>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk.config">
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       </source>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:29:03 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:d9:52:b2"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <target dev="tap23c72539-ad"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/01bf4ef9-56ec-4065-aa2b-416af7c5f636/console.log" append="off"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <video>
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     </video>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:29:03 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:29:03 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:29:03 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:29:03 compute-1 nova_compute[238822]: </domain>
Sep 30 18:29:03 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.665 2 DEBUG nova.compute.manager [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Preparing to wait for external event network-vif-plugged-23c72539-ad46-4b4c-9724-c1e61705efc1 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.665 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.665 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.666 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.667 2 DEBUG nova.virt.libvirt.vif [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:28:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1703329817',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1703329817',id=21,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-96hm8nqk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:28:55Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=01bf4ef9-56ec-4065-aa2b-416af7c5f636,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23c72539-ad46-4b4c-9724-c1e61705efc1", "address": "fa:16:3e:d9:52:b2", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c72539-ad", "ovs_interfaceid": "23c72539-ad46-4b4c-9724-c1e61705efc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.667 2 DEBUG nova.network.os_vif_util [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "23c72539-ad46-4b4c-9724-c1e61705efc1", "address": "fa:16:3e:d9:52:b2", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c72539-ad", "ovs_interfaceid": "23c72539-ad46-4b4c-9724-c1e61705efc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.668 2 DEBUG nova.network.os_vif_util [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:52:b2,bridge_name='br-int',has_traffic_filtering=True,id=23c72539-ad46-4b4c-9724-c1e61705efc1,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23c72539-ad') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.669 2 DEBUG os_vif [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:52:b2,bridge_name='br-int',has_traffic_filtering=True,id=23c72539-ad46-4b4c-9724-c1e61705efc1,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23c72539-ad') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.670 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.671 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.672 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c898a1bc-ff44-59f9-8043-56350e4c2d74', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.682 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23c72539-ad, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.683 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap23c72539-ad, col_values=(('qos', UUID('53a8b1c7-30b0-4638-8dc4-20318a223e49')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.684 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap23c72539-ad, col_values=(('external_ids', {'iface-id': '23c72539-ad46-4b4c-9724-c1e61705efc1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:52:b2', 'vm-uuid': '01bf4ef9-56ec-4065-aa2b-416af7c5f636'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:03 compute-1 NetworkManager[45549]: <info>  [1759256943.6873] manager: (tap23c72539-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:03 compute-1 nova_compute[238822]: 2025-09-30 18:29:03.746 2 INFO os_vif [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:52:b2,bridge_name='br-int',has_traffic_filtering=True,id=23c72539-ad46-4b4c-9724-c1e61705efc1,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23c72539-ad')
Sep 30 18:29:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:29:04 compute-1 ceph-mon[75484]: pgmap v1512: 353 pgs: 353 active+clean; 167 MiB data, 318 MiB used, 40 GiB / 40 GiB avail; 210 KiB/s rd, 3.9 MiB/s wr, 84 op/s
Sep 30 18:29:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1810602024' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:29:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:04.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:05.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:05 compute-1 nova_compute[238822]: 2025-09-30 18:29:05.359 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:29:05 compute-1 nova_compute[238822]: 2025-09-30 18:29:05.360 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:29:05 compute-1 nova_compute[238822]: 2025-09-30 18:29:05.360 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] No VIF found with MAC fa:16:3e:d9:52:b2, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:29:05 compute-1 nova_compute[238822]: 2025-09-30 18:29:05.361 2 INFO nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Using config drive
Sep 30 18:29:05 compute-1 nova_compute[238822]: 2025-09-30 18:29:05.401 2 DEBUG nova.storage.rbd_utils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:29:05 compute-1 podman[249638]: time="2025-09-30T18:29:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:29:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:29:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:29:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:29:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8355 "" "Go-http-client/1.1"
Sep 30 18:29:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:05 compute-1 nova_compute[238822]: 2025-09-30 18:29:05.921 2 WARNING neutronclient.v2_0.client [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:06 compute-1 ceph-mon[75484]: pgmap v1513: 353 pgs: 353 active+clean; 167 MiB data, 318 MiB used, 40 GiB / 40 GiB avail; 211 KiB/s rd, 3.9 MiB/s wr, 85 op/s
Sep 30 18:29:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:06.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:06 compute-1 nova_compute[238822]: 2025-09-30 18:29:06.561 2 INFO nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Creating config drive at /var/lib/nova/instances/01bf4ef9-56ec-4065-aa2b-416af7c5f636/disk.config
Sep 30 18:29:06 compute-1 nova_compute[238822]: 2025-09-30 18:29:06.571 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/01bf4ef9-56ec-4065-aa2b-416af7c5f636/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp0wwpoj3k execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:29:06 compute-1 nova_compute[238822]: 2025-09-30 18:29:06.719 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/01bf4ef9-56ec-4065-aa2b-416af7c5f636/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp0wwpoj3k" returned: 0 in 0.149s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:29:06 compute-1 nova_compute[238822]: 2025-09-30 18:29:06.765 2 DEBUG nova.storage.rbd_utils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:29:06 compute-1 nova_compute[238822]: 2025-09-30 18:29:06.770 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/01bf4ef9-56ec-4065-aa2b-416af7c5f636/disk.config 01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:29:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:06 compute-1 nova_compute[238822]: 2025-09-30 18:29:06.992 2 DEBUG oslo_concurrency.processutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/01bf4ef9-56ec-4065-aa2b-416af7c5f636/disk.config 01bf4ef9-56ec-4065-aa2b-416af7c5f636_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.221s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:29:06 compute-1 nova_compute[238822]: 2025-09-30 18:29:06.993 2 INFO nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Deleting local config drive /var/lib/nova/instances/01bf4ef9-56ec-4065-aa2b-416af7c5f636/disk.config because it was imported into RBD.
Sep 30 18:29:07 compute-1 kernel: tap23c72539-ad: entered promiscuous mode
Sep 30 18:29:07 compute-1 NetworkManager[45549]: <info>  [1759256947.0832] manager: (tap23c72539-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Sep 30 18:29:07 compute-1 ovn_controller[135204]: 2025-09-30T18:29:07Z|00172|binding|INFO|Claiming lport 23c72539-ad46-4b4c-9724-c1e61705efc1 for this chassis.
Sep 30 18:29:07 compute-1 ovn_controller[135204]: 2025-09-30T18:29:07Z|00173|binding|INFO|23c72539-ad46-4b4c-9724-c1e61705efc1: Claiming fa:16:3e:d9:52:b2 10.100.0.6
Sep 30 18:29:07 compute-1 nova_compute[238822]: 2025-09-30 18:29:07.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.097 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:52:b2 10.100.0.6'], port_security=['fa:16:3e:d9:52:b2 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '01bf4ef9-56ec-4065-aa2b-416af7c5f636', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '4', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=23c72539-ad46-4b4c-9724-c1e61705efc1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.098 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 23c72539-ad46-4b4c-9724-c1e61705efc1 in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 bound to our chassis
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.100 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.118 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[538eec0c-e0c9-436a-b3df-a95bdcf86e37]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 nova_compute[238822]: 2025-09-30 18:29:07.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.119 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6901f664-31 in ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:29:07 compute-1 ovn_controller[135204]: 2025-09-30T18:29:07Z|00174|binding|INFO|Setting lport 23c72539-ad46-4b4c-9724-c1e61705efc1 ovn-installed in OVS
Sep 30 18:29:07 compute-1 ovn_controller[135204]: 2025-09-30T18:29:07Z|00175|binding|INFO|Setting lport 23c72539-ad46-4b4c-9724-c1e61705efc1 up in Southbound
Sep 30 18:29:07 compute-1 nova_compute[238822]: 2025-09-30 18:29:07.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.124 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6901f664-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.125 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[482bda21-0d68-4d61-b036-df0ef50c9490]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.125 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[56476a9a-028a-4b8d-9686-656964a32023]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.137 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[a64b6369-5c80-456d-9145-c208e20e37e8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 systemd-udevd[286285]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.144 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c3aa0439-2f9c-4aa8-9f1e-e58dae39f776]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 systemd-machined[195911]: New machine qemu-15-instance-00000015.
Sep 30 18:29:07 compute-1 NetworkManager[45549]: <info>  [1759256947.1619] device (tap23c72539-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:29:07 compute-1 systemd[1]: Started Virtual Machine qemu-15-instance-00000015.
Sep 30 18:29:07 compute-1 NetworkManager[45549]: <info>  [1759256947.1640] device (tap23c72539-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.179 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[e29e1fa6-5039-49db-bccf-f9234ea4b55b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.186 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8b6328d1-bb34-4f40-8a2c-c484a62dc931]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 NetworkManager[45549]: <info>  [1759256947.1885] manager: (tap6901f664-30): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.231 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[0fd472fb-f5e3-49ec-b483-4000eb68a046]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.235 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[24f7e461-e59c-44c3-a679-113227963779]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:07.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:07 compute-1 NetworkManager[45549]: <info>  [1759256947.2793] device (tap6901f664-30): carrier: link connected
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.292 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[d6fa66d3-2d0e-42b4-ab9d-710e8e24c7d6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.318 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6333fc8e-e7a0-4986-a255-734f35a002b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1478885, 'reachable_time': 32666, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286317, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.336 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[1e1282b0-4d60-457a-a185-3751ee9249ab]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:412a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1478885, 'tstamp': 1478885}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286319, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.351 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5206aeeb-eae7-4d51-b364-0c659a5eca38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1478885, 'reachable_time': 32666, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 286320, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.382 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5dcd736b-0630-46fc-ac53-4624520d0a41]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.442 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a07f2188-a9be-4464-98ae-d9d281e98dd8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.443 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.443 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.444 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:07 compute-1 nova_compute[238822]: 2025-09-30 18:29:07.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:07 compute-1 kernel: tap6901f664-30: entered promiscuous mode
Sep 30 18:29:07 compute-1 NetworkManager[45549]: <info>  [1759256947.4473] manager: (tap6901f664-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Sep 30 18:29:07 compute-1 nova_compute[238822]: 2025-09-30 18:29:07.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.450 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:07 compute-1 nova_compute[238822]: 2025-09-30 18:29:07.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:07 compute-1 ovn_controller[135204]: 2025-09-30T18:29:07Z|00176|binding|INFO|Releasing lport 5b6cbf18-1826-41d0-920f-e9db4f1a1832 from this chassis (sb_readonly=0)
Sep 30 18:29:07 compute-1 nova_compute[238822]: 2025-09-30 18:29:07.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.453 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c91f639c-c796-41fa-b73d-474d9fcea024]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.454 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.454 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.454 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 6901f664-336b-42d2-bbf7-58951befc8d1 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.454 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.454 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[536efb4a-24ca-4a5b-8179-66eac872fa68]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.454 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.455 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[14d4f540-dfa9-498b-a052-599f511032bf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.455 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:29:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:07.456 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'env', 'PROCESS_TAG=haproxy-6901f664-336b-42d2-bbf7-58951befc8d1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6901f664-336b-42d2-bbf7-58951befc8d1.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:29:07 compute-1 nova_compute[238822]: 2025-09-30 18:29:07.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:07 compute-1 nova_compute[238822]: 2025-09-30 18:29:07.517 2 DEBUG nova.compute.manager [req-d51b8ff8-7056-472b-970d-6a205ee9d573 req-ec19fc74-433e-4baf-8c86-f36517a82205 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Received event network-vif-plugged-23c72539-ad46-4b4c-9724-c1e61705efc1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:29:07 compute-1 nova_compute[238822]: 2025-09-30 18:29:07.518 2 DEBUG oslo_concurrency.lockutils [req-d51b8ff8-7056-472b-970d-6a205ee9d573 req-ec19fc74-433e-4baf-8c86-f36517a82205 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:07 compute-1 nova_compute[238822]: 2025-09-30 18:29:07.518 2 DEBUG oslo_concurrency.lockutils [req-d51b8ff8-7056-472b-970d-6a205ee9d573 req-ec19fc74-433e-4baf-8c86-f36517a82205 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:07 compute-1 nova_compute[238822]: 2025-09-30 18:29:07.518 2 DEBUG oslo_concurrency.lockutils [req-d51b8ff8-7056-472b-970d-6a205ee9d573 req-ec19fc74-433e-4baf-8c86-f36517a82205 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:07 compute-1 nova_compute[238822]: 2025-09-30 18:29:07.519 2 DEBUG nova.compute.manager [req-d51b8ff8-7056-472b-970d-6a205ee9d573 req-ec19fc74-433e-4baf-8c86-f36517a82205 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Processing event network-vif-plugged-23c72539-ad46-4b4c-9724-c1e61705efc1 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:29:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:07 compute-1 podman[286396]: 2025-09-30 18:29:07.94404805 +0000 UTC m=+0.083489458 container create 022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 18:29:07 compute-1 podman[286396]: 2025-09-30 18:29:07.902726823 +0000 UTC m=+0.042168271 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:29:08 compute-1 systemd[1]: Started libpod-conmon-022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe.scope.
Sep 30 18:29:08 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:29:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/750b3dd671d1e6dd98df0972e63df2f169433eacd5e8e2001b136b0fa5ba52ad/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:29:08 compute-1 ceph-mon[75484]: pgmap v1514: 353 pgs: 353 active+clean; 167 MiB data, 318 MiB used, 40 GiB / 40 GiB avail; 211 KiB/s rd, 3.9 MiB/s wr, 84 op/s
Sep 30 18:29:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:29:08 compute-1 podman[286396]: 2025-09-30 18:29:08.076653285 +0000 UTC m=+0.216094683 container init 022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Sep 30 18:29:08 compute-1 podman[286396]: 2025-09-30 18:29:08.087762935 +0000 UTC m=+0.227204303 container start 022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:29:08 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[286411]: [NOTICE]   (286415) : New worker (286418) forked
Sep 30 18:29:08 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[286411]: [NOTICE]   (286415) : Loading success.
Sep 30 18:29:08 compute-1 nova_compute[238822]: 2025-09-30 18:29:08.174 2 DEBUG nova.compute.manager [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:29:08 compute-1 nova_compute[238822]: 2025-09-30 18:29:08.179 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:29:08 compute-1 nova_compute[238822]: 2025-09-30 18:29:08.182 2 INFO nova.virt.libvirt.driver [-] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Instance spawned successfully.
Sep 30 18:29:08 compute-1 nova_compute[238822]: 2025-09-30 18:29:08.183 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:29:08 compute-1 sshd-session[286258]: Invalid user colin from 103.153.190.105 port 52140
Sep 30 18:29:08 compute-1 sshd-session[286258]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:29:08 compute-1 sshd-session[286258]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:29:08 compute-1 nova_compute[238822]: 2025-09-30 18:29:08.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:08.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:08 compute-1 nova_compute[238822]: 2025-09-30 18:29:08.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:08 compute-1 nova_compute[238822]: 2025-09-30 18:29:08.703 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:29:08 compute-1 nova_compute[238822]: 2025-09-30 18:29:08.704 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:29:08 compute-1 nova_compute[238822]: 2025-09-30 18:29:08.705 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:29:08 compute-1 nova_compute[238822]: 2025-09-30 18:29:08.706 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:29:08 compute-1 nova_compute[238822]: 2025-09-30 18:29:08.707 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:29:08 compute-1 nova_compute[238822]: 2025-09-30 18:29:08.708 2 DEBUG nova.virt.libvirt.driver [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:29:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:29:09 compute-1 nova_compute[238822]: 2025-09-30 18:29:09.223 2 INFO nova.compute.manager [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Took 13.04 seconds to spawn the instance on the hypervisor.
Sep 30 18:29:09 compute-1 nova_compute[238822]: 2025-09-30 18:29:09.224 2 DEBUG nova.compute.manager [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:29:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:09.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:09 compute-1 nova_compute[238822]: 2025-09-30 18:29:09.577 2 DEBUG nova.compute.manager [req-c1422b19-270b-4d2a-9183-8f5a8a90c4d5 req-74feb282-2e37-4c54-9d03-c54218526c39 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Received event network-vif-plugged-23c72539-ad46-4b4c-9724-c1e61705efc1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:29:09 compute-1 nova_compute[238822]: 2025-09-30 18:29:09.577 2 DEBUG oslo_concurrency.lockutils [req-c1422b19-270b-4d2a-9183-8f5a8a90c4d5 req-74feb282-2e37-4c54-9d03-c54218526c39 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:09 compute-1 nova_compute[238822]: 2025-09-30 18:29:09.577 2 DEBUG oslo_concurrency.lockutils [req-c1422b19-270b-4d2a-9183-8f5a8a90c4d5 req-74feb282-2e37-4c54-9d03-c54218526c39 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:09 compute-1 nova_compute[238822]: 2025-09-30 18:29:09.578 2 DEBUG oslo_concurrency.lockutils [req-c1422b19-270b-4d2a-9183-8f5a8a90c4d5 req-74feb282-2e37-4c54-9d03-c54218526c39 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:09 compute-1 nova_compute[238822]: 2025-09-30 18:29:09.578 2 DEBUG nova.compute.manager [req-c1422b19-270b-4d2a-9183-8f5a8a90c4d5 req-74feb282-2e37-4c54-9d03-c54218526c39 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] No waiting events found dispatching network-vif-plugged-23c72539-ad46-4b4c-9724-c1e61705efc1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:29:09 compute-1 nova_compute[238822]: 2025-09-30 18:29:09.578 2 WARNING nova.compute.manager [req-c1422b19-270b-4d2a-9183-8f5a8a90c4d5 req-74feb282-2e37-4c54-9d03-c54218526c39 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Received unexpected event network-vif-plugged-23c72539-ad46-4b4c-9724-c1e61705efc1 for instance with vm_state active and task_state None.
Sep 30 18:29:09 compute-1 nova_compute[238822]: 2025-09-30 18:29:09.766 2 INFO nova.compute.manager [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Took 18.92 seconds to build instance.
Sep 30 18:29:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:10 compute-1 ceph-mon[75484]: pgmap v1515: 353 pgs: 353 active+clean; 167 MiB data, 318 MiB used, 40 GiB / 40 GiB avail; 213 KiB/s rd, 3.9 MiB/s wr, 88 op/s
Sep 30 18:29:10 compute-1 sshd-session[286258]: Failed password for invalid user colin from 103.153.190.105 port 52140 ssh2
Sep 30 18:29:10 compute-1 nova_compute[238822]: 2025-09-30 18:29:10.275 2 DEBUG oslo_concurrency.lockutils [None req-330134b2-8057-469c-b097-462a5d76f725 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.466s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:10.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:11.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:11 compute-1 sshd-session[286258]: Received disconnect from 103.153.190.105 port 52140:11: Bye Bye [preauth]
Sep 30 18:29:11 compute-1 sshd-session[286258]: Disconnected from invalid user colin 103.153.190.105 port 52140 [preauth]
Sep 30 18:29:12 compute-1 ceph-mon[75484]: pgmap v1516: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 40 GiB / 40 GiB avail; 127 KiB/s rd, 2.5 MiB/s wr, 56 op/s
Sep 30 18:29:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:12.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:12 compute-1 podman[286432]: 2025-09-30 18:29:12.578483443 +0000 UTC m=+0.119445980 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:29:12 compute-1 podman[286431]: 2025-09-30 18:29:12.596160571 +0000 UTC m=+0.139750449 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 18:29:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:13.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:13 compute-1 nova_compute[238822]: 2025-09-30 18:29:13.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:13 compute-1 nova_compute[238822]: 2025-09-30 18:29:13.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:29:14 compute-1 ceph-mon[75484]: pgmap v1517: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 40 GiB / 40 GiB avail; 8.6 KiB/s rd, 25 KiB/s wr, 11 op/s
Sep 30 18:29:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:14.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:14 compute-1 unix_chkpwd[286486]: password check failed for user (root)
Sep 30 18:29:14 compute-1 sshd-session[286482]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:29:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:15.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:15 compute-1 sudo[286487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:29:15 compute-1 sudo[286487]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:29:15 compute-1 sudo[286487]: pam_unix(sudo:session): session closed for user root
Sep 30 18:29:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:16 compute-1 ceph-mon[75484]: pgmap v1518: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 28 KiB/s wr, 75 op/s
Sep 30 18:29:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:16.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:17 compute-1 sshd-session[286482]: Failed password for root from 192.210.160.141 port 47808 ssh2
Sep 30 18:29:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:17.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:18 compute-1 ceph-mon[75484]: pgmap v1519: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 28 KiB/s wr, 74 op/s
Sep 30 18:29:18 compute-1 sshd-session[286482]: Connection closed by authenticating user root 192.210.160.141 port 47808 [preauth]
Sep 30 18:29:18 compute-1 nova_compute[238822]: 2025-09-30 18:29:18.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:18.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:18 compute-1 sshd-session[286514]: Invalid user administrator from 216.10.242.161 port 52988
Sep 30 18:29:18 compute-1 podman[286517]: 2025-09-30 18:29:18.549114663 +0000 UTC m=+0.084932457 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, 
tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:29:18 compute-1 sshd-session[286514]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:29:18 compute-1 sshd-session[286514]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:29:18 compute-1 nova_compute[238822]: 2025-09-30 18:29:18.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:29:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:19.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:19 compute-1 openstack_network_exporter[251957]: ERROR   18:29:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:29:19 compute-1 openstack_network_exporter[251957]: ERROR   18:29:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:29:19 compute-1 openstack_network_exporter[251957]: ERROR   18:29:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:29:19 compute-1 openstack_network_exporter[251957]: ERROR   18:29:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:29:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:29:19 compute-1 openstack_network_exporter[251957]: ERROR   18:29:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:29:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:29:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:20 compute-1 ceph-mon[75484]: pgmap v1520: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 29 KiB/s wr, 75 op/s
Sep 30 18:29:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:20.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:20 compute-1 sshd-session[286514]: Failed password for invalid user administrator from 216.10.242.161 port 52988 ssh2
Sep 30 18:29:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:21 compute-1 ovn_controller[135204]: 2025-09-30T18:29:21Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d9:52:b2 10.100.0.6
Sep 30 18:29:21 compute-1 ovn_controller[135204]: 2025-09-30T18:29:21Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d9:52:b2 10.100.0.6
Sep 30 18:29:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:21.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:21 compute-1 nova_compute[238822]: 2025-09-30 18:29:21.494 2 DEBUG nova.virt.libvirt.driver [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Creating tmpfile /var/lib/nova/instances/tmp45k9azpg to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:29:21 compute-1 nova_compute[238822]: 2025-09-30 18:29:21.497 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:21 compute-1 nova_compute[238822]: 2025-09-30 18:29:21.518 2 DEBUG nova.compute.manager [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp45k9azpg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:29:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:22 compute-1 ceph-mon[75484]: pgmap v1521: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 18 KiB/s wr, 72 op/s
Sep 30 18:29:22 compute-1 sshd-session[286514]: Received disconnect from 216.10.242.161 port 52988:11: Bye Bye [preauth]
Sep 30 18:29:22 compute-1 sshd-session[286514]: Disconnected from invalid user administrator 216.10.242.161 port 52988 [preauth]
Sep 30 18:29:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:22.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:29:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:23.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:23 compute-1 nova_compute[238822]: 2025-09-30 18:29:23.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:23 compute-1 nova_compute[238822]: 2025-09-30 18:29:23.574 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:23 compute-1 nova_compute[238822]: 2025-09-30 18:29:23.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:29:24 compute-1 ceph-mon[75484]: pgmap v1522: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 4.7 KiB/s wr, 65 op/s
Sep 30 18:29:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:24.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:24 compute-1 podman[286548]: 2025-09-30 18:29:24.564171265 +0000 UTC m=+0.098972176 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 18:29:24 compute-1 podman[286549]: 2025-09-30 18:29:24.579933912 +0000 UTC m=+0.107910788 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal)
Sep 30 18:29:24 compute-1 podman[286550]: 2025-09-30 18:29:24.598900574 +0000 UTC m=+0.122844571 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 18:29:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:25.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:26 compute-1 ceph-mon[75484]: pgmap v1523: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 129 op/s
Sep 30 18:29:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:26.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:27.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:28 compute-1 nova_compute[238822]: 2025-09-30 18:29:28.074 2 DEBUG nova.compute.manager [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp45k9azpg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='a39db459-001a-467e-8721-1dca3120f5ee',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:29:28 compute-1 ceph-mon[75484]: pgmap v1524: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Sep 30 18:29:28 compute-1 nova_compute[238822]: 2025-09-30 18:29:28.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:28.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:28 compute-1 nova_compute[238822]: 2025-09-30 18:29:28.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:29:29 compute-1 nova_compute[238822]: 2025-09-30 18:29:29.092 2 DEBUG oslo_concurrency.lockutils [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-a39db459-001a-467e-8721-1dca3120f5ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:29:29 compute-1 nova_compute[238822]: 2025-09-30 18:29:29.093 2 DEBUG oslo_concurrency.lockutils [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-a39db459-001a-467e-8721-1dca3120f5ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:29:29 compute-1 nova_compute[238822]: 2025-09-30 18:29:29.093 2 DEBUG nova.network.neutron [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:29:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:29.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:29 compute-1 nova_compute[238822]: 2025-09-30 18:29:29.602 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:30 compute-1 ceph-mon[75484]: pgmap v1525: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Sep 30 18:29:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:30.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:31.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:31 compute-1 nova_compute[238822]: 2025-09-30 18:29:31.430 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:31 compute-1 nova_compute[238822]: 2025-09-30 18:29:31.650 2 DEBUG nova.network.neutron [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Updating instance_info_cache with network_info: [{"id": "4cd1879d-b7f9-410d-8517-ebcb79e59e3c", "address": "fa:16:3e:4e:9a:24", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd1879d-b7", "ovs_interfaceid": "4cd1879d-b7f9-410d-8517-ebcb79e59e3c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:29:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.158 2 DEBUG oslo_concurrency.lockutils [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-a39db459-001a-467e-8721-1dca3120f5ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.181 2 DEBUG nova.virt.libvirt.driver [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp45k9azpg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='a39db459-001a-467e-8721-1dca3120f5ee',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.182 2 DEBUG nova.virt.libvirt.driver [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Creating instance directory: /var/lib/nova/instances/a39db459-001a-467e-8721-1dca3120f5ee pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.182 2 DEBUG nova.virt.libvirt.driver [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Ensure instance console log exists: /var/lib/nova/instances/a39db459-001a-467e-8721-1dca3120f5ee/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.183 2 DEBUG nova.virt.libvirt.driver [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.184 2 DEBUG nova.virt.libvirt.vif [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:28:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1579591444',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1579591444',id=20,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:28:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-ad15oqdw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:28:46Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=a39db459-001a-467e-8721-1dca3120f5ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4cd1879d-b7f9-410d-8517-ebcb79e59e3c", "address": "fa:16:3e:4e:9a:24", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4cd1879d-b7", "ovs_interfaceid": "4cd1879d-b7f9-410d-8517-ebcb79e59e3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.184 2 DEBUG nova.network.os_vif_util [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "4cd1879d-b7f9-410d-8517-ebcb79e59e3c", "address": "fa:16:3e:4e:9a:24", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4cd1879d-b7", "ovs_interfaceid": "4cd1879d-b7f9-410d-8517-ebcb79e59e3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.185 2 DEBUG nova.network.os_vif_util [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:9a:24,bridge_name='br-int',has_traffic_filtering=True,id=4cd1879d-b7f9-410d-8517-ebcb79e59e3c,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4cd1879d-b7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.186 2 DEBUG os_vif [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:9a:24,bridge_name='br-int',has_traffic_filtering=True,id=4cd1879d-b7f9-410d-8517-ebcb79e59e3c,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4cd1879d-b7') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.187 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.188 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.190 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c20c8ca2-450c-5769-8c14-f3be44346c88', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.247 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4cd1879d-b7, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.248 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap4cd1879d-b7, col_values=(('qos', UUID('bdae8e1e-9f4c-4d4f-8d4e-5bd4507b8d81')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.248 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap4cd1879d-b7, col_values=(('external_ids', {'iface-id': '4cd1879d-b7f9-410d-8517-ebcb79e59e3c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:9a:24', 'vm-uuid': 'a39db459-001a-467e-8721-1dca3120f5ee'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:32 compute-1 NetworkManager[45549]: <info>  [1759256972.2515] manager: (tap4cd1879d-b7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.260 2 INFO os_vif [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:9a:24,bridge_name='br-int',has_traffic_filtering=True,id=4cd1879d-b7f9-410d-8517-ebcb79e59e3c,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4cd1879d-b7')
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.261 2 DEBUG nova.virt.libvirt.driver [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.261 2 DEBUG nova.compute.manager [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp45k9azpg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='a39db459-001a-467e-8721-1dca3120f5ee',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.262 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:32 compute-1 ceph-mon[75484]: pgmap v1526: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 40 GiB / 40 GiB avail; 323 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.444 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:32.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:32 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:32.773 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:29:32 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:32.773 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:32 compute-1 nova_compute[238822]: 2025-09-30 18:29:32.857 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:29:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:33.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:33 compute-1 nova_compute[238822]: 2025-09-30 18:29:33.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:33 compute-1 nova_compute[238822]: 2025-09-30 18:29:33.488 2 DEBUG nova.network.neutron [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Port 4cd1879d-b7f9-410d-8517-ebcb79e59e3c updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:29:33 compute-1 nova_compute[238822]: 2025-09-30 18:29:33.501 2 DEBUG nova.compute.manager [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp45k9azpg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='a39db459-001a-467e-8721-1dca3120f5ee',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:29:33 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:33.776 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:29:34 compute-1 ceph-mon[75484]: pgmap v1527: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 40 GiB / 40 GiB avail; 323 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Sep 30 18:29:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:34.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:35.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:35 compute-1 sudo[286620]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:29:35 compute-1 sudo[286620]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:29:35 compute-1 sudo[286620]: pam_unix(sudo:session): session closed for user root
Sep 30 18:29:35 compute-1 podman[249638]: time="2025-09-30T18:29:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:29:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:29:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:29:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:29:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8813 "" "Go-http-client/1.1"
Sep 30 18:29:35 compute-1 sudo[286645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:29:35 compute-1 sudo[286645]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:29:35 compute-1 sudo[286645]: pam_unix(sudo:session): session closed for user root
Sep 30 18:29:35 compute-1 sudo[286670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:29:35 compute-1 sudo[286670]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:29:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:36 compute-1 kernel: tap4cd1879d-b7: entered promiscuous mode
Sep 30 18:29:36 compute-1 NetworkManager[45549]: <info>  [1759256976.1478] manager: (tap4cd1879d-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Sep 30 18:29:36 compute-1 ovn_controller[135204]: 2025-09-30T18:29:36Z|00177|binding|INFO|Claiming lport 4cd1879d-b7f9-410d-8517-ebcb79e59e3c for this additional chassis.
Sep 30 18:29:36 compute-1 ovn_controller[135204]: 2025-09-30T18:29:36Z|00178|binding|INFO|4cd1879d-b7f9-410d-8517-ebcb79e59e3c: Claiming fa:16:3e:4e:9a:24 10.100.0.8
Sep 30 18:29:36 compute-1 nova_compute[238822]: 2025-09-30 18:29:36.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.158 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:9a:24 10.100.0.8'], port_security=['fa:16:3e:4e:9a:24 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a39db459-001a-467e-8721-1dca3120f5ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '10', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=4cd1879d-b7f9-410d-8517-ebcb79e59e3c) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.159 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 4cd1879d-b7f9-410d-8517-ebcb79e59e3c in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.160 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:29:36 compute-1 ovn_controller[135204]: 2025-09-30T18:29:36Z|00179|binding|INFO|Setting lport 4cd1879d-b7f9-410d-8517-ebcb79e59e3c ovn-installed in OVS
Sep 30 18:29:36 compute-1 nova_compute[238822]: 2025-09-30 18:29:36.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:36 compute-1 nova_compute[238822]: 2025-09-30 18:29:36.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.194 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e2360d9e-dd48-4641-83fd-605b603ceaa7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:36 compute-1 systemd-machined[195911]: New machine qemu-16-instance-00000014.
Sep 30 18:29:36 compute-1 systemd[1]: Started Virtual Machine qemu-16-instance-00000014.
Sep 30 18:29:36 compute-1 systemd-udevd[286725]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.228 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d03509-596f-45ab-a524-08f0466296cb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.231 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[9328150e-c5ad-4769-b087-e9d4a5fa2702]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:36 compute-1 NetworkManager[45549]: <info>  [1759256976.2539] device (tap4cd1879d-b7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:29:36 compute-1 NetworkManager[45549]: <info>  [1759256976.2558] device (tap4cd1879d-b7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.273 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[8fb70e75-9676-4ea7-9c4e-64235d9c91c9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.298 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[76e9ee00-fa32-43dc-b729-48a225610192]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1478885, 'reachable_time': 32666, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286738, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.323 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d1730767-5966-4d55-beae-04ccfa4330b8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1478898, 'tstamp': 1478898}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286740, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1478901, 'tstamp': 1478901}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286740, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.325 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:36 compute-1 nova_compute[238822]: 2025-09-30 18:29:36.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:36 compute-1 nova_compute[238822]: 2025-09-30 18:29:36.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.329 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.329 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.330 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.330 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:29:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:36.332 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[fd2f2d39-a882-4074-9225-ffa337126028]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:36 compute-1 ceph-mon[75484]: pgmap v1528: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 40 GiB / 40 GiB avail; 324 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Sep 30 18:29:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:36.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:36 compute-1 sudo[286670]: pam_unix(sudo:session): session closed for user root
Sep 30 18:29:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:37 compute-1 nova_compute[238822]: 2025-09-30 18:29:37.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:37.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3567577078' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:29:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3567577078' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:29:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:29:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:29:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:29:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:29:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:29:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:29:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:29:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:29:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:38 compute-1 nova_compute[238822]: 2025-09-30 18:29:38.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:38 compute-1 ceph-mon[75484]: pgmap v1529: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 16 KiB/s wr, 1 op/s
Sep 30 18:29:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:38.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:38 compute-1 nova_compute[238822]: 2025-09-30 18:29:38.568 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:29:38 compute-1 nova_compute[238822]: 2025-09-30 18:29:38.569 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:29:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:29:39 compute-1 nova_compute[238822]: 2025-09-30 18:29:39.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:29:39 compute-1 ovn_controller[135204]: 2025-09-30T18:29:39Z|00180|binding|INFO|Claiming lport 4cd1879d-b7f9-410d-8517-ebcb79e59e3c for this chassis.
Sep 30 18:29:39 compute-1 ovn_controller[135204]: 2025-09-30T18:29:39Z|00181|binding|INFO|4cd1879d-b7f9-410d-8517-ebcb79e59e3c: Claiming fa:16:3e:4e:9a:24 10.100.0.8
Sep 30 18:29:39 compute-1 ovn_controller[135204]: 2025-09-30T18:29:39Z|00182|binding|INFO|Setting lport 4cd1879d-b7f9-410d-8517-ebcb79e59e3c up in Southbound
Sep 30 18:29:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:39.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:39 compute-1 nova_compute[238822]: 2025-09-30 18:29:39.571 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:39 compute-1 nova_compute[238822]: 2025-09-30 18:29:39.571 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:39 compute-1 nova_compute[238822]: 2025-09-30 18:29:39.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:39 compute-1 nova_compute[238822]: 2025-09-30 18:29:39.572 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:29:39 compute-1 nova_compute[238822]: 2025-09-30 18:29:39.572 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:29:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:40 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:29:40 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1506951630' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:29:40 compute-1 nova_compute[238822]: 2025-09-30 18:29:40.035 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:29:40 compute-1 nova_compute[238822]: 2025-09-30 18:29:40.283 2 INFO nova.compute.manager [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Post operation of migration started
Sep 30 18:29:40 compute-1 nova_compute[238822]: 2025-09-30 18:29:40.284 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:40 compute-1 nova_compute[238822]: 2025-09-30 18:29:40.378 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:40 compute-1 nova_compute[238822]: 2025-09-30 18:29:40.379 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:40 compute-1 ceph-mon[75484]: pgmap v1530: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 16 KiB/s wr, 2 op/s
Sep 30 18:29:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1506951630' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:29:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:40.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.090 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000015 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.090 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000015 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.095 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000014 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.095 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000014 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:29:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:41.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.346 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.348 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.391 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.391 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4356MB free_disk=39.901123046875GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.392 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.392 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.441 2 DEBUG oslo_concurrency.lockutils [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-a39db459-001a-467e-8721-1dca3120f5ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.442 2 DEBUG oslo_concurrency.lockutils [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-a39db459-001a-467e-8721-1dca3120f5ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.442 2 DEBUG nova.network.neutron [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:29:41 compute-1 sudo[286830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:29:41 compute-1 sudo[286830]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:29:41 compute-1 sudo[286830]: pam_unix(sudo:session): session closed for user root
Sep 30 18:29:41 compute-1 sshd-session[286826]: Invalid user work from 84.51.43.58 port 44391
Sep 30 18:29:41 compute-1 sshd-session[286826]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:29:41 compute-1 sshd-session[286826]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58
Sep 30 18:29:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:41 compute-1 unix_chkpwd[286855]: password check failed for user (root)
Sep 30 18:29:41 compute-1 sshd-session[286801]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:29:41 compute-1 nova_compute[238822]: 2025-09-30 18:29:41.948 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:42 compute-1 nova_compute[238822]: 2025-09-30 18:29:42.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:42 compute-1 nova_compute[238822]: 2025-09-30 18:29:42.415 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Migration for instance a39db459-001a-467e-8721-1dca3120f5ee refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 18:29:42 compute-1 ceph-mon[75484]: pgmap v1531: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 4.3 KiB/s wr, 6 op/s
Sep 30 18:29:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:29:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:29:42 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3269182293' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:29:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:42.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:42 compute-1 nova_compute[238822]: 2025-09-30 18:29:42.927 2 INFO nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Updating resource usage from migration 091476e8-6365-4164-85ee-66b48227aab9
Sep 30 18:29:42 compute-1 nova_compute[238822]: 2025-09-30 18:29:42.927 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Starting to track incoming migration 091476e8-6365-4164-85ee-66b48227aab9 with flavor c83dc7f1-0795-47db-adcb-fb90be11684a _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 18:29:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:43.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:43 compute-1 nova_compute[238822]: 2025-09-30 18:29:43.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:43 compute-1 nova_compute[238822]: 2025-09-30 18:29:43.403 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:43 compute-1 nova_compute[238822]: 2025-09-30 18:29:43.475 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 01bf4ef9-56ec-4065-aa2b-416af7c5f636 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:29:43 compute-1 nova_compute[238822]: 2025-09-30 18:29:43.560 2 DEBUG nova.network.neutron [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Updating instance_info_cache with network_info: [{"id": "4cd1879d-b7f9-410d-8517-ebcb79e59e3c", "address": "fa:16:3e:4e:9a:24", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd1879d-b7", "ovs_interfaceid": "4cd1879d-b7f9-410d-8517-ebcb79e59e3c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:29:43 compute-1 podman[286859]: 2025-09-30 18:29:43.595014405 +0000 UTC m=+0.117696043 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:29:43 compute-1 podman[286858]: 2025-09-30 18:29:43.622159319 +0000 UTC m=+0.144718313 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Sep 30 18:29:43 compute-1 sshd-session[286826]: Failed password for invalid user work from 84.51.43.58 port 44391 ssh2
Sep 30 18:29:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:29:43 compute-1 nova_compute[238822]: 2025-09-30 18:29:43.983 2 WARNING nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance a39db459-001a-467e-8721-1dca3120f5ee has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 18:29:43 compute-1 nova_compute[238822]: 2025-09-30 18:29:43.984 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:29:43 compute-1 nova_compute[238822]: 2025-09-30 18:29:43.984 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=39GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:29:41 up  4:07,  0 user,  load average: 0.64, 0.47, 0.71\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_c634e1c17ed54907969576a0eb8eff50': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:29:44 compute-1 sshd-session[286801]: Failed password for root from 192.210.160.141 port 56086 ssh2
Sep 30 18:29:44 compute-1 nova_compute[238822]: 2025-09-30 18:29:44.028 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:29:44 compute-1 nova_compute[238822]: 2025-09-30 18:29:44.069 2 DEBUG oslo_concurrency.lockutils [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-a39db459-001a-467e-8721-1dca3120f5ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:29:44 compute-1 ceph-mon[75484]: pgmap v1532: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 3.3 KiB/s wr, 6 op/s
Sep 30 18:29:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3575853102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:29:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:29:44 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4220618568' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:29:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:44.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:44 compute-1 nova_compute[238822]: 2025-09-30 18:29:44.514 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:29:44 compute-1 nova_compute[238822]: 2025-09-30 18:29:44.523 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:29:44 compute-1 nova_compute[238822]: 2025-09-30 18:29:44.598 2 DEBUG oslo_concurrency.lockutils [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:44 compute-1 sshd-session[286801]: Connection closed by authenticating user root 192.210.160.141 port 56086 [preauth]
Sep 30 18:29:45 compute-1 nova_compute[238822]: 2025-09-30 18:29:45.032 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:29:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:45.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:45 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4220618568' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:29:45 compute-1 nova_compute[238822]: 2025-09-30 18:29:45.546 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:29:45 compute-1 nova_compute[238822]: 2025-09-30 18:29:45.547 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.155s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:45 compute-1 nova_compute[238822]: 2025-09-30 18:29:45.547 2 DEBUG oslo_concurrency.lockutils [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.949s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:45 compute-1 nova_compute[238822]: 2025-09-30 18:29:45.548 2 DEBUG oslo_concurrency.lockutils [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:45 compute-1 nova_compute[238822]: 2025-09-30 18:29:45.554 2 INFO nova.virt.libvirt.driver [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:29:45 compute-1 virtqemud[239124]: Domain id=16 name='instance-00000014' uuid=a39db459-001a-467e-8721-1dca3120f5ee is tainted: custom-monitor
Sep 30 18:29:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:46 compute-1 sshd-session[286826]: Received disconnect from 84.51.43.58 port 44391:11: Bye Bye [preauth]
Sep 30 18:29:46 compute-1 sshd-session[286826]: Disconnected from invalid user work 84.51.43.58 port 44391 [preauth]
Sep 30 18:29:46 compute-1 ceph-mon[75484]: pgmap v1533: 353 pgs: 353 active+clean; 200 MiB data, 362 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 4.3 KiB/s wr, 6 op/s
Sep 30 18:29:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:46.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:46 compute-1 nova_compute[238822]: 2025-09-30 18:29:46.548 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:29:46 compute-1 nova_compute[238822]: 2025-09-30 18:29:46.549 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:29:46 compute-1 nova_compute[238822]: 2025-09-30 18:29:46.564 2 INFO nova.virt.libvirt.driver [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:29:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:47 compute-1 nova_compute[238822]: 2025-09-30 18:29:47.071 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:29:47 compute-1 nova_compute[238822]: 2025-09-30 18:29:47.072 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:29:47 compute-1 nova_compute[238822]: 2025-09-30 18:29:47.072 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:29:47 compute-1 nova_compute[238822]: 2025-09-30 18:29:47.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:47.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:47 compute-1 nova_compute[238822]: 2025-09-30 18:29:47.571 2 INFO nova.virt.libvirt.driver [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:29:47 compute-1 nova_compute[238822]: 2025-09-30 18:29:47.578 2 DEBUG nova.compute.manager [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:29:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:48 compute-1 nova_compute[238822]: 2025-09-30 18:29:48.090 2 DEBUG nova.objects.instance [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:29:48 compute-1 nova_compute[238822]: 2025-09-30 18:29:48.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:48 compute-1 ceph-mon[75484]: pgmap v1534: 353 pgs: 353 active+clean; 200 MiB data, 362 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 1023 B/s wr, 5 op/s
Sep 30 18:29:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:48.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:29:49 compute-1 nova_compute[238822]: 2025-09-30 18:29:49.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:29:49 compute-1 nova_compute[238822]: 2025-09-30 18:29:49.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:29:49 compute-1 nova_compute[238822]: 2025-09-30 18:29:49.113 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:49 compute-1 nova_compute[238822]: 2025-09-30 18:29:49.259 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:49 compute-1 nova_compute[238822]: 2025-09-30 18:29:49.260 2 WARNING neutronclient.v2_0.client [None req-932e932d-fdac-471f-b447-67e8c697892d 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:49.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:49 compute-1 openstack_network_exporter[251957]: ERROR   18:29:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:29:49 compute-1 openstack_network_exporter[251957]: ERROR   18:29:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:29:49 compute-1 openstack_network_exporter[251957]: ERROR   18:29:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:29:49 compute-1 openstack_network_exporter[251957]: ERROR   18:29:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:29:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:29:49 compute-1 openstack_network_exporter[251957]: ERROR   18:29:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:29:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:29:49 compute-1 podman[286938]: 2025-09-30 18:29:49.54146387 +0000 UTC m=+0.077974599 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Sep 30 18:29:49 compute-1 sshd-session[286935]: Invalid user jayden from 175.126.165.170 port 45784
Sep 30 18:29:49 compute-1 sshd-session[286935]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:29:49 compute-1 sshd-session[286935]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=175.126.165.170
Sep 30 18:29:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:50 compute-1 ceph-mon[75484]: pgmap v1535: 353 pgs: 353 active+clean; 200 MiB data, 362 MiB used, 40 GiB / 40 GiB avail; 5.1 KiB/s rd, 1023 B/s wr, 6 op/s
Sep 30 18:29:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:50.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:51 compute-1 sshd-session[286935]: Failed password for invalid user jayden from 175.126.165.170 port 45784 ssh2
Sep 30 18:29:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:51.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:51 compute-1 ceph-mon[75484]: pgmap v1536: 353 pgs: 353 active+clean; 200 MiB data, 362 MiB used, 40 GiB / 40 GiB avail; 4.8 KiB/s rd, 9.1 KiB/s wr, 6 op/s
Sep 30 18:29:51 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2085539696' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:29:51 compute-1 nova_compute[238822]: 2025-09-30 18:29:51.527 2 DEBUG oslo_concurrency.lockutils [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:51 compute-1 nova_compute[238822]: 2025-09-30 18:29:51.528 2 DEBUG oslo_concurrency.lockutils [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:51 compute-1 nova_compute[238822]: 2025-09-30 18:29:51.529 2 DEBUG oslo_concurrency.lockutils [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:51 compute-1 nova_compute[238822]: 2025-09-30 18:29:51.529 2 DEBUG oslo_concurrency.lockutils [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:51 compute-1 nova_compute[238822]: 2025-09-30 18:29:51.530 2 DEBUG oslo_concurrency.lockutils [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:51 compute-1 nova_compute[238822]: 2025-09-30 18:29:51.545 2 INFO nova.compute.manager [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Terminating instance
Sep 30 18:29:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.064 2 DEBUG nova.compute.manager [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:29:52 compute-1 kernel: tap23c72539-ad (unregistering): left promiscuous mode
Sep 30 18:29:52 compute-1 NetworkManager[45549]: <info>  [1759256992.1175] device (tap23c72539-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:29:52 compute-1 ovn_controller[135204]: 2025-09-30T18:29:52Z|00183|binding|INFO|Releasing lport 23c72539-ad46-4b4c-9724-c1e61705efc1 from this chassis (sb_readonly=0)
Sep 30 18:29:52 compute-1 ovn_controller[135204]: 2025-09-30T18:29:52Z|00184|binding|INFO|Setting lport 23c72539-ad46-4b4c-9724-c1e61705efc1 down in Southbound
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 ovn_controller[135204]: 2025-09-30T18:29:52Z|00185|binding|INFO|Removing iface tap23c72539-ad ovn-installed in OVS
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.140 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:52:b2 10.100.0.6'], port_security=['fa:16:3e:d9:52:b2 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '01bf4ef9-56ec-4065-aa2b-416af7c5f636', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '5', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=23c72539-ad46-4b4c-9724-c1e61705efc1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.141 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 23c72539-ad46-4b4c-9724-c1e61705efc1 in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.141 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.155 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e9358e81-6121-441e-b4fc-0e500ae4da82]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:52 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Deactivated successfully.
Sep 30 18:29:52 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Consumed 14.837s CPU time.
Sep 30 18:29:52 compute-1 systemd-machined[195911]: Machine qemu-15-instance-00000015 terminated.
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.201 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[ad54fa31-4645-40b1-bf7c-2d34d9654328]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.205 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[361ce116-e2e0-40b4-aa6e-a957980e5925]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.233 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[74f02eb6-ba82-4f7b-bcab-9d3980fdca38]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.251 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[88a0cbd2-c3d9-4ada-97d7-620e16724a53]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1478885, 'reachable_time': 32666, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286972, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.269 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[fc2010c5-c618-4b8d-abca-2deb96a68574]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1478898, 'tstamp': 1478898}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286973, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1478901, 'tstamp': 1478901}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286973, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.270 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.278 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.278 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.278 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.279 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:29:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:52.280 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0ba373a4-1399-4c1d-a0e4-f8c34e42a7a2]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.281 2 DEBUG nova.compute.manager [req-c80b6cf4-d0da-4147-a6d7-a6af2104d055 req-16c06f65-806b-4901-bd41-001e0906381a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Received event network-vif-unplugged-23c72539-ad46-4b4c-9724-c1e61705efc1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.282 2 DEBUG oslo_concurrency.lockutils [req-c80b6cf4-d0da-4147-a6d7-a6af2104d055 req-16c06f65-806b-4901-bd41-001e0906381a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.282 2 DEBUG oslo_concurrency.lockutils [req-c80b6cf4-d0da-4147-a6d7-a6af2104d055 req-16c06f65-806b-4901-bd41-001e0906381a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.282 2 DEBUG oslo_concurrency.lockutils [req-c80b6cf4-d0da-4147-a6d7-a6af2104d055 req-16c06f65-806b-4901-bd41-001e0906381a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.282 2 DEBUG nova.compute.manager [req-c80b6cf4-d0da-4147-a6d7-a6af2104d055 req-16c06f65-806b-4901-bd41-001e0906381a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] No waiting events found dispatching network-vif-unplugged-23c72539-ad46-4b4c-9724-c1e61705efc1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.282 2 DEBUG nova.compute.manager [req-c80b6cf4-d0da-4147-a6d7-a6af2104d055 req-16c06f65-806b-4901-bd41-001e0906381a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Received event network-vif-unplugged-23c72539-ad46-4b4c-9724-c1e61705efc1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.295 2 INFO nova.virt.libvirt.driver [-] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Instance destroyed successfully.
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.295 2 DEBUG nova.objects.instance [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lazy-loading 'resources' on Instance uuid 01bf4ef9-56ec-4065-aa2b-416af7c5f636 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 sshd-session[286935]: Received disconnect from 175.126.165.170 port 45784:11: Bye Bye [preauth]
Sep 30 18:29:52 compute-1 sshd-session[286935]: Disconnected from invalid user jayden 175.126.165.170 port 45784 [preauth]
Sep 30 18:29:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:29:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:52.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:29:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.803 2 DEBUG nova.virt.libvirt.vif [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:28:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1703329817',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1703329817',id=21,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:29:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-96hm8nqk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram
='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:29:09Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=01bf4ef9-56ec-4065-aa2b-416af7c5f636,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "23c72539-ad46-4b4c-9724-c1e61705efc1", "address": "fa:16:3e:d9:52:b2", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c72539-ad", "ovs_interfaceid": "23c72539-ad46-4b4c-9724-c1e61705efc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.804 2 DEBUG nova.network.os_vif_util [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "23c72539-ad46-4b4c-9724-c1e61705efc1", "address": "fa:16:3e:d9:52:b2", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c72539-ad", "ovs_interfaceid": "23c72539-ad46-4b4c-9724-c1e61705efc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.805 2 DEBUG nova.network.os_vif_util [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:52:b2,bridge_name='br-int',has_traffic_filtering=True,id=23c72539-ad46-4b4c-9724-c1e61705efc1,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23c72539-ad') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.805 2 DEBUG os_vif [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:52:b2,bridge_name='br-int',has_traffic_filtering=True,id=23c72539-ad46-4b4c-9724-c1e61705efc1,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23c72539-ad') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.808 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23c72539-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.814 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=53a8b1c7-30b0-4638-8dc4-20318a223e49) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:52 compute-1 nova_compute[238822]: 2025-09-30 18:29:52.819 2 INFO os_vif [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:52:b2,bridge_name='br-int',has_traffic_filtering=True,id=23c72539-ad46-4b4c-9724-c1e61705efc1,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23c72539-ad')
Sep 30 18:29:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:53 compute-1 nova_compute[238822]: 2025-09-30 18:29:53.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:29:53 compute-1 nova_compute[238822]: 2025-09-30 18:29:53.308 2 INFO nova.virt.libvirt.driver [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Deleting instance files /var/lib/nova/instances/01bf4ef9-56ec-4065-aa2b-416af7c5f636_del
Sep 30 18:29:53 compute-1 nova_compute[238822]: 2025-09-30 18:29:53.309 2 INFO nova.virt.libvirt.driver [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Deletion of /var/lib/nova/instances/01bf4ef9-56ec-4065-aa2b-416af7c5f636_del complete
Sep 30 18:29:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:53.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:53 compute-1 nova_compute[238822]: 2025-09-30 18:29:53.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:53 compute-1 ceph-mon[75484]: pgmap v1537: 353 pgs: 353 active+clean; 200 MiB data, 362 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 9.1 KiB/s wr, 2 op/s
Sep 30 18:29:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3629675832' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:29:53 compute-1 nova_compute[238822]: 2025-09-30 18:29:53.824 2 INFO nova.compute.manager [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Took 1.76 seconds to destroy the instance on the hypervisor.
Sep 30 18:29:53 compute-1 nova_compute[238822]: 2025-09-30 18:29:53.825 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:29:53 compute-1 nova_compute[238822]: 2025-09-30 18:29:53.826 2 DEBUG nova.compute.manager [-] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:29:53 compute-1 nova_compute[238822]: 2025-09-30 18:29:53.826 2 DEBUG nova.network.neutron [-] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:29:53 compute-1 nova_compute[238822]: 2025-09-30 18:29:53.827 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:53 compute-1 nova_compute[238822]: 2025-09-30 18:29:53.944 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:29:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:29:54 compute-1 nova_compute[238822]: 2025-09-30 18:29:54.318 2 DEBUG nova.compute.manager [req-7583b7a6-5280-4b38-b4e1-b63396e307ce req-4345a2ac-312e-4bc3-a65d-5c382e8918de 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Received event network-vif-deleted-23c72539-ad46-4b4c-9724-c1e61705efc1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:29:54 compute-1 nova_compute[238822]: 2025-09-30 18:29:54.319 2 INFO nova.compute.manager [req-7583b7a6-5280-4b38-b4e1-b63396e307ce req-4345a2ac-312e-4bc3-a65d-5c382e8918de 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Neutron deleted interface 23c72539-ad46-4b4c-9724-c1e61705efc1; detaching it from the instance and deleting it from the info cache
Sep 30 18:29:54 compute-1 nova_compute[238822]: 2025-09-30 18:29:54.320 2 DEBUG nova.network.neutron [req-7583b7a6-5280-4b38-b4e1-b63396e307ce req-4345a2ac-312e-4bc3-a65d-5c382e8918de 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:29:54 compute-1 nova_compute[238822]: 2025-09-30 18:29:54.349 2 DEBUG nova.compute.manager [req-48133464-3cd0-44ab-866e-9e08cf3a9f16 req-13ebee57-f498-4670-bcd7-6519b9602b5d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Received event network-vif-unplugged-23c72539-ad46-4b4c-9724-c1e61705efc1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:29:54 compute-1 nova_compute[238822]: 2025-09-30 18:29:54.350 2 DEBUG oslo_concurrency.lockutils [req-48133464-3cd0-44ab-866e-9e08cf3a9f16 req-13ebee57-f498-4670-bcd7-6519b9602b5d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:54 compute-1 nova_compute[238822]: 2025-09-30 18:29:54.350 2 DEBUG oslo_concurrency.lockutils [req-48133464-3cd0-44ab-866e-9e08cf3a9f16 req-13ebee57-f498-4670-bcd7-6519b9602b5d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:54 compute-1 nova_compute[238822]: 2025-09-30 18:29:54.351 2 DEBUG oslo_concurrency.lockutils [req-48133464-3cd0-44ab-866e-9e08cf3a9f16 req-13ebee57-f498-4670-bcd7-6519b9602b5d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:54 compute-1 nova_compute[238822]: 2025-09-30 18:29:54.351 2 DEBUG nova.compute.manager [req-48133464-3cd0-44ab-866e-9e08cf3a9f16 req-13ebee57-f498-4670-bcd7-6519b9602b5d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] No waiting events found dispatching network-vif-unplugged-23c72539-ad46-4b4c-9724-c1e61705efc1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:29:54 compute-1 nova_compute[238822]: 2025-09-30 18:29:54.352 2 DEBUG nova.compute.manager [req-48133464-3cd0-44ab-866e-9e08cf3a9f16 req-13ebee57-f498-4670-bcd7-6519b9602b5d 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Received event network-vif-unplugged-23c72539-ad46-4b4c-9724-c1e61705efc1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:29:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:54.390 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:54.390 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:54.390 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:54.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:54 compute-1 nova_compute[238822]: 2025-09-30 18:29:54.577 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:29:54 compute-1 nova_compute[238822]: 2025-09-30 18:29:54.577 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 18:29:54 compute-1 nova_compute[238822]: 2025-09-30 18:29:54.724 2 DEBUG nova.network.neutron [-] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:29:54 compute-1 nova_compute[238822]: 2025-09-30 18:29:54.828 2 DEBUG nova.compute.manager [req-7583b7a6-5280-4b38-b4e1-b63396e307ce req-4345a2ac-312e-4bc3-a65d-5c382e8918de 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Detach interface failed, port_id=23c72539-ad46-4b4c-9724-c1e61705efc1, reason: Instance 01bf4ef9-56ec-4065-aa2b-416af7c5f636 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:29:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:55 compute-1 nova_compute[238822]: 2025-09-30 18:29:55.232 2 INFO nova.compute.manager [-] [instance: 01bf4ef9-56ec-4065-aa2b-416af7c5f636] Took 1.41 seconds to deallocate network for instance.
Sep 30 18:29:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:55.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:55 compute-1 sudo[287020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:29:55 compute-1 podman[287008]: 2025-09-30 18:29:55.560984871 +0000 UTC m=+0.099486680 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 18:29:55 compute-1 sudo[287020]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:29:55 compute-1 podman[287010]: 2025-09-30 18:29:55.56573358 +0000 UTC m=+0.097331972 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 18:29:55 compute-1 sudo[287020]: pam_unix(sudo:session): session closed for user root
Sep 30 18:29:55 compute-1 podman[287009]: 2025-09-30 18:29:55.591242619 +0000 UTC m=+0.127585369 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, version=9.6, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., release=1755695350, io.openshift.expose-services=)
Sep 30 18:29:55 compute-1 ceph-mon[75484]: pgmap v1538: 353 pgs: 353 active+clean; 121 MiB data, 328 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 13 KiB/s wr, 30 op/s
Sep 30 18:29:55 compute-1 nova_compute[238822]: 2025-09-30 18:29:55.760 2 DEBUG oslo_concurrency.lockutils [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:55 compute-1 nova_compute[238822]: 2025-09-30 18:29:55.761 2 DEBUG oslo_concurrency.lockutils [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:55 compute-1 nova_compute[238822]: 2025-09-30 18:29:55.832 2 DEBUG oslo_concurrency.processutils [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:29:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:56 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:29:56 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/823296709' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:29:56 compute-1 nova_compute[238822]: 2025-09-30 18:29:56.274 2 DEBUG oslo_concurrency.processutils [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:29:56 compute-1 nova_compute[238822]: 2025-09-30 18:29:56.283 2 DEBUG nova.compute.provider_tree [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:29:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:56.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:56 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/823296709' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:29:56 compute-1 nova_compute[238822]: 2025-09-30 18:29:56.793 2 DEBUG nova.scheduler.client.report [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:29:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:57 compute-1 nova_compute[238822]: 2025-09-30 18:29:57.305 2 DEBUG oslo_concurrency.lockutils [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.544s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:57 compute-1 nova_compute[238822]: 2025-09-30 18:29:57.333 2 INFO nova.scheduler.client.report [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Deleted allocations for instance 01bf4ef9-56ec-4065-aa2b-416af7c5f636
Sep 30 18:29:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:57.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:57 compute-1 ceph-mon[75484]: pgmap v1539: 353 pgs: 353 active+clean; 121 MiB data, 328 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 12 KiB/s wr, 30 op/s
Sep 30 18:29:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3006712251' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:29:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3006712251' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:29:57 compute-1 nova_compute[238822]: 2025-09-30 18:29:57.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:58 compute-1 nova_compute[238822]: 2025-09-30 18:29:58.369 2 DEBUG oslo_concurrency.lockutils [None req-0ffc1a90-041b-49a9-9e21-3c3ef3496fab 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "01bf4ef9-56ec-4065-aa2b-416af7c5f636" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.840s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:58 compute-1 nova_compute[238822]: 2025-09-30 18:29:58.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:29:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:29:58.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:29:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:29:59 compute-1 nova_compute[238822]: 2025-09-30 18:29:59.093 2 DEBUG oslo_concurrency.lockutils [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "a39db459-001a-467e-8721-1dca3120f5ee" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:59 compute-1 nova_compute[238822]: 2025-09-30 18:29:59.094 2 DEBUG oslo_concurrency.lockutils [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "a39db459-001a-467e-8721-1dca3120f5ee" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:59 compute-1 nova_compute[238822]: 2025-09-30 18:29:59.094 2 DEBUG oslo_concurrency.lockutils [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "a39db459-001a-467e-8721-1dca3120f5ee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:29:59 compute-1 nova_compute[238822]: 2025-09-30 18:29:59.094 2 DEBUG oslo_concurrency.lockutils [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "a39db459-001a-467e-8721-1dca3120f5ee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:29:59 compute-1 nova_compute[238822]: 2025-09-30 18:29:59.095 2 DEBUG oslo_concurrency.lockutils [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "a39db459-001a-467e-8721-1dca3120f5ee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:29:59 compute-1 nova_compute[238822]: 2025-09-30 18:29:59.113 2 INFO nova.compute.manager [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Terminating instance
Sep 30 18:29:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:29:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:29:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:29:59.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:29:59 compute-1 nova_compute[238822]: 2025-09-30 18:29:59.634 2 DEBUG nova.compute.manager [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:29:59 compute-1 ceph-mon[75484]: pgmap v1540: 353 pgs: 353 active+clean; 121 MiB data, 328 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 12 KiB/s wr, 30 op/s
Sep 30 18:29:59 compute-1 kernel: tap4cd1879d-b7 (unregistering): left promiscuous mode
Sep 30 18:29:59 compute-1 NetworkManager[45549]: <info>  [1759256999.6979] device (tap4cd1879d-b7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:29:59 compute-1 ovn_controller[135204]: 2025-09-30T18:29:59Z|00186|binding|INFO|Releasing lport 4cd1879d-b7f9-410d-8517-ebcb79e59e3c from this chassis (sb_readonly=0)
Sep 30 18:29:59 compute-1 ovn_controller[135204]: 2025-09-30T18:29:59Z|00187|binding|INFO|Setting lport 4cd1879d-b7f9-410d-8517-ebcb79e59e3c down in Southbound
Sep 30 18:29:59 compute-1 ovn_controller[135204]: 2025-09-30T18:29:59Z|00188|binding|INFO|Removing iface tap4cd1879d-b7 ovn-installed in OVS
Sep 30 18:29:59 compute-1 nova_compute[238822]: 2025-09-30 18:29:59.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:59 compute-1 nova_compute[238822]: 2025-09-30 18:29:59.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:59.724 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:9a:24 10.100.0.8'], port_security=['fa:16:3e:4e:9a:24 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a39db459-001a-467e-8721-1dca3120f5ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '15', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=4cd1879d-b7f9-410d-8517-ebcb79e59e3c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:29:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:59.725 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 4cd1879d-b7f9-410d-8517-ebcb79e59e3c in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:29:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:59.727 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6901f664-336b-42d2-bbf7-58951befc8d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:29:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:59.728 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[856e2a49-e348-49c3-a5d5-55e2578585c7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:29:59 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:29:59.729 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 namespace which is not needed anymore
Sep 30 18:29:59 compute-1 nova_compute[238822]: 2025-09-30 18:29:59.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:29:59 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000014.scope: Deactivated successfully.
Sep 30 18:29:59 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000014.scope: Consumed 2.861s CPU time.
Sep 30 18:29:59 compute-1 systemd-machined[195911]: Machine qemu-16-instance-00000014 terminated.
Sep 30 18:29:59 compute-1 nova_compute[238822]: 2025-09-30 18:29:59.879 2 INFO nova.virt.libvirt.driver [-] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Instance destroyed successfully.
Sep 30 18:29:59 compute-1 nova_compute[238822]: 2025-09-30 18:29:59.879 2 DEBUG nova.objects.instance [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lazy-loading 'resources' on Instance uuid a39db459-001a-467e-8721-1dca3120f5ee obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:29:59 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[286411]: [NOTICE]   (286415) : haproxy version is 3.0.5-8e879a5
Sep 30 18:29:59 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[286411]: [NOTICE]   (286415) : path to executable is /usr/sbin/haproxy
Sep 30 18:29:59 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[286411]: [WARNING]  (286415) : Exiting Master process...
Sep 30 18:29:59 compute-1 podman[287144]: 2025-09-30 18:29:59.897238582 +0000 UTC m=+0.054500374 container kill 022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930)
Sep 30 18:29:59 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[286411]: [ALERT]    (286415) : Current worker (286418) exited with code 143 (Terminated)
Sep 30 18:29:59 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[286411]: [WARNING]  (286415) : All workers exited. Exiting... (0)
Sep 30 18:29:59 compute-1 systemd[1]: libpod-022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe.scope: Deactivated successfully.
Sep 30 18:29:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:29:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:29:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:29:59 compute-1 podman[287168]: 2025-09-30 18:29:59.967663386 +0000 UTC m=+0.042343746 container died 022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:30:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-750b3dd671d1e6dd98df0972e63df2f169433eacd5e8e2001b136b0fa5ba52ad-merged.mount: Deactivated successfully.
Sep 30 18:30:00 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe-userdata-shm.mount: Deactivated successfully.
Sep 30 18:30:00 compute-1 podman[287168]: 2025-09-30 18:30:00.012723894 +0000 UTC m=+0.087404214 container cleanup 022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250930)
Sep 30 18:30:00 compute-1 systemd[1]: libpod-conmon-022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe.scope: Deactivated successfully.
Sep 30 18:30:00 compute-1 podman[287170]: 2025-09-30 18:30:00.041667586 +0000 UTC m=+0.102398719 container remove 022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 18:30:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:00.048 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4b53269d-7fc9-4e34-8739-d3a2d4b0c98e]: (4, ("Tue Sep 30 06:29:59 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 (022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe)\n022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe\nTue Sep 30 06:29:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 (022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe)\n022d74a6b17fb7fbcae59665c34af67174b048140af2f96c4b3c0d4fc9edf9fe\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:00.049 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a3f109a7-9d94-43b2-9622-f1a040f563ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:00.050 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:30:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:00.050 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[71895746-d05d-4a73-9362-b36a657bebee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:00.051 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:00 compute-1 kernel: tap6901f664-30: left promiscuous mode
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:00.089 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f6b6833b-2dc5-40ad-939d-a17776068007]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:00.122 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[85e86a8b-13ee-4dc4-8f66-be73e4499d14]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:00.123 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ab374509-1d6d-42e5-a186-b02fa5e6cace]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:00.144 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f04684-f40b-42b1-8116-8313e4fc8df3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1478875, 'reachable_time': 32131, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287203, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:00 compute-1 systemd[1]: run-netns-ovnmeta\x2d6901f664\x2d336b\x2d42d2\x2dbbf7\x2d58951befc8d1.mount: Deactivated successfully.
Sep 30 18:30:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:00.147 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:30:00 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:00.148 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[4b057f2b-d26c-464b-9c37-d1454d68b1a0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.391 2 DEBUG nova.virt.libvirt.vif [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:28:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1579591444',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1579591444',id=20,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:28:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-ad15oqdw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:29:48Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=a39db459-001a-467e-8721-1dca3120f5ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4cd1879d-b7f9-410d-8517-ebcb79e59e3c", "address": "fa:16:3e:4e:9a:24", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd1879d-b7", "ovs_interfaceid": "4cd1879d-b7f9-410d-8517-ebcb79e59e3c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.391 2 DEBUG nova.network.os_vif_util [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "4cd1879d-b7f9-410d-8517-ebcb79e59e3c", "address": "fa:16:3e:4e:9a:24", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd1879d-b7", "ovs_interfaceid": "4cd1879d-b7f9-410d-8517-ebcb79e59e3c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.392 2 DEBUG nova.network.os_vif_util [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:9a:24,bridge_name='br-int',has_traffic_filtering=True,id=4cd1879d-b7f9-410d-8517-ebcb79e59e3c,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4cd1879d-b7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.393 2 DEBUG os_vif [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:9a:24,bridge_name='br-int',has_traffic_filtering=True,id=4cd1879d-b7f9-410d-8517-ebcb79e59e3c,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4cd1879d-b7') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.395 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4cd1879d-b7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.401 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=bdae8e1e-9f4c-4d4f-8d4e-5bd4507b8d81) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.406 2 INFO os_vif [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:9a:24,bridge_name='br-int',has_traffic_filtering=True,id=4cd1879d-b7f9-410d-8517-ebcb79e59e3c,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4cd1879d-b7')
Sep 30 18:30:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:00.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.552 2 DEBUG nova.compute.manager [req-f7a3336e-c959-454c-bf0b-f14ab02fbfc6 req-c62feeb3-97dc-45a6-8bea-7577eb5f2d2a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Received event network-vif-unplugged-4cd1879d-b7f9-410d-8517-ebcb79e59e3c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.552 2 DEBUG oslo_concurrency.lockutils [req-f7a3336e-c959-454c-bf0b-f14ab02fbfc6 req-c62feeb3-97dc-45a6-8bea-7577eb5f2d2a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "a39db459-001a-467e-8721-1dca3120f5ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.553 2 DEBUG oslo_concurrency.lockutils [req-f7a3336e-c959-454c-bf0b-f14ab02fbfc6 req-c62feeb3-97dc-45a6-8bea-7577eb5f2d2a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "a39db459-001a-467e-8721-1dca3120f5ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.553 2 DEBUG oslo_concurrency.lockutils [req-f7a3336e-c959-454c-bf0b-f14ab02fbfc6 req-c62feeb3-97dc-45a6-8bea-7577eb5f2d2a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "a39db459-001a-467e-8721-1dca3120f5ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.553 2 DEBUG nova.compute.manager [req-f7a3336e-c959-454c-bf0b-f14ab02fbfc6 req-c62feeb3-97dc-45a6-8bea-7577eb5f2d2a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] No waiting events found dispatching network-vif-unplugged-4cd1879d-b7f9-410d-8517-ebcb79e59e3c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.554 2 DEBUG nova.compute.manager [req-f7a3336e-c959-454c-bf0b-f14ab02fbfc6 req-c62feeb3-97dc-45a6-8bea-7577eb5f2d2a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Received event network-vif-unplugged-4cd1879d-b7f9-410d-8517-ebcb79e59e3c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:30:00 compute-1 ceph-mon[75484]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Sep 30 18:30:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.925 2 INFO nova.virt.libvirt.driver [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Deleting instance files /var/lib/nova/instances/a39db459-001a-467e-8721-1dca3120f5ee_del
Sep 30 18:30:00 compute-1 nova_compute[238822]: 2025-09-30 18:30:00.926 2 INFO nova.virt.libvirt.driver [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Deletion of /var/lib/nova/instances/a39db459-001a-467e-8721-1dca3120f5ee_del complete
Sep 30 18:30:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:01.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:01 compute-1 nova_compute[238822]: 2025-09-30 18:30:01.440 2 INFO nova.compute.manager [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Took 1.81 seconds to destroy the instance on the hypervisor.
Sep 30 18:30:01 compute-1 nova_compute[238822]: 2025-09-30 18:30:01.440 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:30:01 compute-1 nova_compute[238822]: 2025-09-30 18:30:01.441 2 DEBUG nova.compute.manager [-] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:30:01 compute-1 nova_compute[238822]: 2025-09-30 18:30:01.441 2 DEBUG nova.network.neutron [-] [instance: a39db459-001a-467e-8721-1dca3120f5ee] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:30:01 compute-1 nova_compute[238822]: 2025-09-30 18:30:01.441 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:30:01 compute-1 nova_compute[238822]: 2025-09-30 18:30:01.524 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:30:01 compute-1 ceph-mon[75484]: pgmap v1541: 353 pgs: 353 active+clean; 121 MiB data, 328 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 12 KiB/s wr, 30 op/s
Sep 30 18:30:01 compute-1 nova_compute[238822]: 2025-09-30 18:30:01.831 2 DEBUG nova.compute.manager [req-6d19c6c2-2d58-4f10-88ca-3350c422d4fc req-e57f3997-d5f0-41a2-8717-e8c1d57532f2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Received event network-vif-deleted-4cd1879d-b7f9-410d-8517-ebcb79e59e3c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:30:01 compute-1 nova_compute[238822]: 2025-09-30 18:30:01.832 2 INFO nova.compute.manager [req-6d19c6c2-2d58-4f10-88ca-3350c422d4fc req-e57f3997-d5f0-41a2-8717-e8c1d57532f2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Neutron deleted interface 4cd1879d-b7f9-410d-8517-ebcb79e59e3c; detaching it from the instance and deleting it from the info cache
Sep 30 18:30:01 compute-1 nova_compute[238822]: 2025-09-30 18:30:01.832 2 DEBUG nova.network.neutron [req-6d19c6c2-2d58-4f10-88ca-3350c422d4fc req-e57f3997-d5f0-41a2-8717-e8c1d57532f2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:30:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:02 compute-1 nova_compute[238822]: 2025-09-30 18:30:02.280 2 DEBUG nova.network.neutron [-] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:30:02 compute-1 nova_compute[238822]: 2025-09-30 18:30:02.340 2 DEBUG nova.compute.manager [req-6d19c6c2-2d58-4f10-88ca-3350c422d4fc req-e57f3997-d5f0-41a2-8717-e8c1d57532f2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Detach interface failed, port_id=4cd1879d-b7f9-410d-8517-ebcb79e59e3c, reason: Instance a39db459-001a-467e-8721-1dca3120f5ee could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:30:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:02.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:02 compute-1 nova_compute[238822]: 2025-09-30 18:30:02.618 2 DEBUG nova.compute.manager [req-f262dacb-5ef2-4f98-b98b-2140bb953542 req-71227a7f-fb38-4dd0-8b1a-1edcaddc2eda 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Received event network-vif-unplugged-4cd1879d-b7f9-410d-8517-ebcb79e59e3c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:30:02 compute-1 nova_compute[238822]: 2025-09-30 18:30:02.619 2 DEBUG oslo_concurrency.lockutils [req-f262dacb-5ef2-4f98-b98b-2140bb953542 req-71227a7f-fb38-4dd0-8b1a-1edcaddc2eda 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "a39db459-001a-467e-8721-1dca3120f5ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:02 compute-1 nova_compute[238822]: 2025-09-30 18:30:02.619 2 DEBUG oslo_concurrency.lockutils [req-f262dacb-5ef2-4f98-b98b-2140bb953542 req-71227a7f-fb38-4dd0-8b1a-1edcaddc2eda 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "a39db459-001a-467e-8721-1dca3120f5ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:02 compute-1 nova_compute[238822]: 2025-09-30 18:30:02.619 2 DEBUG oslo_concurrency.lockutils [req-f262dacb-5ef2-4f98-b98b-2140bb953542 req-71227a7f-fb38-4dd0-8b1a-1edcaddc2eda 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "a39db459-001a-467e-8721-1dca3120f5ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:02 compute-1 nova_compute[238822]: 2025-09-30 18:30:02.620 2 DEBUG nova.compute.manager [req-f262dacb-5ef2-4f98-b98b-2140bb953542 req-71227a7f-fb38-4dd0-8b1a-1edcaddc2eda 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] No waiting events found dispatching network-vif-unplugged-4cd1879d-b7f9-410d-8517-ebcb79e59e3c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:30:02 compute-1 nova_compute[238822]: 2025-09-30 18:30:02.620 2 DEBUG nova.compute.manager [req-f262dacb-5ef2-4f98-b98b-2140bb953542 req-71227a7f-fb38-4dd0-8b1a-1edcaddc2eda 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Received event network-vif-unplugged-4cd1879d-b7f9-410d-8517-ebcb79e59e3c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:30:02 compute-1 nova_compute[238822]: 2025-09-30 18:30:02.789 2 INFO nova.compute.manager [-] [instance: a39db459-001a-467e-8721-1dca3120f5ee] Took 1.35 seconds to deallocate network for instance.
Sep 30 18:30:02 compute-1 nova_compute[238822]: 2025-09-30 18:30:02.849 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:30:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:03 compute-1 nova_compute[238822]: 2025-09-30 18:30:03.312 2 DEBUG oslo_concurrency.lockutils [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:03 compute-1 nova_compute[238822]: 2025-09-30 18:30:03.312 2 DEBUG oslo_concurrency.lockutils [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:03 compute-1 nova_compute[238822]: 2025-09-30 18:30:03.318 2 DEBUG oslo_concurrency.lockutils [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:03.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:03 compute-1 nova_compute[238822]: 2025-09-30 18:30:03.368 2 INFO nova.scheduler.client.report [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Deleted allocations for instance a39db459-001a-467e-8721-1dca3120f5ee
Sep 30 18:30:03 compute-1 nova_compute[238822]: 2025-09-30 18:30:03.374 2 WARNING nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Sep 30 18:30:03 compute-1 nova_compute[238822]: 2025-09-30 18:30:03.374 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Triggering sync for uuid a39db459-001a-467e-8721-1dca3120f5ee _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11020
Sep 30 18:30:03 compute-1 nova_compute[238822]: 2025-09-30 18:30:03.375 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "a39db459-001a-467e-8721-1dca3120f5ee" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:03 compute-1 nova_compute[238822]: 2025-09-30 18:30:03.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:03 compute-1 ceph-mon[75484]: pgmap v1542: 353 pgs: 353 active+clean; 121 MiB data, 328 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 3.5 KiB/s wr, 28 op/s
Sep 30 18:30:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:30:04 compute-1 nova_compute[238822]: 2025-09-30 18:30:04.402 2 DEBUG oslo_concurrency.lockutils [None req-dabb00f7-9577-4853-9f3a-21279448ec05 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "a39db459-001a-467e-8721-1dca3120f5ee" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.309s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:04 compute-1 nova_compute[238822]: 2025-09-30 18:30:04.405 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "a39db459-001a-467e-8721-1dca3120f5ee" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 1.029s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:04.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:04 compute-1 sshd-session[287226]: Invalid user seekcy from 14.225.167.110 port 37254
Sep 30 18:30:04 compute-1 sshd-session[287226]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:30:04 compute-1 sshd-session[287226]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:30:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:04 compute-1 nova_compute[238822]: 2025-09-30 18:30:04.915 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "a39db459-001a-467e-8721-1dca3120f5ee" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.510s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:05 compute-1 nova_compute[238822]: 2025-09-30 18:30:05.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:30:05 compute-1 nova_compute[238822]: 2025-09-30 18:30:05.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 18:30:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:05.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:05 compute-1 nova_compute[238822]: 2025-09-30 18:30:05.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:05 compute-1 nova_compute[238822]: 2025-09-30 18:30:05.564 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 18:30:05 compute-1 podman[249638]: time="2025-09-30T18:30:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:30:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:30:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:30:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:30:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8349 "" "Go-http-client/1.1"
Sep 30 18:30:05 compute-1 ceph-mon[75484]: pgmap v1543: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 4.7 KiB/s wr, 56 op/s
Sep 30 18:30:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:06.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:06 compute-1 sshd-session[287226]: Failed password for invalid user seekcy from 14.225.167.110 port 37254 ssh2
Sep 30 18:30:06 compute-1 sshd-session[287226]: Received disconnect from 14.225.167.110 port 37254:11: Bye Bye [preauth]
Sep 30 18:30:06 compute-1 sshd-session[287226]: Disconnected from invalid user seekcy 14.225.167.110 port 37254 [preauth]
Sep 30 18:30:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:07.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:07 compute-1 ceph-mon[75484]: pgmap v1544: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:30:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:30:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:08 compute-1 nova_compute[238822]: 2025-09-30 18:30:08.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:08 compute-1 sshd-session[287235]: Invalid user eder from 8.243.64.201 port 46324
Sep 30 18:30:08 compute-1 sshd-session[287235]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:30:08 compute-1 sshd-session[287235]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:30:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:08.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:08 compute-1 unix_chkpwd[287238]: password check failed for user (root)
Sep 30 18:30:08 compute-1 sshd-session[287233]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:30:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:30:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:09.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:09 compute-1 ceph-mon[75484]: pgmap v1545: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:30:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:10 compute-1 nova_compute[238822]: 2025-09-30 18:30:10.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:10.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:10 compute-1 sshd-session[287235]: Failed password for invalid user eder from 8.243.64.201 port 46324 ssh2
Sep 30 18:30:10 compute-1 sshd-session[287233]: Failed password for root from 192.210.160.141 port 52400 ssh2
Sep 30 18:30:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:11.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:11 compute-1 sshd-session[287235]: Received disconnect from 8.243.64.201 port 46324:11: Bye Bye [preauth]
Sep 30 18:30:11 compute-1 sshd-session[287235]: Disconnected from invalid user eder 8.243.64.201 port 46324 [preauth]
Sep 30 18:30:11 compute-1 ceph-mon[75484]: pgmap v1546: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:30:11 compute-1 sshd-session[287233]: Connection closed by authenticating user root 192.210.160.141 port 52400 [preauth]
Sep 30 18:30:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:12.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:13.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:13 compute-1 nova_compute[238822]: 2025-09-30 18:30:13.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:13 compute-1 ceph-mon[75484]: pgmap v1547: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:30:13 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1619267689' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:30:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:30:14 compute-1 podman[287246]: 2025-09-30 18:30:14.308661076 +0000 UTC m=+0.084147236 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:30:14 compute-1 podman[287245]: 2025-09-30 18:30:14.356409596 +0000 UTC m=+0.137935459 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:30:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:14.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:15.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:15 compute-1 nova_compute[238822]: 2025-09-30 18:30:15.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:15 compute-1 sudo[287296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:30:15 compute-1 sudo[287296]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:30:15 compute-1 sudo[287296]: pam_unix(sudo:session): session closed for user root
Sep 30 18:30:15 compute-1 ceph-mon[75484]: pgmap v1548: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 40 GiB / 40 GiB avail; 60 KiB/s rd, 1.2 KiB/s wr, 95 op/s
Sep 30 18:30:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:16.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:17.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:17 compute-1 ceph-mon[75484]: pgmap v1549: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 40 GiB / 40 GiB avail; 41 KiB/s rd, 0 B/s wr, 67 op/s
Sep 30 18:30:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:18 compute-1 nova_compute[238822]: 2025-09-30 18:30:18.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:18.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:30:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:19.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:19 compute-1 openstack_network_exporter[251957]: ERROR   18:30:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:30:19 compute-1 openstack_network_exporter[251957]: ERROR   18:30:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:30:19 compute-1 openstack_network_exporter[251957]: ERROR   18:30:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:30:19 compute-1 openstack_network_exporter[251957]: ERROR   18:30:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:30:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:30:19 compute-1 openstack_network_exporter[251957]: ERROR   18:30:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:30:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:30:19 compute-1 ceph-mon[75484]: pgmap v1550: 353 pgs: 353 active+clean; 65 MiB data, 279 MiB used, 40 GiB / 40 GiB avail; 69 KiB/s rd, 901 KiB/s wr, 111 op/s
Sep 30 18:30:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:20 compute-1 sshd-session[287325]: Invalid user halo from 216.10.242.161 port 35980
Sep 30 18:30:20 compute-1 sshd-session[287325]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:30:20 compute-1 sshd-session[287325]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:30:20 compute-1 podman[287329]: 2025-09-30 18:30:20.243032297 +0000 UTC m=+0.083239712 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Sep 30 18:30:20 compute-1 nova_compute[238822]: 2025-09-30 18:30:20.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:20.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:21.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:21 compute-1 ceph-mon[75484]: pgmap v1551: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 89 KiB/s rd, 1.8 MiB/s wr, 146 op/s
Sep 30 18:30:21 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/12074083' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:30:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:22 compute-1 sshd-session[287325]: Failed password for invalid user halo from 216.10.242.161 port 35980 ssh2
Sep 30 18:30:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:22.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:30:22 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3754482804' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:30:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:23 compute-1 nova_compute[238822]: 2025-09-30 18:30:23.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:23 compute-1 sshd-session[287325]: Received disconnect from 216.10.242.161 port 35980:11: Bye Bye [preauth]
Sep 30 18:30:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:23 compute-1 sshd-session[287325]: Disconnected from invalid user halo 216.10.242.161 port 35980 [preauth]
Sep 30 18:30:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:30:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:23.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:30:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:23 compute-1 ceph-mon[75484]: pgmap v1552: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 89 KiB/s rd, 1.8 MiB/s wr, 146 op/s
Sep 30 18:30:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:30:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:24.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:25 compute-1 sshd-session[287294]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:30:25 compute-1 sshd-session[287294]: banner exchange: Connection from 110.42.70.108 port 56324: Connection timed out
Sep 30 18:30:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:25.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:25 compute-1 nova_compute[238822]: 2025-09-30 18:30:25.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:25 compute-1 ceph-mon[75484]: pgmap v1553: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 89 KiB/s rd, 1.8 MiB/s wr, 146 op/s
Sep 30 18:30:26 compute-1 podman[287355]: 2025-09-30 18:30:26.558941388 +0000 UTC m=+0.093718124 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 18:30:26 compute-1 podman[287356]: 2025-09-30 18:30:26.557856019 +0000 UTC m=+0.086982492 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:30:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:30:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:26.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:30:26 compute-1 podman[287357]: 2025-09-30 18:30:26.590116061 +0000 UTC m=+0.113944251 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 18:30:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:27.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:27 compute-1 ceph-mon[75484]: pgmap v1554: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 40 GiB / 40 GiB avail; 49 KiB/s rd, 1.8 MiB/s wr, 79 op/s
Sep 30 18:30:28 compute-1 nova_compute[238822]: 2025-09-30 18:30:28.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:28.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:30:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:29.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:29 compute-1 ceph-mon[75484]: pgmap v1555: 353 pgs: 353 active+clean; 88 MiB data, 290 MiB used, 40 GiB / 40 GiB avail; 603 KiB/s rd, 1.8 MiB/s wr, 106 op/s
Sep 30 18:30:30 compute-1 nova_compute[238822]: 2025-09-30 18:30:30.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:30.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:31.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:32 compute-1 ceph-mon[75484]: pgmap v1556: 353 pgs: 353 active+clean; 88 MiB data, 290 MiB used, 40 GiB / 40 GiB avail; 1.5 MiB/s rd, 927 KiB/s wr, 97 op/s
Sep 30 18:30:32 compute-1 nova_compute[238822]: 2025-09-30 18:30:32.152 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "5321575c-f6c1-4500-adf7-285c22df2e73" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:32 compute-1 nova_compute[238822]: 2025-09-30 18:30:32.153 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:32.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:32 compute-1 nova_compute[238822]: 2025-09-30 18:30:32.660 2 DEBUG nova.compute.manager [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:30:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:33 compute-1 nova_compute[238822]: 2025-09-30 18:30:33.226 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:33 compute-1 nova_compute[238822]: 2025-09-30 18:30:33.227 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:33 compute-1 nova_compute[238822]: 2025-09-30 18:30:33.238 2 DEBUG nova.virt.hardware [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:30:33 compute-1 nova_compute[238822]: 2025-09-30 18:30:33.238 2 INFO nova.compute.claims [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:30:33 compute-1 nova_compute[238822]: 2025-09-30 18:30:33.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:33.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:30:34 compute-1 ceph-mon[75484]: pgmap v1557: 353 pgs: 353 active+clean; 88 MiB data, 290 MiB used, 40 GiB / 40 GiB avail; 1.5 MiB/s rd, 13 KiB/s wr, 61 op/s
Sep 30 18:30:34 compute-1 nova_compute[238822]: 2025-09-30 18:30:34.294 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:30:34 compute-1 unix_chkpwd[287424]: password check failed for user (root)
Sep 30 18:30:34 compute-1 sshd-session[287419]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:30:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:34.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:30:34 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2703800171' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:30:34 compute-1 nova_compute[238822]: 2025-09-30 18:30:34.794 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:30:34 compute-1 nova_compute[238822]: 2025-09-30 18:30:34.800 2 DEBUG nova.compute.provider_tree [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:30:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:35 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2703800171' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:30:35 compute-1 nova_compute[238822]: 2025-09-30 18:30:35.326 2 DEBUG nova.scheduler.client.report [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:30:35 compute-1 nova_compute[238822]: 2025-09-30 18:30:35.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:35.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:35 compute-1 podman[249638]: time="2025-09-30T18:30:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:30:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:30:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:30:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:30:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8359 "" "Go-http-client/1.1"
Sep 30 18:30:35 compute-1 sudo[287447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:30:35 compute-1 sudo[287447]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:30:35 compute-1 sudo[287447]: pam_unix(sudo:session): session closed for user root
Sep 30 18:30:35 compute-1 nova_compute[238822]: 2025-09-30 18:30:35.838 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.611s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:35 compute-1 nova_compute[238822]: 2025-09-30 18:30:35.839 2 DEBUG nova.compute.manager [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:30:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:36 compute-1 ceph-mon[75484]: pgmap v1558: 353 pgs: 353 active+clean; 88 MiB data, 290 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.069509) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257036069606, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 1680, "num_deletes": 251, "total_data_size": 3991288, "memory_usage": 4055400, "flush_reason": "Manual Compaction"}
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257036088210, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 2587406, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42375, "largest_seqno": 44050, "table_properties": {"data_size": 2580464, "index_size": 3949, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 15023, "raw_average_key_size": 20, "raw_value_size": 2566478, "raw_average_value_size": 3454, "num_data_blocks": 173, "num_entries": 743, "num_filter_entries": 743, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759256896, "oldest_key_time": 1759256896, "file_creation_time": 1759257036, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 18794 microseconds, and 7233 cpu microseconds.
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.088306) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 2587406 bytes OK
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.088337) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.090286) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.090310) EVENT_LOG_v1 {"time_micros": 1759257036090302, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.090330) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 3983549, prev total WAL file size 3983549, number of live WAL files 2.
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.091957) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033353134' seq:72057594037927935, type:22 .. '7061786F730033373636' seq:0, type:0; will stop at (end)
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(2526KB)], [81(10MB)]
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257036092000, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 13756744, "oldest_snapshot_seqno": -1}
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 6749 keys, 11791340 bytes, temperature: kUnknown
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257036157130, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 11791340, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11749782, "index_size": 23590, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16901, "raw_key_size": 174193, "raw_average_key_size": 25, "raw_value_size": 11632094, "raw_average_value_size": 1723, "num_data_blocks": 937, "num_entries": 6749, "num_filter_entries": 6749, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759257036, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.157524) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 11791340 bytes
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.159259) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 210.7 rd, 180.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 10.7 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(9.9) write-amplify(4.6) OK, records in: 7267, records dropped: 518 output_compression: NoCompression
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.159294) EVENT_LOG_v1 {"time_micros": 1759257036159275, "job": 50, "event": "compaction_finished", "compaction_time_micros": 65287, "compaction_time_cpu_micros": 47027, "output_level": 6, "num_output_files": 1, "total_output_size": 11791340, "num_input_records": 7267, "num_output_records": 6749, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257036160446, "job": 50, "event": "table_file_deletion", "file_number": 83}
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257036164428, "job": 50, "event": "table_file_deletion", "file_number": 81}
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.091863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.164591) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.164599) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.164603) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.164606) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:30:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:36.164609) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:30:36 compute-1 nova_compute[238822]: 2025-09-30 18:30:36.352 2 DEBUG nova.compute.manager [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:30:36 compute-1 nova_compute[238822]: 2025-09-30 18:30:36.352 2 DEBUG nova.network.neutron [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:30:36 compute-1 nova_compute[238822]: 2025-09-30 18:30:36.353 2 WARNING neutronclient.v2_0.client [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:30:36 compute-1 nova_compute[238822]: 2025-09-30 18:30:36.354 2 WARNING neutronclient.v2_0.client [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:30:36 compute-1 sshd-session[287419]: Failed password for root from 192.210.160.141 port 43088 ssh2
Sep 30 18:30:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:36.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:36 compute-1 nova_compute[238822]: 2025-09-30 18:30:36.868 2 INFO nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:30:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3601109290' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:30:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3601109290' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:30:37 compute-1 nova_compute[238822]: 2025-09-30 18:30:37.377 2 DEBUG nova.compute.manager [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:30:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:37.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:37 compute-1 sshd-session[287419]: Connection closed by authenticating user root 192.210.160.141 port 43088 [preauth]
Sep 30 18:30:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:38 compute-1 ceph-mon[75484]: pgmap v1559: 353 pgs: 353 active+clean; 88 MiB data, 290 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:30:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.398 2 DEBUG nova.compute.manager [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.400 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.400 2 INFO nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Creating image(s)
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.439 2 DEBUG nova.storage.rbd_utils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 5321575c-f6c1-4500-adf7-285c22df2e73_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.479 2 DEBUG nova.storage.rbd_utils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 5321575c-f6c1-4500-adf7-285c22df2e73_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.516 2 DEBUG nova.storage.rbd_utils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 5321575c-f6c1-4500-adf7-285c22df2e73_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.521 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.564 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.565 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:30:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:38.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.602 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.603 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.604 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.605 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.638 2 DEBUG nova.storage.rbd_utils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 5321575c-f6c1-4500-adf7-285c22df2e73_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.643 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 5321575c-f6c1-4500-adf7-285c22df2e73_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:30:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.944 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 5321575c-f6c1-4500-adf7-285c22df2e73_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.301s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:30:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:38.949 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:30:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:38.950 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:30:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:30:38 compute-1 nova_compute[238822]: 2025-09-30 18:30:38.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:39 compute-1 nova_compute[238822]: 2025-09-30 18:30:39.041 2 DEBUG nova.storage.rbd_utils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] resizing rbd image 5321575c-f6c1-4500-adf7-285c22df2e73_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:30:39 compute-1 nova_compute[238822]: 2025-09-30 18:30:39.098 2 DEBUG nova.network.neutron [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Successfully created port: afaa4f9e-eab6-432e-9b39-d80bb074577d _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:30:39 compute-1 nova_compute[238822]: 2025-09-30 18:30:39.171 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:30:39 compute-1 nova_compute[238822]: 2025-09-30 18:30:39.172 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Ensure instance console log exists: /var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:30:39 compute-1 nova_compute[238822]: 2025-09-30 18:30:39.173 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:39 compute-1 nova_compute[238822]: 2025-09-30 18:30:39.174 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:39 compute-1 nova_compute[238822]: 2025-09-30 18:30:39.174 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:39.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:39.952 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:30:40 compute-1 ceph-mon[75484]: pgmap v1560: 353 pgs: 353 active+clean; 102 MiB data, 321 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 1.1 MiB/s wr, 102 op/s
Sep 30 18:30:40 compute-1 nova_compute[238822]: 2025-09-30 18:30:40.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:40.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #85. Immutable memtables: 0.
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:40.978230) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 51] Flushing memtable with next log file: 85
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257040978335, "job": 51, "event": "flush_started", "num_memtables": 1, "num_entries": 303, "num_deletes": 255, "total_data_size": 124216, "memory_usage": 130080, "flush_reason": "Manual Compaction"}
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 51] Level-0 flush table #86: started
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257040981330, "cf_name": "default", "job": 51, "event": "table_file_creation", "file_number": 86, "file_size": 81300, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 44055, "largest_seqno": 44353, "table_properties": {"data_size": 79372, "index_size": 156, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4880, "raw_average_key_size": 17, "raw_value_size": 75526, "raw_average_value_size": 271, "num_data_blocks": 7, "num_entries": 278, "num_filter_entries": 278, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759257037, "oldest_key_time": 1759257037, "file_creation_time": 1759257040, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 86, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 51] Flush lasted 3107 microseconds, and 1315 cpu microseconds.
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:40.981382) [db/flush_job.cc:967] [default] [JOB 51] Level-0 flush table #86: 81300 bytes OK
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:40.981404) [db/memtable_list.cc:519] [default] Level-0 commit table #86 started
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:40.982781) [db/memtable_list.cc:722] [default] Level-0 commit table #86: memtable #1 done
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:40.982799) EVENT_LOG_v1 {"time_micros": 1759257040982793, "job": 51, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:40.982820) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 51] Try to delete WAL files size 121984, prev total WAL file size 121984, number of live WAL files 2.
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000082.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:40.983243) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031323532' seq:72057594037927935, type:22 .. '6C6F676D0031353033' seq:0, type:0; will stop at (end)
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 52] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 51 Base level 0, inputs: [86(79KB)], [84(11MB)]
Sep 30 18:30:40 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257040983292, "job": 52, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [86], "files_L6": [84], "score": -1, "input_data_size": 11872640, "oldest_snapshot_seqno": -1}
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 52] Generated table #87: 6510 keys, 11757714 bytes, temperature: kUnknown
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257041052192, "cf_name": "default", "job": 52, "event": "table_file_creation", "file_number": 87, "file_size": 11757714, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11717177, "index_size": 23147, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16325, "raw_key_size": 170171, "raw_average_key_size": 26, "raw_value_size": 11603083, "raw_average_value_size": 1782, "num_data_blocks": 915, "num_entries": 6510, "num_filter_entries": 6510, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759257040, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 87, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:41.052526) [db/compaction/compaction_job.cc:1663] [default] [JOB 52] Compacted 1@0 + 1@6 files to L6 => 11757714 bytes
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:41.053985) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.1 rd, 170.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 11.2 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(290.7) write-amplify(144.6) OK, records in: 7027, records dropped: 517 output_compression: NoCompression
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:41.054016) EVENT_LOG_v1 {"time_micros": 1759257041054003, "job": 52, "event": "compaction_finished", "compaction_time_micros": 68986, "compaction_time_cpu_micros": 48140, "output_level": 6, "num_output_files": 1, "total_output_size": 11757714, "num_input_records": 7027, "num_output_records": 6510, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000086.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257041054217, "job": 52, "event": "table_file_deletion", "file_number": 86}
Sep 30 18:30:41 compute-1 nova_compute[238822]: 2025-09-30 18:30:41.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:30:41 compute-1 nova_compute[238822]: 2025-09-30 18:30:41.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000084.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257041058352, "job": 52, "event": "table_file_deletion", "file_number": 84}
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:40.983149) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:41.058408) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:41.058412) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:41.058414) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:41.058416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:30:41 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:30:41.058418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:30:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:41.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:41 compute-1 nova_compute[238822]: 2025-09-30 18:30:41.569 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:41 compute-1 nova_compute[238822]: 2025-09-30 18:30:41.570 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:41 compute-1 nova_compute[238822]: 2025-09-30 18:30:41.570 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:41 compute-1 nova_compute[238822]: 2025-09-30 18:30:41.571 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:30:41 compute-1 nova_compute[238822]: 2025-09-30 18:30:41.571 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:30:41 compute-1 sudo[287646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:30:41 compute-1 sudo[287646]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:30:41 compute-1 sudo[287646]: pam_unix(sudo:session): session closed for user root
Sep 30 18:30:41 compute-1 sudo[287690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:30:41 compute-1 sudo[287690]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:30:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:42 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:30:42 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1136952851' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:30:42 compute-1 nova_compute[238822]: 2025-09-30 18:30:42.118 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.547s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:30:42 compute-1 ceph-mon[75484]: pgmap v1561: 353 pgs: 353 active+clean; 113 MiB data, 349 MiB used, 40 GiB / 40 GiB avail; 1.7 MiB/s rd, 2.1 MiB/s wr, 98 op/s
Sep 30 18:30:42 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1136952851' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:30:42 compute-1 nova_compute[238822]: 2025-09-30 18:30:42.322 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:30:42 compute-1 nova_compute[238822]: 2025-09-30 18:30:42.325 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:30:42 compute-1 nova_compute[238822]: 2025-09-30 18:30:42.357 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:30:42 compute-1 nova_compute[238822]: 2025-09-30 18:30:42.358 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4748MB free_disk=39.947513580322266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:30:42 compute-1 nova_compute[238822]: 2025-09-30 18:30:42.358 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:42 compute-1 nova_compute[238822]: 2025-09-30 18:30:42.358 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:42 compute-1 nova_compute[238822]: 2025-09-30 18:30:42.474 2 DEBUG nova.network.neutron [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Successfully updated port: afaa4f9e-eab6-432e-9b39-d80bb074577d _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:30:42 compute-1 nova_compute[238822]: 2025-09-30 18:30:42.541 2 DEBUG nova.compute.manager [req-899629bd-2853-4846-aff7-74649761d8f1 req-4a902626-a090-4744-8ee4-4458340d948c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received event network-changed-afaa4f9e-eab6-432e-9b39-d80bb074577d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:30:42 compute-1 nova_compute[238822]: 2025-09-30 18:30:42.541 2 DEBUG nova.compute.manager [req-899629bd-2853-4846-aff7-74649761d8f1 req-4a902626-a090-4744-8ee4-4458340d948c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Refreshing instance network info cache due to event network-changed-afaa4f9e-eab6-432e-9b39-d80bb074577d. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:30:42 compute-1 nova_compute[238822]: 2025-09-30 18:30:42.542 2 DEBUG oslo_concurrency.lockutils [req-899629bd-2853-4846-aff7-74649761d8f1 req-4a902626-a090-4744-8ee4-4458340d948c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-5321575c-f6c1-4500-adf7-285c22df2e73" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:30:42 compute-1 nova_compute[238822]: 2025-09-30 18:30:42.542 2 DEBUG oslo_concurrency.lockutils [req-899629bd-2853-4846-aff7-74649761d8f1 req-4a902626-a090-4744-8ee4-4458340d948c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-5321575c-f6c1-4500-adf7-285c22df2e73" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:30:42 compute-1 nova_compute[238822]: 2025-09-30 18:30:42.543 2 DEBUG nova.network.neutron [req-899629bd-2853-4846-aff7-74649761d8f1 req-4a902626-a090-4744-8ee4-4458340d948c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Refreshing network info cache for port afaa4f9e-eab6-432e-9b39-d80bb074577d _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:30:42 compute-1 sudo[287690]: pam_unix(sudo:session): session closed for user root
Sep 30 18:30:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:42.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.050 2 WARNING neutronclient.v2_0.client [req-899629bd-2853-4846-aff7-74649761d8f1 req-4a902626-a090-4744-8ee4-4458340d948c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.063 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "refresh_cache-5321575c-f6c1-4500-adf7-285c22df2e73" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:30:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:30:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:30:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:30:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:30:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:30:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:30:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:43.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.483 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 5321575c-f6c1-4500-adf7-285c22df2e73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.484 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.484 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:30:42 up  4:08,  0 user,  load average: 0.28, 0.40, 0.67\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_c634e1c17ed54907969576a0eb8eff50': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.504 2 DEBUG nova.network.neutron [req-899629bd-2853-4846-aff7-74649761d8f1 req-4a902626-a090-4744-8ee4-4458340d948c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.542 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing inventories for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.588 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating ProviderTree inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.589 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.606 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing aggregate associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.629 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing trait associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE41,HW_CPU_X86_ABM,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_TIS,HW_ARCH_X86_64,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.672 2 DEBUG nova.network.neutron [req-899629bd-2853-4846-aff7-74649761d8f1 req-4a902626-a090-4744-8ee4-4458340d948c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:30:43 compute-1 nova_compute[238822]: 2025-09-30 18:30:43.674 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:30:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:30:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:30:44 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2822621828' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:30:44 compute-1 ceph-mon[75484]: pgmap v1562: 353 pgs: 353 active+clean; 113 MiB data, 349 MiB used, 40 GiB / 40 GiB avail; 735 KiB/s rd, 2.0 MiB/s wr, 64 op/s
Sep 30 18:30:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2209848134' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:30:44 compute-1 nova_compute[238822]: 2025-09-30 18:30:44.177 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:30:44 compute-1 nova_compute[238822]: 2025-09-30 18:30:44.185 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:30:44 compute-1 nova_compute[238822]: 2025-09-30 18:30:44.188 2 DEBUG oslo_concurrency.lockutils [req-899629bd-2853-4846-aff7-74649761d8f1 req-4a902626-a090-4744-8ee4-4458340d948c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-5321575c-f6c1-4500-adf7-285c22df2e73" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:30:44 compute-1 nova_compute[238822]: 2025-09-30 18:30:44.189 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquired lock "refresh_cache-5321575c-f6c1-4500-adf7-285c22df2e73" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:30:44 compute-1 nova_compute[238822]: 2025-09-30 18:30:44.189 2 DEBUG nova.network.neutron [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:30:44 compute-1 podman[287775]: 2025-09-30 18:30:44.557967107 +0000 UTC m=+0.091162795 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:30:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:44.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:44 compute-1 podman[287774]: 2025-09-30 18:30:44.618335069 +0000 UTC m=+0.153117820 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Sep 30 18:30:44 compute-1 nova_compute[238822]: 2025-09-30 18:30:44.696 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:30:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:45 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2822621828' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:30:45 compute-1 nova_compute[238822]: 2025-09-30 18:30:45.213 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:30:45 compute-1 nova_compute[238822]: 2025-09-30 18:30:45.213 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.855s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:45.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:45 compute-1 nova_compute[238822]: 2025-09-30 18:30:45.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:45 compute-1 nova_compute[238822]: 2025-09-30 18:30:45.480 2 DEBUG nova.network.neutron [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:30:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:46 compute-1 ceph-mon[75484]: pgmap v1563: 353 pgs: 353 active+clean; 167 MiB data, 371 MiB used, 40 GiB / 40 GiB avail; 814 KiB/s rd, 3.9 MiB/s wr, 106 op/s
Sep 30 18:30:46 compute-1 nova_compute[238822]: 2025-09-30 18:30:46.527 2 WARNING neutronclient.v2_0.client [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:30:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:46.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:46 compute-1 nova_compute[238822]: 2025-09-30 18:30:46.724 2 DEBUG nova.network.neutron [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Updating instance_info_cache with network_info: [{"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:30:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:47 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/953508383' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:30:47 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:30:47 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.232 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Releasing lock "refresh_cache-5321575c-f6c1-4500-adf7-285c22df2e73" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.233 2 DEBUG nova.compute.manager [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Instance network_info: |[{"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.237 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Start _get_guest_xml network_info=[{"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.242 2 WARNING nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.244 2 DEBUG nova.virt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-876288154', uuid='5321575c-f6c1-4500-adf7-285c22df2e73'), owner=OwnerMeta(userid='623ef4a55c9e4fc28bb65e49246b5008', username='tempest-TestExecuteStrategies-1883747907-project-admin', projectid='c634e1c17ed54907969576a0eb8eff50', projectname='tempest-TestExecuteStrategies-1883747907'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759257047.2444055) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.250 2 DEBUG nova.virt.libvirt.host [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.251 2 DEBUG nova.virt.libvirt.host [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.255 2 DEBUG nova.virt.libvirt.host [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.256 2 DEBUG nova.virt.libvirt.host [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.256 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.257 2 DEBUG nova.virt.hardware [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.257 2 DEBUG nova.virt.hardware [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.258 2 DEBUG nova.virt.hardware [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.258 2 DEBUG nova.virt.hardware [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.259 2 DEBUG nova.virt.hardware [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.259 2 DEBUG nova.virt.hardware [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.259 2 DEBUG nova.virt.hardware [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.260 2 DEBUG nova.virt.hardware [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.260 2 DEBUG nova.virt.hardware [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.261 2 DEBUG nova.virt.hardware [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.261 2 DEBUG nova.virt.hardware [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.266 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:30:47 compute-1 sudo[287827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:30:47 compute-1 sudo[287827]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:30:47 compute-1 sudo[287827]: pam_unix(sudo:session): session closed for user root
Sep 30 18:30:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:47.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:47 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:30:47 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1857050196' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.713 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.750 2 DEBUG nova.storage.rbd_utils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 5321575c-f6c1-4500-adf7-285c22df2e73_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:30:47 compute-1 nova_compute[238822]: 2025-09-30 18:30:47.755 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:30:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:30:48 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3195939406' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.200 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.203 2 DEBUG nova.virt.libvirt.vif [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:30:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-876288154',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-876288154',id=23,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-ce22n40f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:30:37Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=5321575c-f6c1-4500-adf7-285c22df2e73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.204 2 DEBUG nova.network.os_vif_util [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:30:48 compute-1 ceph-mon[75484]: pgmap v1564: 353 pgs: 353 active+clean; 167 MiB data, 371 MiB used, 40 GiB / 40 GiB avail; 407 KiB/s rd, 3.9 MiB/s wr, 93 op/s
Sep 30 18:30:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1857050196' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:30:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3195939406' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.205 2 DEBUG nova.network.os_vif_util [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:37:f0,bridge_name='br-int',has_traffic_filtering=True,id=afaa4f9e-eab6-432e-9b39-d80bb074577d,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafaa4f9e-ea') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.207 2 DEBUG nova.objects.instance [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5321575c-f6c1-4500-adf7-285c22df2e73 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.213 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.214 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.214 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:48.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.717 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:30:48 compute-1 nova_compute[238822]:   <uuid>5321575c-f6c1-4500-adf7-285c22df2e73</uuid>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   <name>instance-00000017</name>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteStrategies-server-876288154</nova:name>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:30:47</nova:creationTime>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:30:48 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:30:48 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:user uuid="623ef4a55c9e4fc28bb65e49246b5008">tempest-TestExecuteStrategies-1883747907-project-admin</nova:user>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:project uuid="c634e1c17ed54907969576a0eb8eff50">tempest-TestExecuteStrategies-1883747907</nova:project>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <nova:port uuid="afaa4f9e-eab6-432e-9b39-d80bb074577d">
Sep 30 18:30:48 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <system>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <entry name="serial">5321575c-f6c1-4500-adf7-285c22df2e73</entry>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <entry name="uuid">5321575c-f6c1-4500-adf7-285c22df2e73</entry>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     </system>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   <os>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   </os>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   <features>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   </features>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/5321575c-f6c1-4500-adf7-285c22df2e73_disk">
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       </source>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/5321575c-f6c1-4500-adf7-285c22df2e73_disk.config">
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       </source>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:30:48 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:be:37:f0"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <target dev="tapafaa4f9e-ea"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/console.log" append="off"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <video>
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     </video>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:30:48 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:30:48 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:30:48 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:30:48 compute-1 nova_compute[238822]: </domain>
Sep 30 18:30:48 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.718 2 DEBUG nova.compute.manager [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Preparing to wait for external event network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.718 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.719 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.719 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.720 2 DEBUG nova.virt.libvirt.vif [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:30:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-876288154',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-876288154',id=23,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-ce22n40f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:30:37Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=5321575c-f6c1-4500-adf7-285c22df2e73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.721 2 DEBUG nova.network.os_vif_util [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.722 2 DEBUG nova.network.os_vif_util [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:37:f0,bridge_name='br-int',has_traffic_filtering=True,id=afaa4f9e-eab6-432e-9b39-d80bb074577d,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafaa4f9e-ea') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.722 2 DEBUG os_vif [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:37:f0,bridge_name='br-int',has_traffic_filtering=True,id=afaa4f9e-eab6-432e-9b39-d80bb074577d,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafaa4f9e-ea') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.724 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.725 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.726 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'af99def9-07df-5fb6-bc06-1985311686d7', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.733 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafaa4f9e-ea, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.734 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapafaa4f9e-ea, col_values=(('qos', UUID('b27914bd-a892-4e9a-b9b1-46b8d786241c')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.734 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapafaa4f9e-ea, col_values=(('external_ids', {'iface-id': 'afaa4f9e-eab6-432e-9b39-d80bb074577d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:37:f0', 'vm-uuid': '5321575c-f6c1-4500-adf7-285c22df2e73'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:48 compute-1 NetworkManager[45549]: <info>  [1759257048.7377] manager: (tapafaa4f9e-ea): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:48 compute-1 nova_compute[238822]: 2025-09-30 18:30:48.748 2 INFO os_vif [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:37:f0,bridge_name='br-int',has_traffic_filtering=True,id=afaa4f9e-eab6-432e-9b39-d80bb074577d,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafaa4f9e-ea')
Sep 30 18:30:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:30:49 compute-1 nova_compute[238822]: 2025-09-30 18:30:49.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:30:49 compute-1 openstack_network_exporter[251957]: ERROR   18:30:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:30:49 compute-1 openstack_network_exporter[251957]: ERROR   18:30:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:30:49 compute-1 openstack_network_exporter[251957]: ERROR   18:30:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:30:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:30:49 compute-1 openstack_network_exporter[251957]: ERROR   18:30:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:30:49 compute-1 openstack_network_exporter[251957]: ERROR   18:30:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:30:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:30:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:49.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:50 compute-1 nova_compute[238822]: 2025-09-30 18:30:50.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:30:50 compute-1 ceph-mon[75484]: pgmap v1565: 353 pgs: 353 active+clean; 167 MiB data, 371 MiB used, 40 GiB / 40 GiB avail; 407 KiB/s rd, 3.9 MiB/s wr, 93 op/s
Sep 30 18:30:50 compute-1 nova_compute[238822]: 2025-09-30 18:30:50.296 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:30:50 compute-1 nova_compute[238822]: 2025-09-30 18:30:50.297 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:30:50 compute-1 nova_compute[238822]: 2025-09-30 18:30:50.297 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] No VIF found with MAC fa:16:3e:be:37:f0, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:30:50 compute-1 nova_compute[238822]: 2025-09-30 18:30:50.298 2 INFO nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Using config drive
Sep 30 18:30:50 compute-1 nova_compute[238822]: 2025-09-30 18:30:50.335 2 DEBUG nova.storage.rbd_utils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 5321575c-f6c1-4500-adf7-285c22df2e73_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:30:50 compute-1 podman[287937]: 2025-09-30 18:30:50.517951761 +0000 UTC m=+0.062782228 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Sep 30 18:30:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:50.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:50 compute-1 nova_compute[238822]: 2025-09-30 18:30:50.853 2 WARNING neutronclient.v2_0.client [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:30:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:51.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:51 compute-1 nova_compute[238822]: 2025-09-30 18:30:51.464 2 INFO nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Creating config drive at /var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/disk.config
Sep 30 18:30:51 compute-1 nova_compute[238822]: 2025-09-30 18:30:51.473 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp8qqf15wd execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:30:51 compute-1 nova_compute[238822]: 2025-09-30 18:30:51.615 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp8qqf15wd" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:30:51 compute-1 nova_compute[238822]: 2025-09-30 18:30:51.658 2 DEBUG nova.storage.rbd_utils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] rbd image 5321575c-f6c1-4500-adf7-285c22df2e73_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:30:51 compute-1 nova_compute[238822]: 2025-09-30 18:30:51.663 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/disk.config 5321575c-f6c1-4500-adf7-285c22df2e73_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:30:51 compute-1 nova_compute[238822]: 2025-09-30 18:30:51.875 2 DEBUG oslo_concurrency.processutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/disk.config 5321575c-f6c1-4500-adf7-285c22df2e73_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.212s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:30:51 compute-1 nova_compute[238822]: 2025-09-30 18:30:51.876 2 INFO nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Deleting local config drive /var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/disk.config because it was imported into RBD.
Sep 30 18:30:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:51 compute-1 kernel: tapafaa4f9e-ea: entered promiscuous mode
Sep 30 18:30:51 compute-1 NetworkManager[45549]: <info>  [1759257051.9393] manager: (tapafaa4f9e-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Sep 30 18:30:51 compute-1 ovn_controller[135204]: 2025-09-30T18:30:51Z|00189|binding|INFO|Claiming lport afaa4f9e-eab6-432e-9b39-d80bb074577d for this chassis.
Sep 30 18:30:51 compute-1 ovn_controller[135204]: 2025-09-30T18:30:51Z|00190|binding|INFO|afaa4f9e-eab6-432e-9b39-d80bb074577d: Claiming fa:16:3e:be:37:f0 10.100.0.10
Sep 30 18:30:51 compute-1 nova_compute[238822]: 2025-09-30 18:30:51.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:51.949 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:37:f0 10.100.0.10'], port_security=['fa:16:3e:be:37:f0 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5321575c-f6c1-4500-adf7-285c22df2e73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '4', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=afaa4f9e-eab6-432e-9b39-d80bb074577d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:30:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:51.950 144543 INFO neutron.agent.ovn.metadata.agent [-] Port afaa4f9e-eab6-432e-9b39-d80bb074577d in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 bound to our chassis
Sep 30 18:30:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:51.952 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:30:51 compute-1 ovn_controller[135204]: 2025-09-30T18:30:51Z|00191|binding|INFO|Setting lport afaa4f9e-eab6-432e-9b39-d80bb074577d ovn-installed in OVS
Sep 30 18:30:51 compute-1 ovn_controller[135204]: 2025-09-30T18:30:51Z|00192|binding|INFO|Setting lport afaa4f9e-eab6-432e-9b39-d80bb074577d up in Southbound
Sep 30 18:30:51 compute-1 nova_compute[238822]: 2025-09-30 18:30:51.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:51.966 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[90d47dd2-c11c-4143-ba87-09b9b9a14992]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:51 compute-1 nova_compute[238822]: 2025-09-30 18:30:51.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:51.967 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6901f664-31 in ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:30:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:51.970 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6901f664-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:30:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:51.970 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[bffde2c1-ed34-4698-b0ab-096cd70fe7ca]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:51.971 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[61da678f-7efb-45d0-9f7b-d0f3f087c909]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:51.990 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[6b589c3f-d0fd-410b-8c8d-ea1a0fde5018]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:51 compute-1 systemd-machined[195911]: New machine qemu-17-instance-00000017.
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.010 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea71470-ba9b-4b3c-b022-45ebffe230bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 systemd[1]: Started Virtual Machine qemu-17-instance-00000017.
Sep 30 18:30:52 compute-1 systemd-udevd[288016]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:30:52 compute-1 NetworkManager[45549]: <info>  [1759257052.0407] device (tapafaa4f9e-ea): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:30:52 compute-1 NetworkManager[45549]: <info>  [1759257052.0422] device (tapafaa4f9e-ea): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.053 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[277ea97f-5644-454b-b9c5-70edfbf36ff2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.061 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d8dd789c-7a12-4585-bc1f-1144e1456ace]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 systemd-udevd[288018]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:30:52 compute-1 NetworkManager[45549]: <info>  [1759257052.0626] manager: (tap6901f664-30): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.106 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[bc54a70f-4661-4854-a53a-d839a81e54c5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.110 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[64961fe0-7d64-47d3-a485-9b919102e774]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 NetworkManager[45549]: <info>  [1759257052.1439] device (tap6901f664-30): carrier: link connected
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.150 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[7ac8e1f5-a278-4fa3-a4a2-e09e4f13880c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.175 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[72ce68c9-acef-4ac0-b965-e2b4d11ebcdf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1489372, 'reachable_time': 38936, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288045, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.193 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[39a77d4c-72d2-4649-83aa-191d4771aa2e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:412a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1489372, 'tstamp': 1489372}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288046, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.212 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[df485429-222f-488f-addf-1a2be69da297]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1489372, 'reachable_time': 38936, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 288047, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 ceph-mon[75484]: pgmap v1566: 353 pgs: 353 active+clean; 167 MiB data, 371 MiB used, 40 GiB / 40 GiB avail; 229 KiB/s rd, 2.8 MiB/s wr, 65 op/s
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.249 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[48b05bfb-c8e6-4f7e-8311-d8ff2a119756]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.331 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[eb4ec4f4-0922-49a1-baca-b25d5dafa518]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.332 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.333 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.333 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:30:52 compute-1 nova_compute[238822]: 2025-09-30 18:30:52.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:52 compute-1 kernel: tap6901f664-30: entered promiscuous mode
Sep 30 18:30:52 compute-1 NetworkManager[45549]: <info>  [1759257052.3388] manager: (tap6901f664-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Sep 30 18:30:52 compute-1 nova_compute[238822]: 2025-09-30 18:30:52.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.341 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:30:52 compute-1 nova_compute[238822]: 2025-09-30 18:30:52.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:52 compute-1 ovn_controller[135204]: 2025-09-30T18:30:52Z|00193|binding|INFO|Releasing lport 5b6cbf18-1826-41d0-920f-e9db4f1a1832 from this chassis (sb_readonly=0)
Sep 30 18:30:52 compute-1 nova_compute[238822]: 2025-09-30 18:30:52.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.377 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8e453150-dd2f-4c8e-81bc-35d1090a8ea3]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.378 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.378 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.378 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 6901f664-336b-42d2-bbf7-58951befc8d1 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.379 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.380 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[419de29c-56a8-48c9-b8d5-9d881e31cedb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.381 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.381 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6936d728-3060-4124-91af-9cfdb14d99c9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.382 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:30:52 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:52.383 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'env', 'PROCESS_TAG=haproxy-6901f664-336b-42d2-bbf7-58951befc8d1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6901f664-336b-42d2-bbf7-58951befc8d1.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:30:52 compute-1 nova_compute[238822]: 2025-09-30 18:30:52.552 2 DEBUG nova.compute.manager [req-fb16a513-1132-4303-b0f6-6222b3d3b892 req-5d197a8f-c9d6-433a-a754-2c65199d9f27 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received event network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:30:52 compute-1 nova_compute[238822]: 2025-09-30 18:30:52.554 2 DEBUG oslo_concurrency.lockutils [req-fb16a513-1132-4303-b0f6-6222b3d3b892 req-5d197a8f-c9d6-433a-a754-2c65199d9f27 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:52 compute-1 nova_compute[238822]: 2025-09-30 18:30:52.555 2 DEBUG oslo_concurrency.lockutils [req-fb16a513-1132-4303-b0f6-6222b3d3b892 req-5d197a8f-c9d6-433a-a754-2c65199d9f27 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:52 compute-1 nova_compute[238822]: 2025-09-30 18:30:52.556 2 DEBUG oslo_concurrency.lockutils [req-fb16a513-1132-4303-b0f6-6222b3d3b892 req-5d197a8f-c9d6-433a-a754-2c65199d9f27 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:52 compute-1 nova_compute[238822]: 2025-09-30 18:30:52.556 2 DEBUG nova.compute.manager [req-fb16a513-1132-4303-b0f6-6222b3d3b892 req-5d197a8f-c9d6-433a-a754-2c65199d9f27 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Processing event network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:30:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:52.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:52 compute-1 podman[288095]: 2025-09-30 18:30:52.788896077 +0000 UTC m=+0.057313030 container create 5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Sep 30 18:30:52 compute-1 systemd[1]: Started libpod-conmon-5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8.scope.
Sep 30 18:30:52 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:30:52 compute-1 podman[288095]: 2025-09-30 18:30:52.758129855 +0000 UTC m=+0.026546838 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:30:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1df4b84602de9f8f28cb63b0d93cb8a6b0aef74ef35621ca0cff5e872368bf16/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:30:52 compute-1 podman[288095]: 2025-09-30 18:30:52.880863562 +0000 UTC m=+0.149280545 container init 5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 18:30:52 compute-1 podman[288095]: 2025-09-30 18:30:52.887486621 +0000 UTC m=+0.155903574 container start 5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:30:52 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[288135]: [NOTICE]   (288140) : New worker (288143) forked
Sep 30 18:30:52 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[288135]: [NOTICE]   (288140) : Loading success.
Sep 30 18:30:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:30:53 compute-1 nova_compute[238822]: 2025-09-30 18:30:53.401 2 DEBUG nova.compute.manager [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:30:53 compute-1 nova_compute[238822]: 2025-09-30 18:30:53.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:53 compute-1 nova_compute[238822]: 2025-09-30 18:30:53.407 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:30:53 compute-1 nova_compute[238822]: 2025-09-30 18:30:53.412 2 INFO nova.virt.libvirt.driver [-] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Instance spawned successfully.
Sep 30 18:30:53 compute-1 nova_compute[238822]: 2025-09-30 18:30:53.413 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:30:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:53.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:53 compute-1 nova_compute[238822]: 2025-09-30 18:30:53.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:53 compute-1 nova_compute[238822]: 2025-09-30 18:30:53.928 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:30:53 compute-1 nova_compute[238822]: 2025-09-30 18:30:53.928 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:30:53 compute-1 nova_compute[238822]: 2025-09-30 18:30:53.929 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:30:53 compute-1 nova_compute[238822]: 2025-09-30 18:30:53.929 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:30:53 compute-1 nova_compute[238822]: 2025-09-30 18:30:53.929 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:30:53 compute-1 nova_compute[238822]: 2025-09-30 18:30:53.930 2 DEBUG nova.virt.libvirt.driver [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:30:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:30:54 compute-1 ceph-mon[75484]: pgmap v1567: 353 pgs: 353 active+clean; 167 MiB data, 371 MiB used, 40 GiB / 40 GiB avail; 79 KiB/s rd, 1.9 MiB/s wr, 42 op/s
Sep 30 18:30:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:54.391 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:54.392 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:30:54.392 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:54 compute-1 nova_compute[238822]: 2025-09-30 18:30:54.442 2 INFO nova.compute.manager [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Took 16.04 seconds to spawn the instance on the hypervisor.
Sep 30 18:30:54 compute-1 nova_compute[238822]: 2025-09-30 18:30:54.443 2 DEBUG nova.compute.manager [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:30:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:54.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:54 compute-1 nova_compute[238822]: 2025-09-30 18:30:54.631 2 DEBUG nova.compute.manager [req-93ee30c4-68f6-45f8-8378-b4e4039374c9 req-b8670574-2baa-4b13-990e-5d2b89d1a73f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received event network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:30:54 compute-1 nova_compute[238822]: 2025-09-30 18:30:54.632 2 DEBUG oslo_concurrency.lockutils [req-93ee30c4-68f6-45f8-8378-b4e4039374c9 req-b8670574-2baa-4b13-990e-5d2b89d1a73f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:30:54 compute-1 nova_compute[238822]: 2025-09-30 18:30:54.632 2 DEBUG oslo_concurrency.lockutils [req-93ee30c4-68f6-45f8-8378-b4e4039374c9 req-b8670574-2baa-4b13-990e-5d2b89d1a73f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:30:54 compute-1 nova_compute[238822]: 2025-09-30 18:30:54.633 2 DEBUG oslo_concurrency.lockutils [req-93ee30c4-68f6-45f8-8378-b4e4039374c9 req-b8670574-2baa-4b13-990e-5d2b89d1a73f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:54 compute-1 nova_compute[238822]: 2025-09-30 18:30:54.633 2 DEBUG nova.compute.manager [req-93ee30c4-68f6-45f8-8378-b4e4039374c9 req-b8670574-2baa-4b13-990e-5d2b89d1a73f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] No waiting events found dispatching network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:30:54 compute-1 nova_compute[238822]: 2025-09-30 18:30:54.634 2 WARNING nova.compute.manager [req-93ee30c4-68f6-45f8-8378-b4e4039374c9 req-b8670574-2baa-4b13-990e-5d2b89d1a73f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received unexpected event network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d for instance with vm_state active and task_state None.
Sep 30 18:30:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:54 compute-1 nova_compute[238822]: 2025-09-30 18:30:54.981 2 INFO nova.compute.manager [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Took 21.81 seconds to build instance.
Sep 30 18:30:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:30:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:55.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:30:55 compute-1 nova_compute[238822]: 2025-09-30 18:30:55.491 2 DEBUG oslo_concurrency.lockutils [None req-6857aa12-a015-4b16-8b9c-750cf0f7485f 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.339s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:30:55 compute-1 sudo[288157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:30:55 compute-1 sudo[288157]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:30:55 compute-1 sudo[288157]: pam_unix(sudo:session): session closed for user root
Sep 30 18:30:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:56 compute-1 unix_chkpwd[288182]: password check failed for user (root)
Sep 30 18:30:56 compute-1 sshd-session[288155]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=84.51.43.58  user=root
Sep 30 18:30:56 compute-1 ceph-mon[75484]: pgmap v1568: 353 pgs: 353 active+clean; 167 MiB data, 372 MiB used, 40 GiB / 40 GiB avail; 86 KiB/s rd, 1.9 MiB/s wr, 52 op/s
Sep 30 18:30:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:56.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:57.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:57 compute-1 podman[288185]: 2025-09-30 18:30:57.565848361 +0000 UTC m=+0.099431989 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 18:30:57 compute-1 podman[288187]: 2025-09-30 18:30:57.577113775 +0000 UTC m=+0.101578396 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Sep 30 18:30:57 compute-1 podman[288186]: 2025-09-30 18:30:57.577173887 +0000 UTC m=+0.105499063 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, architecture=x86_64, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Sep 30 18:30:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:58 compute-1 sshd-session[288155]: Failed password for root from 84.51.43.58 port 58220 ssh2
Sep 30 18:30:58 compute-1 ceph-mon[75484]: pgmap v1569: 353 pgs: 353 active+clean; 167 MiB data, 372 MiB used, 40 GiB / 40 GiB avail; 7.7 KiB/s rd, 28 KiB/s wr, 11 op/s
Sep 30 18:30:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3237373477' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:30:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3237373477' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:30:58 compute-1 nova_compute[238822]: 2025-09-30 18:30:58.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:30:58.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:58 compute-1 nova_compute[238822]: 2025-09-30 18:30:58.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:30:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:30:58 compute-1 sshd-session[288155]: Received disconnect from 84.51.43.58 port 58220:11: Bye Bye [preauth]
Sep 30 18:30:58 compute-1 sshd-session[288155]: Disconnected from authenticating user root 84.51.43.58 port 58220 [preauth]
Sep 30 18:30:59 compute-1 sshd-session[288248]: Invalid user devuser from 192.210.160.141 port 55052
Sep 30 18:30:59 compute-1 sshd-session[288248]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:30:59 compute-1 sshd-session[288248]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:30:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:30:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:30:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:30:59.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:30:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:30:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:30:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:30:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:00 compute-1 ceph-mon[75484]: pgmap v1570: 353 pgs: 353 active+clean; 167 MiB data, 372 MiB used, 40 GiB / 40 GiB avail; 706 KiB/s rd, 28 KiB/s wr, 33 op/s
Sep 30 18:31:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:00.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:00 compute-1 sshd-session[288248]: Failed password for invalid user devuser from 192.210.160.141 port 55052 ssh2
Sep 30 18:31:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:01.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:02 compute-1 ceph-mon[75484]: pgmap v1571: 353 pgs: 353 active+clean; 167 MiB data, 372 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 28 KiB/s wr, 75 op/s
Sep 30 18:31:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:02.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:02 compute-1 sshd-session[288248]: Connection closed by invalid user devuser 192.210.160.141 port 55052 [preauth]
Sep 30 18:31:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:03 compute-1 nova_compute[238822]: 2025-09-30 18:31:03.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:03.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:03 compute-1 nova_compute[238822]: 2025-09-30 18:31:03.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:31:04 compute-1 ceph-mon[75484]: pgmap v1572: 353 pgs: 353 active+clean; 167 MiB data, 372 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 18 KiB/s wr, 74 op/s
Sep 30 18:31:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:04.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:05.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:05 compute-1 podman[249638]: time="2025-09-30T18:31:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:31:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:31:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:31:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:31:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8815 "" "Go-http-client/1.1"
Sep 30 18:31:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:06 compute-1 ovn_controller[135204]: 2025-09-30T18:31:06Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:be:37:f0 10.100.0.10
Sep 30 18:31:06 compute-1 ovn_controller[135204]: 2025-09-30T18:31:06Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:be:37:f0 10.100.0.10
Sep 30 18:31:06 compute-1 ceph-mon[75484]: pgmap v1573: 353 pgs: 353 active+clean; 167 MiB data, 372 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 18 KiB/s wr, 75 op/s
Sep 30 18:31:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:06.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:31:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:07.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:08 compute-1 ceph-mon[75484]: pgmap v1574: 353 pgs: 353 active+clean; 167 MiB data, 372 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1023 B/s wr, 65 op/s
Sep 30 18:31:08 compute-1 nova_compute[238822]: 2025-09-30 18:31:08.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:08.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:08 compute-1 nova_compute[238822]: 2025-09-30 18:31:08.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:31:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:09.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:10 compute-1 ceph-mon[75484]: pgmap v1575: 353 pgs: 353 active+clean; 186 MiB data, 387 MiB used, 40 GiB / 40 GiB avail; 2.0 MiB/s rd, 1.3 MiB/s wr, 102 op/s
Sep 30 18:31:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:10.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:11.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:12 compute-1 ceph-mon[75484]: pgmap v1576: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 1.4 MiB/s rd, 2.1 MiB/s wr, 100 op/s
Sep 30 18:31:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:12.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:13 compute-1 nova_compute[238822]: 2025-09-30 18:31:13.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:13.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:13 compute-1 unix_chkpwd[288271]: password check failed for user (root)
Sep 30 18:31:13 compute-1 sshd-session[288267]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201  user=root
Sep 30 18:31:13 compute-1 nova_compute[238822]: 2025-09-30 18:31:13.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:31:14 compute-1 ceph-mon[75484]: pgmap v1577: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 183 KiB/s rd, 2.1 MiB/s wr, 58 op/s
Sep 30 18:31:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:14.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:14 compute-1 sshd-session[288269]: Invalid user seekcy from 14.225.167.110 port 56952
Sep 30 18:31:14 compute-1 sshd-session[288269]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:31:14 compute-1 sshd-session[288269]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:31:15 compute-1 podman[288275]: 2025-09-30 18:31:15.045964551 +0000 UTC m=+0.085469001 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:31:15 compute-1 podman[288274]: 2025-09-30 18:31:15.065275353 +0000 UTC m=+0.112470991 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 18:31:15 compute-1 sshd-session[288267]: Failed password for root from 8.243.64.201 port 59190 ssh2
Sep 30 18:31:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:15.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:16 compute-1 sudo[288324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:31:16 compute-1 sudo[288324]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:31:16 compute-1 sudo[288324]: pam_unix(sudo:session): session closed for user root
Sep 30 18:31:16 compute-1 ceph-mon[75484]: pgmap v1578: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 184 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Sep 30 18:31:16 compute-1 sshd-session[288269]: Failed password for invalid user seekcy from 14.225.167.110 port 56952 ssh2
Sep 30 18:31:16 compute-1 sshd-session[288267]: Received disconnect from 8.243.64.201 port 59190:11: Bye Bye [preauth]
Sep 30 18:31:16 compute-1 sshd-session[288267]: Disconnected from authenticating user root 8.243.64.201 port 59190 [preauth]
Sep 30 18:31:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:16.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:17 compute-1 sshd-session[288269]: Received disconnect from 14.225.167.110 port 56952:11: Bye Bye [preauth]
Sep 30 18:31:17 compute-1 sshd-session[288269]: Disconnected from invalid user seekcy 14.225.167.110 port 56952 [preauth]
Sep 30 18:31:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:17.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:18 compute-1 nova_compute[238822]: 2025-09-30 18:31:18.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:18 compute-1 ceph-mon[75484]: pgmap v1579: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 180 KiB/s rd, 2.1 MiB/s wr, 58 op/s
Sep 30 18:31:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:18.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:18 compute-1 nova_compute[238822]: 2025-09-30 18:31:18.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:31:19 compute-1 nova_compute[238822]: 2025-09-30 18:31:19.299 2 DEBUG nova.compute.manager [None req-190a7000-d662-4698-913b-4ebd4861aa43 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Sep 30 18:31:19 compute-1 nova_compute[238822]: 2025-09-30 18:31:19.374 2 DEBUG nova.compute.provider_tree [None req-190a7000-d662-4698-913b-4ebd4861aa43 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Updating resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a generation from 28 to 29 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 18:31:19 compute-1 openstack_network_exporter[251957]: ERROR   18:31:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:31:19 compute-1 openstack_network_exporter[251957]: ERROR   18:31:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:31:19 compute-1 openstack_network_exporter[251957]: ERROR   18:31:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:31:19 compute-1 openstack_network_exporter[251957]: ERROR   18:31:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:31:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:31:19 compute-1 openstack_network_exporter[251957]: ERROR   18:31:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:31:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:31:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:19.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:19 compute-1 sshd-session[288352]: Invalid user test from 216.10.242.161 port 43726
Sep 30 18:31:19 compute-1 sshd-session[288352]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:31:19 compute-1 sshd-session[288352]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:31:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:20 compute-1 ceph-mon[75484]: pgmap v1580: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 180 KiB/s rd, 2.1 MiB/s wr, 58 op/s
Sep 30 18:31:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:20.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:21 compute-1 sshd-session[288352]: Failed password for invalid user test from 216.10.242.161 port 43726 ssh2
Sep 30 18:31:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:21.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:21 compute-1 podman[288358]: 2025-09-30 18:31:21.531612054 +0000 UTC m=+0.072709945 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 18:31:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:21 compute-1 ovn_controller[135204]: 2025-09-30T18:31:21Z|00194|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Sep 30 18:31:22 compute-1 sshd-session[288352]: Received disconnect from 216.10.242.161 port 43726:11: Bye Bye [preauth]
Sep 30 18:31:22 compute-1 sshd-session[288352]: Disconnected from invalid user test 216.10.242.161 port 43726 [preauth]
Sep 30 18:31:22 compute-1 ceph-mon[75484]: pgmap v1581: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 79 KiB/s rd, 822 KiB/s wr, 21 op/s
Sep 30 18:31:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:31:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:22.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:23 compute-1 nova_compute[238822]: 2025-09-30 18:31:23.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:23.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:23 compute-1 nova_compute[238822]: 2025-09-30 18:31:23.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:31:24 compute-1 ceph-mon[75484]: pgmap v1582: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:31:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:24.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:25 compute-1 unix_chkpwd[288386]: password check failed for user (root)
Sep 30 18:31:25 compute-1 sshd-session[288380]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:31:25 compute-1 unix_chkpwd[288387]: password check failed for user (root)
Sep 30 18:31:25 compute-1 sshd-session[288381]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105  user=root
Sep 30 18:31:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:25.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:25 compute-1 ceph-mon[75484]: pgmap v1583: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:31:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:26 compute-1 sshd-session[288380]: Failed password for root from 192.210.160.141 port 50842 ssh2
Sep 30 18:31:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:26.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:26 compute-1 sshd-session[288381]: Failed password for root from 103.153.190.105 port 58226 ssh2
Sep 30 18:31:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:27.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:27 compute-1 nova_compute[238822]: 2025-09-30 18:31:27.629 2 DEBUG nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Check if temp file /var/lib/nova/instances/tmpbmq9sq_z exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 18:31:27 compute-1 nova_compute[238822]: 2025-09-30 18:31:27.634 2 DEBUG nova.compute.manager [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpbmq9sq_z',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5321575c-f6c1-4500-adf7-285c22df2e73',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 18:31:27 compute-1 ceph-mon[75484]: pgmap v1584: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:31:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:28 compute-1 sshd-session[288380]: Connection closed by authenticating user root 192.210.160.141 port 50842 [preauth]
Sep 30 18:31:28 compute-1 sshd-session[288381]: Received disconnect from 103.153.190.105 port 58226:11: Bye Bye [preauth]
Sep 30 18:31:28 compute-1 sshd-session[288381]: Disconnected from authenticating user root 103.153.190.105 port 58226 [preauth]
Sep 30 18:31:28 compute-1 nova_compute[238822]: 2025-09-30 18:31:28.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:28 compute-1 podman[288392]: 2025-09-30 18:31:28.57751733 +0000 UTC m=+0.087921798 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 18:31:28 compute-1 podman[288391]: 2025-09-30 18:31:28.58010363 +0000 UTC m=+0.093518739 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid)
Sep 30 18:31:28 compute-1 podman[288393]: 2025-09-30 18:31:28.594494519 +0000 UTC m=+0.096044828 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:31:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:28.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:28 compute-1 nova_compute[238822]: 2025-09-30 18:31:28.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:31:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:29.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:29 compute-1 ceph-mon[75484]: pgmap v1585: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:31:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:30.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:31.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:31 compute-1 ceph-mon[75484]: pgmap v1586: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:31:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:32 compute-1 nova_compute[238822]: 2025-09-30 18:31:32.600 2 DEBUG nova.compute.manager [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Preparing to wait for external event network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:31:32 compute-1 nova_compute[238822]: 2025-09-30 18:31:32.601 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:32 compute-1 nova_compute[238822]: 2025-09-30 18:31:32.601 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:32 compute-1 nova_compute[238822]: 2025-09-30 18:31:32.601 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:31:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:32.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:33 compute-1 nova_compute[238822]: 2025-09-30 18:31:33.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:33.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:33 compute-1 ceph-mon[75484]: pgmap v1587: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:31:33 compute-1 nova_compute[238822]: 2025-09-30 18:31:33.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:31:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:34.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:35.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:35 compute-1 podman[249638]: time="2025-09-30T18:31:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:31:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:31:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:31:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:31:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8820 "" "Go-http-client/1.1"
Sep 30 18:31:35 compute-1 ceph-mon[75484]: pgmap v1588: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:31:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:36 compute-1 sudo[288454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:31:36 compute-1 sudo[288454]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:31:36 compute-1 sudo[288454]: pam_unix(sudo:session): session closed for user root
Sep 30 18:31:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:36.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4234911310' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:31:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4234911310' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:31:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:37.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:37 compute-1 ceph-mon[75484]: pgmap v1589: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:31:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:31:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:38 compute-1 nova_compute[238822]: 2025-09-30 18:31:38.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:31:38 compute-1 nova_compute[238822]: 2025-09-30 18:31:38.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:31:38 compute-1 nova_compute[238822]: 2025-09-30 18:31:38.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:38 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 18:31:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:38.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:38 compute-1 nova_compute[238822]: 2025-09-30 18:31:38.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:38 compute-1 nova_compute[238822]: 2025-09-30 18:31:38.855 2 DEBUG nova.compute.manager [req-e0947143-48db-42fd-8ee6-a1c13c033008 req-ce93e43d-3252-4d86-95f3-3ea408aa30fb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received event network-vif-unplugged-afaa4f9e-eab6-432e-9b39-d80bb074577d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:31:38 compute-1 nova_compute[238822]: 2025-09-30 18:31:38.855 2 DEBUG oslo_concurrency.lockutils [req-e0947143-48db-42fd-8ee6-a1c13c033008 req-ce93e43d-3252-4d86-95f3-3ea408aa30fb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:38 compute-1 nova_compute[238822]: 2025-09-30 18:31:38.855 2 DEBUG oslo_concurrency.lockutils [req-e0947143-48db-42fd-8ee6-a1c13c033008 req-ce93e43d-3252-4d86-95f3-3ea408aa30fb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:38 compute-1 nova_compute[238822]: 2025-09-30 18:31:38.855 2 DEBUG oslo_concurrency.lockutils [req-e0947143-48db-42fd-8ee6-a1c13c033008 req-ce93e43d-3252-4d86-95f3-3ea408aa30fb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:31:38 compute-1 nova_compute[238822]: 2025-09-30 18:31:38.856 2 DEBUG nova.compute.manager [req-e0947143-48db-42fd-8ee6-a1c13c033008 req-ce93e43d-3252-4d86-95f3-3ea408aa30fb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] No event matching network-vif-unplugged-afaa4f9e-eab6-432e-9b39-d80bb074577d in dict_keys([('network-vif-plugged', 'afaa4f9e-eab6-432e-9b39-d80bb074577d')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 18:31:38 compute-1 nova_compute[238822]: 2025-09-30 18:31:38.856 2 DEBUG nova.compute.manager [req-e0947143-48db-42fd-8ee6-a1c13c033008 req-ce93e43d-3252-4d86-95f3-3ea408aa30fb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received event network-vif-unplugged-afaa4f9e-eab6-432e-9b39-d80bb074577d for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:31:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:31:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:39.050 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:31:39 compute-1 nova_compute[238822]: 2025-09-30 18:31:39.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:39.051 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:31:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:39.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:39 compute-1 ceph-mon[75484]: pgmap v1590: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:31:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:40 compute-1 nova_compute[238822]: 2025-09-30 18:31:40.128 2 INFO nova.compute.manager [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Took 7.53 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Sep 30 18:31:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:40.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:40 compute-1 nova_compute[238822]: 2025-09-30 18:31:40.919 2 DEBUG nova.compute.manager [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received event network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:31:40 compute-1 nova_compute[238822]: 2025-09-30 18:31:40.920 2 DEBUG oslo_concurrency.lockutils [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:40 compute-1 nova_compute[238822]: 2025-09-30 18:31:40.920 2 DEBUG oslo_concurrency.lockutils [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:40 compute-1 nova_compute[238822]: 2025-09-30 18:31:40.921 2 DEBUG oslo_concurrency.lockutils [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:31:40 compute-1 nova_compute[238822]: 2025-09-30 18:31:40.921 2 DEBUG nova.compute.manager [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Processing event network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:31:40 compute-1 nova_compute[238822]: 2025-09-30 18:31:40.921 2 DEBUG nova.compute.manager [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received event network-changed-afaa4f9e-eab6-432e-9b39-d80bb074577d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:31:40 compute-1 nova_compute[238822]: 2025-09-30 18:31:40.922 2 DEBUG nova.compute.manager [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Refreshing instance network info cache due to event network-changed-afaa4f9e-eab6-432e-9b39-d80bb074577d. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:31:40 compute-1 nova_compute[238822]: 2025-09-30 18:31:40.922 2 DEBUG oslo_concurrency.lockutils [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-5321575c-f6c1-4500-adf7-285c22df2e73" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:31:40 compute-1 nova_compute[238822]: 2025-09-30 18:31:40.922 2 DEBUG oslo_concurrency.lockutils [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-5321575c-f6c1-4500-adf7-285c22df2e73" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:31:40 compute-1 nova_compute[238822]: 2025-09-30 18:31:40.923 2 DEBUG nova.network.neutron [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Refreshing network info cache for port afaa4f9e-eab6-432e-9b39-d80bb074577d _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:31:40 compute-1 nova_compute[238822]: 2025-09-30 18:31:40.924 2 DEBUG nova.compute.manager [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:31:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.435 2 WARNING neutronclient.v2_0.client [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.441 2 DEBUG nova.compute.manager [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpbmq9sq_z',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5321575c-f6c1-4500-adf7-285c22df2e73',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(1f9d60b1-2650-404b-96aa-1154ab475694),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.448 2 DEBUG nova.objects.instance [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lazy-loading 'migration_context' on Instance uuid 5321575c-f6c1-4500-adf7-285c22df2e73 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.449 2 DEBUG nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.453 2 DEBUG nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.453 2 DEBUG nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 18:31:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:41.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:41 compute-1 ceph-mon[75484]: pgmap v1591: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 3.3 KiB/s wr, 1 op/s
Sep 30 18:31:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.956 2 DEBUG nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.956 2 DEBUG nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.965 2 DEBUG nova.virt.libvirt.vif [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:30:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-876288154',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-876288154',id=23,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:30:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-ce22n40f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:30:54Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=5321575c-f6c1-4500-adf7-285c22df2e73,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.966 2 DEBUG nova.network.os_vif_util [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.967 2 DEBUG nova.network.os_vif_util [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:37:f0,bridge_name='br-int',has_traffic_filtering=True,id=afaa4f9e-eab6-432e-9b39-d80bb074577d,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafaa4f9e-ea') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.968 2 DEBUG nova.virt.libvirt.migration [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <mac address="fa:16:3e:be:37:f0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <model type="virtio"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <mtu size="1442"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <target dev="tapafaa4f9e-ea"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]: </interface>
Sep 30 18:31:41 compute-1 nova_compute[238822]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.969 2 DEBUG nova.virt.libvirt.migration [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <name>instance-00000017</name>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <uuid>5321575c-f6c1-4500-adf7-285c22df2e73</uuid>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteStrategies-server-876288154</nova:name>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:30:47</nova:creationTime>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:31:41 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:31:41 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:user uuid="623ef4a55c9e4fc28bb65e49246b5008">tempest-TestExecuteStrategies-1883747907-project-admin</nova:user>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:project uuid="c634e1c17ed54907969576a0eb8eff50">tempest-TestExecuteStrategies-1883747907</nova:project>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:port uuid="afaa4f9e-eab6-432e-9b39-d80bb074577d">
Sep 30 18:31:41 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <memory unit="KiB">131072</memory>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <vcpu placement="static">1</vcpu>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <resource>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <partition>/machine</partition>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </resource>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <system>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="serial">5321575c-f6c1-4500-adf7-285c22df2e73</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="uuid">5321575c-f6c1-4500-adf7-285c22df2e73</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </system>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <os>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </os>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <features>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <vmcoreinfo state="on"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </features>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <cpu mode="host-model" check="partial">
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <on_poweroff>destroy</on_poweroff>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <on_reboot>restart</on_reboot>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <on_crash>destroy</on_crash>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/5321575c-f6c1-4500-adf7-285c22df2e73_disk">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </source>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/5321575c-f6c1-4500-adf7-285c22df2e73_disk.config">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </source>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <readonly/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="1" port="0x10"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="2" port="0x11"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="3" port="0x12"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="4" port="0x13"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="5" port="0x14"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="6" port="0x15"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="7" port="0x16"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="8" port="0x17"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="9" port="0x18"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="10" port="0x19"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="11" port="0x1a"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="12" port="0x1b"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="13" port="0x1c"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="14" port="0x1d"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="15" port="0x1e"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="16" port="0x1f"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="17" port="0x20"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="18" port="0x21"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="19" port="0x22"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="20" port="0x23"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="21" port="0x24"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="22" port="0x25"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="23" port="0x26"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="24" port="0x27"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="25" port="0x28"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-pci-bridge"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="sata" index="0">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <interface type="ethernet"><mac address="fa:16:3e:be:37:f0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapafaa4f9e-ea"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </interface><serial type="pty">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/console.log" append="off"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target type="isa-serial" port="0">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <model name="isa-serial"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </target>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <console type="pty">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/console.log" append="off"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target type="serial" port="0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </console>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="usb" bus="0" port="1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </input>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <input type="mouse" bus="ps2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <listen type="address" address="::"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </graphics>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <video>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </video>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]: </domain>
Sep 30 18:31:41 compute-1 nova_compute[238822]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.971 2 DEBUG nova.virt.libvirt.migration [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <name>instance-00000017</name>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <uuid>5321575c-f6c1-4500-adf7-285c22df2e73</uuid>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteStrategies-server-876288154</nova:name>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:30:47</nova:creationTime>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:31:41 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:31:41 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:user uuid="623ef4a55c9e4fc28bb65e49246b5008">tempest-TestExecuteStrategies-1883747907-project-admin</nova:user>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:project uuid="c634e1c17ed54907969576a0eb8eff50">tempest-TestExecuteStrategies-1883747907</nova:project>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:port uuid="afaa4f9e-eab6-432e-9b39-d80bb074577d">
Sep 30 18:31:41 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <memory unit="KiB">131072</memory>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <vcpu placement="static">1</vcpu>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <resource>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <partition>/machine</partition>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </resource>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <system>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="serial">5321575c-f6c1-4500-adf7-285c22df2e73</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="uuid">5321575c-f6c1-4500-adf7-285c22df2e73</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </system>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <os>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </os>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <features>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <vmcoreinfo state="on"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </features>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <cpu mode="host-model" check="partial">
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <on_poweroff>destroy</on_poweroff>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <on_reboot>restart</on_reboot>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <on_crash>destroy</on_crash>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/5321575c-f6c1-4500-adf7-285c22df2e73_disk">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </source>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/5321575c-f6c1-4500-adf7-285c22df2e73_disk.config">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </source>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <readonly/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="1" port="0x10"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="2" port="0x11"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="3" port="0x12"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="4" port="0x13"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="5" port="0x14"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="6" port="0x15"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="7" port="0x16"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="8" port="0x17"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="9" port="0x18"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="10" port="0x19"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="11" port="0x1a"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="12" port="0x1b"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="13" port="0x1c"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="14" port="0x1d"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="15" port="0x1e"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="16" port="0x1f"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="17" port="0x20"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="18" port="0x21"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="19" port="0x22"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="20" port="0x23"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="21" port="0x24"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="22" port="0x25"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="23" port="0x26"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="24" port="0x27"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="25" port="0x28"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-pci-bridge"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="sata" index="0">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:be:37:f0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target dev="tapafaa4f9e-ea"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/console.log" append="off"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target type="isa-serial" port="0">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <model name="isa-serial"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </target>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <console type="pty">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/console.log" append="off"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target type="serial" port="0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </console>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="usb" bus="0" port="1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </input>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <input type="mouse" bus="ps2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <listen type="address" address="::"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </graphics>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <video>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </video>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]: </domain>
Sep 30 18:31:41 compute-1 nova_compute[238822]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.972 2 DEBUG nova.virt.libvirt.migration [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <name>instance-00000017</name>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <uuid>5321575c-f6c1-4500-adf7-285c22df2e73</uuid>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteStrategies-server-876288154</nova:name>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:30:47</nova:creationTime>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:31:41 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:31:41 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:user uuid="623ef4a55c9e4fc28bb65e49246b5008">tempest-TestExecuteStrategies-1883747907-project-admin</nova:user>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:project uuid="c634e1c17ed54907969576a0eb8eff50">tempest-TestExecuteStrategies-1883747907</nova:project>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <nova:port uuid="afaa4f9e-eab6-432e-9b39-d80bb074577d">
Sep 30 18:31:41 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <memory unit="KiB">131072</memory>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <vcpu placement="static">1</vcpu>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <resource>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <partition>/machine</partition>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </resource>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <system>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="serial">5321575c-f6c1-4500-adf7-285c22df2e73</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="uuid">5321575c-f6c1-4500-adf7-285c22df2e73</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </system>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <os>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </os>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <features>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <vmcoreinfo state="on"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </features>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <cpu mode="host-model" check="partial">
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <on_poweroff>destroy</on_poweroff>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <on_reboot>restart</on_reboot>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <on_crash>destroy</on_crash>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/5321575c-f6c1-4500-adf7-285c22df2e73_disk">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </source>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/5321575c-f6c1-4500-adf7-285c22df2e73_disk.config">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </source>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <readonly/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="1" port="0x10"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="2" port="0x11"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="3" port="0x12"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="4" port="0x13"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="5" port="0x14"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="6" port="0x15"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="7" port="0x16"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="8" port="0x17"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="9" port="0x18"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="10" port="0x19"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="11" port="0x1a"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="12" port="0x1b"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="13" port="0x1c"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="14" port="0x1d"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="15" port="0x1e"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="16" port="0x1f"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="17" port="0x20"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="18" port="0x21"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="19" port="0x22"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="20" port="0x23"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="21" port="0x24"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="22" port="0x25"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="23" port="0x26"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="24" port="0x27"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-root-port"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target chassis="25" port="0x28"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model name="pcie-pci-bridge"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <controller type="sata" index="0">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </controller>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:be:37:f0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target dev="tapafaa4f9e-ea"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/console.log" append="off"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target type="isa-serial" port="0">
Sep 30 18:31:41 compute-1 nova_compute[238822]:         <model name="isa-serial"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       </target>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <console type="pty">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73/console.log" append="off"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <target type="serial" port="0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </console>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="usb" bus="0" port="1"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </input>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <input type="mouse" bus="ps2"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <listen type="address" address="::"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </graphics>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <video>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </video>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:31:41 compute-1 nova_compute[238822]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:31:41 compute-1 nova_compute[238822]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 18:31:41 compute-1 nova_compute[238822]: </domain>
Sep 30 18:31:41 compute-1 nova_compute[238822]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 18:31:41 compute-1 nova_compute[238822]: 2025-09-30 18:31:41.973 2 DEBUG nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 18:31:42 compute-1 nova_compute[238822]: 2025-09-30 18:31:42.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:31:42 compute-1 nova_compute[238822]: 2025-09-30 18:31:42.460 2 DEBUG nova.virt.libvirt.migration [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Current None elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 18:31:42 compute-1 nova_compute[238822]: 2025-09-30 18:31:42.461 2 INFO nova.virt.libvirt.migration [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 18:31:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:42.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:43 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:43.053 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:31:43 compute-1 nova_compute[238822]: 2025-09-30 18:31:43.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:31:43 compute-1 nova_compute[238822]: 2025-09-30 18:31:43.476 2 WARNING neutronclient.v2_0.client [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:31:43 compute-1 nova_compute[238822]: 2025-09-30 18:31:43.484 2 INFO nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 18:31:43 compute-1 nova_compute[238822]: 2025-09-30 18:31:43.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:43.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:43 compute-1 nova_compute[238822]: 2025-09-30 18:31:43.569 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:43 compute-1 nova_compute[238822]: 2025-09-30 18:31:43.569 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:43 compute-1 nova_compute[238822]: 2025-09-30 18:31:43.570 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:31:43 compute-1 nova_compute[238822]: 2025-09-30 18:31:43.570 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:31:43 compute-1 nova_compute[238822]: 2025-09-30 18:31:43.571 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:31:43 compute-1 ceph-mon[75484]: pgmap v1592: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:31:43 compute-1 nova_compute[238822]: 2025-09-30 18:31:43.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.011 2 DEBUG nova.virt.libvirt.migration [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Current 50 elapsed 2 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.013 2 DEBUG nova.virt.libvirt.migration [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 18:31:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:31:44 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3294471334' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.080 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.488 2 DEBUG nova.network.neutron [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Updated VIF entry in instance network info cache for port afaa4f9e-eab6-432e-9b39-d80bb074577d. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.489 2 DEBUG nova.network.neutron [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Updating instance_info_cache with network_info: [{"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.518 2 DEBUG nova.virt.libvirt.migration [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Current 50 elapsed 3 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.518 2 DEBUG nova.virt.libvirt.migration [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 18:31:44 compute-1 kernel: tapafaa4f9e-ea (unregistering): left promiscuous mode
Sep 30 18:31:44 compute-1 NetworkManager[45549]: <info>  [1759257104.5769] device (tapafaa4f9e-ea): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:31:44 compute-1 ovn_controller[135204]: 2025-09-30T18:31:44Z|00195|binding|INFO|Releasing lport afaa4f9e-eab6-432e-9b39-d80bb074577d from this chassis (sb_readonly=0)
Sep 30 18:31:44 compute-1 ovn_controller[135204]: 2025-09-30T18:31:44Z|00196|binding|INFO|Setting lport afaa4f9e-eab6-432e-9b39-d80bb074577d down in Southbound
Sep 30 18:31:44 compute-1 ovn_controller[135204]: 2025-09-30T18:31:44Z|00197|binding|INFO|Removing iface tapafaa4f9e-ea ovn-installed in OVS
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:44.604 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:37:f0 10.100.0.10'], port_security=['fa:16:3e:be:37:f0 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0398922-aff5-46ba-afa7-58d09e28293c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5321575c-f6c1-4500-adf7-285c22df2e73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '10', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=afaa4f9e-eab6-432e-9b39-d80bb074577d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:31:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:44.606 144543 INFO neutron.agent.ovn.metadata.agent [-] Port afaa4f9e-eab6-432e-9b39-d80bb074577d in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:31:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:44.608 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6901f664-336b-42d2-bbf7-58951befc8d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:31:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:44.610 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3965781c-0634-4951-b1d2-fb887ba2951f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:31:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:44.610 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 namespace which is not needed anymore
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:44 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Deactivated successfully.
Sep 30 18:31:44 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Consumed 15.883s CPU time.
Sep 30 18:31:44 compute-1 systemd-machined[195911]: Machine qemu-17-instance-00000017 terminated.
Sep 30 18:31:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:44.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:44 compute-1 virtqemud[239124]: Unable to get XATTR trusted.libvirt.security.ref_selinux on 5321575c-f6c1-4500-adf7-285c22df2e73_disk: No such file or directory
Sep 30 18:31:44 compute-1 virtqemud[239124]: Unable to get XATTR trusted.libvirt.security.ref_dac on 5321575c-f6c1-4500-adf7-285c22df2e73_disk: No such file or directory
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.764 2 DEBUG nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.765 2 DEBUG nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.765 2 DEBUG nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 18:31:44 compute-1 podman[288541]: 2025-09-30 18:31:44.794721693 +0000 UTC m=+0.059945522 container kill 5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 18:31:44 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[288135]: [NOTICE]   (288140) : haproxy version is 3.0.5-8e879a5
Sep 30 18:31:44 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[288135]: [NOTICE]   (288140) : path to executable is /usr/sbin/haproxy
Sep 30 18:31:44 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[288135]: [WARNING]  (288140) : Exiting Master process...
Sep 30 18:31:44 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[288135]: [ALERT]    (288140) : Current worker (288143) exited with code 143 (Terminated)
Sep 30 18:31:44 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[288135]: [WARNING]  (288140) : All workers exited. Exiting... (0)
Sep 30 18:31:44 compute-1 systemd[1]: libpod-5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8.scope: Deactivated successfully.
Sep 30 18:31:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3294471334' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:31:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3761297730' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:31:44 compute-1 podman[288565]: 2025-09-30 18:31:44.867222833 +0000 UTC m=+0.039552561 container died 5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 18:31:44 compute-1 systemd[1]: var-lib-containers-storage-overlay-1df4b84602de9f8f28cb63b0d93cb8a6b0aef74ef35621ca0cff5e872368bf16-merged.mount: Deactivated successfully.
Sep 30 18:31:44 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8-userdata-shm.mount: Deactivated successfully.
Sep 30 18:31:44 compute-1 podman[288565]: 2025-09-30 18:31:44.922824505 +0000 UTC m=+0.095154233 container cleanup 5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 18:31:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:44 compute-1 systemd[1]: libpod-conmon-5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8.scope: Deactivated successfully.
Sep 30 18:31:44 compute-1 podman[288568]: 2025-09-30 18:31:44.948037907 +0000 UTC m=+0.107800895 container remove 5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 18:31:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:44.964 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8e9fe977-e5ff-4485-8ad4-60d652175e9a]: (4, ("Tue Sep 30 06:31:44 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 (5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8)\n5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8\nTue Sep 30 06:31:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 (5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8)\n5b7e9ae134083344ba14fa3229a18fec0c72dc63be25e418f6f6686a745468e8\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:31:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:44.966 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ba178493-f393-4840-abfb-57544dddec7b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:31:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:44.966 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:31:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:44.967 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[eb740119-88eb-4406-abe2-c1e493787330]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:31:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:44.967 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:44 compute-1 kernel: tap6901f664-30: left promiscuous mode
Sep 30 18:31:44 compute-1 nova_compute[238822]: 2025-09-30 18:31:44.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:44.987 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e18572fb-b2cd-4792-afbf-1dc6c358a07b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.004 2 DEBUG oslo_concurrency.lockutils [req-458ceb05-27da-4bac-9399-0c66f21445ac req-007978c3-ad0c-4366-8bc1-b1ead1e76786 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-5321575c-f6c1-4500-adf7-285c22df2e73" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:31:45 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:45.017 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[572e4284-c0be-4207-95e3-56ade602bd6c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:31:45 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:45.017 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8f4a0e1a-3f05-4246-8fa5-619fc160e957]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.021 2 DEBUG nova.virt.libvirt.guest [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '5321575c-f6c1-4500-adf7-285c22df2e73' (instance-00000017) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.022 2 INFO nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Migration operation has completed
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.022 2 INFO nova.compute.manager [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] _post_live_migration() is started..
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.040 2 WARNING neutronclient.v2_0.client [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:31:45 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:45.040 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[35f82e3a-7b67-45f5-bd78-01b2dbfc161c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1489362, 'reachable_time': 39661, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288600, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.041 2 WARNING neutronclient.v2_0.client [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:31:45 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:45.043 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:31:45 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:45.043 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[fae4b902-89c5-4506-96f7-0e79717ea069]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:31:45 compute-1 systemd[1]: run-netns-ovnmeta\x2d6901f664\x2d336b\x2d42d2\x2dbbf7\x2d58951befc8d1.mount: Deactivated successfully.
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.142 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Error from libvirt while getting description of instance-00000017: [Error Code 42] Domain not found: no domain with matching uuid '5321575c-f6c1-4500-adf7-285c22df2e73' (instance-00000017): libvirt.libvirtError: Domain not found: no domain with matching uuid '5321575c-f6c1-4500-adf7-285c22df2e73' (instance-00000017)
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.362 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.366 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.400 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.401 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4543MB free_disk=39.901119232177734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.401 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.402 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:45.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:45 compute-1 podman[288603]: 2025-09-30 18:31:45.558782596 +0000 UTC m=+0.090780325 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:31:45 compute-1 podman[288602]: 2025-09-30 18:31:45.603018762 +0000 UTC m=+0.137106057 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.621 2 DEBUG nova.compute.manager [req-3d879857-01bb-4c96-bce2-b7593ef6cf7c req-6a69c312-57b9-4d52-8e80-4227fcac0a71 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received event network-vif-unplugged-afaa4f9e-eab6-432e-9b39-d80bb074577d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.622 2 DEBUG oslo_concurrency.lockutils [req-3d879857-01bb-4c96-bce2-b7593ef6cf7c req-6a69c312-57b9-4d52-8e80-4227fcac0a71 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.622 2 DEBUG oslo_concurrency.lockutils [req-3d879857-01bb-4c96-bce2-b7593ef6cf7c req-6a69c312-57b9-4d52-8e80-4227fcac0a71 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.622 2 DEBUG oslo_concurrency.lockutils [req-3d879857-01bb-4c96-bce2-b7593ef6cf7c req-6a69c312-57b9-4d52-8e80-4227fcac0a71 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.623 2 DEBUG nova.compute.manager [req-3d879857-01bb-4c96-bce2-b7593ef6cf7c req-6a69c312-57b9-4d52-8e80-4227fcac0a71 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] No waiting events found dispatching network-vif-unplugged-afaa4f9e-eab6-432e-9b39-d80bb074577d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:31:45 compute-1 nova_compute[238822]: 2025-09-30 18:31:45.623 2 DEBUG nova.compute.manager [req-3d879857-01bb-4c96-bce2-b7593ef6cf7c req-6a69c312-57b9-4d52-8e80-4227fcac0a71 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received event network-vif-unplugged-afaa4f9e-eab6-432e-9b39-d80bb074577d for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:31:45 compute-1 ceph-mon[75484]: pgmap v1593: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 5.0 KiB/s rd, 10 KiB/s wr, 7 op/s
Sep 30 18:31:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.427 2 INFO nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Updating resource usage from migration 1f9d60b1-2650-404b-96aa-1154ab475694
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.456 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Migration 1f9d60b1-2650-404b-96aa-1154ab475694 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.456 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.457 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:31:45 up  4:09,  0 user,  load average: 0.23, 0.37, 0.64\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_c634e1c17ed54907969576a0eb8eff50': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.488 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:31:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:46.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.746 2 DEBUG nova.network.neutron [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Activated binding for port afaa4f9e-eab6-432e-9b39-d80bb074577d and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.747 2 DEBUG nova.compute.manager [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.750 2 DEBUG nova.virt.libvirt.vif [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:30:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-876288154',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-876288154',id=23,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:30:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-ce22n40f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:31:22Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=5321575c-f6c1-4500-adf7-285c22df2e73,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.750 2 DEBUG nova.network.os_vif_util [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "address": "fa:16:3e:be:37:f0", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafaa4f9e-ea", "ovs_interfaceid": "afaa4f9e-eab6-432e-9b39-d80bb074577d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.752 2 DEBUG nova.network.os_vif_util [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:37:f0,bridge_name='br-int',has_traffic_filtering=True,id=afaa4f9e-eab6-432e-9b39-d80bb074577d,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafaa4f9e-ea') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.752 2 DEBUG os_vif [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:37:f0,bridge_name='br-int',has_traffic_filtering=True,id=afaa4f9e-eab6-432e-9b39-d80bb074577d,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafaa4f9e-ea') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.756 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafaa4f9e-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.762 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b27914bd-a892-4e9a-b9b1-46b8d786241c) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.768 2 INFO os_vif [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:37:f0,bridge_name='br-int',has_traffic_filtering=True,id=afaa4f9e-eab6-432e-9b39-d80bb074577d,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafaa4f9e-ea')
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.769 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:46 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:31:46 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/749390180' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.977 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:31:46 compute-1 nova_compute[238822]: 2025-09-30 18:31:46.984 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.493 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:31:47 compute-1 sudo[288677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:31:47 compute-1 sudo[288677]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:31:47 compute-1 sudo[288677]: pam_unix(sudo:session): session closed for user root
Sep 30 18:31:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:47.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:47 compute-1 sudo[288702]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 18:31:47 compute-1 sudo[288702]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.868 2 DEBUG nova.compute.manager [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received event network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.868 2 DEBUG oslo_concurrency.lockutils [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.869 2 DEBUG oslo_concurrency.lockutils [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.869 2 DEBUG oslo_concurrency.lockutils [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.869 2 DEBUG nova.compute.manager [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] No waiting events found dispatching network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.869 2 WARNING nova.compute.manager [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received unexpected event network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d for instance with vm_state active and task_state migrating.
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.869 2 DEBUG nova.compute.manager [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received event network-vif-unplugged-afaa4f9e-eab6-432e-9b39-d80bb074577d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.869 2 DEBUG oslo_concurrency.lockutils [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.870 2 DEBUG oslo_concurrency.lockutils [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.870 2 DEBUG oslo_concurrency.lockutils [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.870 2 DEBUG nova.compute.manager [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] No waiting events found dispatching network-vif-unplugged-afaa4f9e-eab6-432e-9b39-d80bb074577d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.870 2 DEBUG nova.compute.manager [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received event network-vif-unplugged-afaa4f9e-eab6-432e-9b39-d80bb074577d for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.870 2 DEBUG nova.compute.manager [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received event network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.871 2 DEBUG oslo_concurrency.lockutils [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.871 2 DEBUG oslo_concurrency.lockutils [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.871 2 DEBUG oslo_concurrency.lockutils [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.871 2 DEBUG nova.compute.manager [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] No waiting events found dispatching network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:31:47 compute-1 nova_compute[238822]: 2025-09-30 18:31:47.871 2 WARNING nova.compute.manager [req-0c365939-fe03-45cd-bd2e-1ae10351e3b4 req-33ad0369-230c-4540-a0ce-d8ce00884a90 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Received unexpected event network-vif-plugged-afaa4f9e-eab6-432e-9b39-d80bb074577d for instance with vm_state active and task_state migrating.
Sep 30 18:31:47 compute-1 ceph-mon[75484]: pgmap v1594: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 10 KiB/s wr, 7 op/s
Sep 30 18:31:47 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/749390180' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:31:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:48 compute-1 nova_compute[238822]: 2025-09-30 18:31:48.005 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:31:48 compute-1 nova_compute[238822]: 2025-09-30 18:31:48.005 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.603s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:31:48 compute-1 nova_compute[238822]: 2025-09-30 18:31:48.006 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 1.237s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:48 compute-1 nova_compute[238822]: 2025-09-30 18:31:48.006 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:31:48 compute-1 nova_compute[238822]: 2025-09-30 18:31:48.006 2 DEBUG nova.compute.manager [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 18:31:48 compute-1 nova_compute[238822]: 2025-09-30 18:31:48.007 2 INFO nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Deleting instance files /var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73_del
Sep 30 18:31:48 compute-1 nova_compute[238822]: 2025-09-30 18:31:48.007 2 INFO nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Deletion of /var/lib/nova/instances/5321575c-f6c1-4500-adf7-285c22df2e73_del complete
Sep 30 18:31:48 compute-1 podman[288798]: 2025-09-30 18:31:48.325962686 +0000 UTC m=+0.090370834 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Sep 30 18:31:48 compute-1 podman[288798]: 2025-09-30 18:31:48.4615254 +0000 UTC m=+0.225933528 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 18:31:48 compute-1 nova_compute[238822]: 2025-09-30 18:31:48.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:48.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:48 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Sep 30 18:31:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3912130262' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:31:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:31:49 compute-1 podman[288918]: 2025-09-30 18:31:49.165842058 +0000 UTC m=+0.087487576 container exec 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 18:31:49 compute-1 podman[288918]: 2025-09-30 18:31:49.174962595 +0000 UTC m=+0.096608113 container exec_died 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 18:31:49 compute-1 openstack_network_exporter[251957]: ERROR   18:31:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:31:49 compute-1 openstack_network_exporter[251957]: ERROR   18:31:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:31:49 compute-1 openstack_network_exporter[251957]: ERROR   18:31:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:31:49 compute-1 openstack_network_exporter[251957]: ERROR   18:31:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:31:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:31:49 compute-1 openstack_network_exporter[251957]: ERROR   18:31:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:31:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:31:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:49.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:49 compute-1 ceph-mon[75484]: pgmap v1595: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 10 KiB/s wr, 7 op/s
Sep 30 18:31:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:49 compute-1 podman[289057]: 2025-09-30 18:31:49.952742089 +0000 UTC m=+0.083030735 container exec 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 18:31:49 compute-1 podman[289057]: 2025-09-30 18:31:49.963959762 +0000 UTC m=+0.094248438 container exec_died 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 18:31:50 compute-1 nova_compute[238822]: 2025-09-30 18:31:50.009 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:31:50 compute-1 nova_compute[238822]: 2025-09-30 18:31:50.010 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:31:50 compute-1 nova_compute[238822]: 2025-09-30 18:31:50.010 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:31:50 compute-1 nova_compute[238822]: 2025-09-30 18:31:50.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:31:50 compute-1 podman[289125]: 2025-09-30 18:31:50.309032809 +0000 UTC m=+0.109437099 container exec 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, description=keepalived for Ceph, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Sep 30 18:31:50 compute-1 podman[289125]: 2025-09-30 18:31:50.325183256 +0000 UTC m=+0.125587486 container exec_died 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, architecture=x86_64, release=1793, version=2.2.4, com.redhat.component=keepalived-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, name=keepalived, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, description=keepalived for Ceph)
Sep 30 18:31:50 compute-1 sudo[288702]: pam_unix(sudo:session): session closed for user root
Sep 30 18:31:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:50.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:50 compute-1 sudo[289196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:31:50 compute-1 sudo[289196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:31:50 compute-1 sudo[289196]: pam_unix(sudo:session): session closed for user root
Sep 30 18:31:50 compute-1 sudo[289221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:31:50 compute-1 sudo[289221]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:31:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:51 compute-1 nova_compute[238822]: 2025-09-30 18:31:51.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:31:51 compute-1 unix_chkpwd[289279]: password check failed for user (root)
Sep 30 18:31:51 compute-1 sshd-session[289059]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:31:51 compute-1 sudo[289221]: pam_unix(sudo:session): session closed for user root
Sep 30 18:31:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:51.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:51 compute-1 ceph-mon[75484]: pgmap v1596: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 9.2 KiB/s wr, 7 op/s
Sep 30 18:31:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:31:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:31:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Sep 30 18:31:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:31:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:31:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:31:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:31:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:31:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:31:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:31:51 compute-1 nova_compute[238822]: 2025-09-30 18:31:51.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:52 compute-1 podman[289281]: 2025-09-30 18:31:52.564774944 +0000 UTC m=+0.109019938 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:31:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:52.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:52 compute-1 ceph-mon[75484]: pgmap v1597: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 5.4 KiB/s rd, 8.9 KiB/s wr, 7 op/s
Sep 30 18:31:52 compute-1 ceph-mon[75484]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Sep 30 18:31:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:31:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:31:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:53 compute-1 nova_compute[238822]: 2025-09-30 18:31:53.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:53.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:53 compute-1 sshd-session[289059]: Failed password for root from 192.210.160.141 port 52978 ssh2
Sep 30 18:31:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:31:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:54.394 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:54.395 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:31:54.395 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:31:54 compute-1 sshd-session[289059]: Connection closed by authenticating user root 192.210.160.141 port 52978 [preauth]
Sep 30 18:31:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:54.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:54 compute-1 ceph-mon[75484]: pgmap v1598: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 5.4 KiB/s rd, 8.9 KiB/s wr, 7 op/s
Sep 30 18:31:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:55.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:56 compute-1 sudo[289307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:31:56 compute-1 sudo[289307]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:31:56 compute-1 sudo[289307]: pam_unix(sudo:session): session closed for user root
Sep 30 18:31:56 compute-1 sudo[289332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:31:56 compute-1 sudo[289332]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:31:56 compute-1 sudo[289332]: pam_unix(sudo:session): session closed for user root
Sep 30 18:31:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:31:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:56.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:31:56 compute-1 nova_compute[238822]: 2025-09-30 18:31:56.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:57 compute-1 nova_compute[238822]: 2025-09-30 18:31:57.040 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:57 compute-1 nova_compute[238822]: 2025-09-30 18:31:57.041 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:57 compute-1 nova_compute[238822]: 2025-09-30 18:31:57.041 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "5321575c-f6c1-4500-adf7-285c22df2e73-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:31:57 compute-1 ceph-mon[75484]: pgmap v1599: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 747 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:31:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:31:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:31:57 compute-1 nova_compute[238822]: 2025-09-30 18:31:57.556 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:57 compute-1 nova_compute[238822]: 2025-09-30 18:31:57.557 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:57 compute-1 nova_compute[238822]: 2025-09-30 18:31:57.557 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:31:57 compute-1 nova_compute[238822]: 2025-09-30 18:31:57.558 2 DEBUG nova.compute.resource_tracker [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:31:57 compute-1 nova_compute[238822]: 2025-09-30 18:31:57.558 2 DEBUG oslo_concurrency.processutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:31:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:57.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:31:58 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1355569676' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:31:58 compute-1 nova_compute[238822]: 2025-09-30 18:31:58.042 2 DEBUG oslo_concurrency.processutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:31:58 compute-1 nova_compute[238822]: 2025-09-30 18:31:58.293 2 WARNING nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:31:58 compute-1 nova_compute[238822]: 2025-09-30 18:31:58.295 2 DEBUG oslo_concurrency.processutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:31:58 compute-1 nova_compute[238822]: 2025-09-30 18:31:58.328 2 DEBUG oslo_concurrency.processutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:31:58 compute-1 nova_compute[238822]: 2025-09-30 18:31:58.329 2 DEBUG nova.compute.resource_tracker [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4690MB free_disk=39.901119232177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", 
"product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:31:58 compute-1 nova_compute[238822]: 2025-09-30 18:31:58.330 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:31:58 compute-1 nova_compute[238822]: 2025-09-30 18:31:58.330 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:31:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1007055275' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:31:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1007055275' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:31:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1355569676' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:31:58 compute-1 nova_compute[238822]: 2025-09-30 18:31:58.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:31:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:31:58.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:31:59 compute-1 nova_compute[238822]: 2025-09-30 18:31:59.354 2 DEBUG nova.compute.resource_tracker [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Migration for instance 5321575c-f6c1-4500-adf7-285c22df2e73 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 18:31:59 compute-1 ceph-mon[75484]: pgmap v1600: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 747 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:31:59 compute-1 podman[289384]: 2025-09-30 18:31:59.542729383 +0000 UTC m=+0.080358833 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:31:59 compute-1 podman[289386]: 2025-09-30 18:31:59.54743754 +0000 UTC m=+0.073026605 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Sep 30 18:31:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:31:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:31:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:31:59.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:31:59 compute-1 podman[289385]: 2025-09-30 18:31:59.569465886 +0000 UTC m=+0.095317598 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-08-20T13:12:41)
Sep 30 18:31:59 compute-1 nova_compute[238822]: 2025-09-30 18:31:59.866 2 DEBUG nova.compute.resource_tracker [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 18:31:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:31:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:31:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:31:59 compute-1 nova_compute[238822]: 2025-09-30 18:31:59.975 2 DEBUG nova.compute.resource_tracker [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Migration 1f9d60b1-2650-404b-96aa-1154ab475694 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 18:31:59 compute-1 nova_compute[238822]: 2025-09-30 18:31:59.976 2 DEBUG nova.compute.resource_tracker [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:31:59 compute-1 nova_compute[238822]: 2025-09-30 18:31:59.976 2 DEBUG nova.compute.resource_tracker [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:31:58 up  4:09,  0 user,  load average: 0.25, 0.37, 0.63\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:32:00 compute-1 nova_compute[238822]: 2025-09-30 18:32:00.028 2 DEBUG oslo_concurrency.processutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:32:00 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:32:00 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1818660566' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:32:00 compute-1 nova_compute[238822]: 2025-09-30 18:32:00.485 2 DEBUG oslo_concurrency.processutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:32:00 compute-1 nova_compute[238822]: 2025-09-30 18:32:00.493 2 DEBUG nova.compute.provider_tree [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:32:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:00.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:01 compute-1 nova_compute[238822]: 2025-09-30 18:32:01.003 2 DEBUG nova.scheduler.client.report [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:32:01 compute-1 ceph-mon[75484]: pgmap v1601: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 841 B/s rd, 0 op/s
Sep 30 18:32:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1818660566' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:32:01 compute-1 nova_compute[238822]: 2025-09-30 18:32:01.516 2 DEBUG nova.compute.resource_tracker [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:32:01 compute-1 nova_compute[238822]: 2025-09-30 18:32:01.518 2 DEBUG oslo_concurrency.lockutils [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.188s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:32:01 compute-1 nova_compute[238822]: 2025-09-30 18:32:01.542 2 INFO nova.compute.manager [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Sep 30 18:32:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:01.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:01 compute-1 nova_compute[238822]: 2025-09-30 18:32:01.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:02 compute-1 nova_compute[238822]: 2025-09-30 18:32:02.622 2 INFO nova.scheduler.client.report [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Deleted allocation for migration 1f9d60b1-2650-404b-96aa-1154ab475694
Sep 30 18:32:02 compute-1 nova_compute[238822]: 2025-09-30 18:32:02.623 2 DEBUG nova.virt.libvirt.driver [None req-9d63bf12-12ae-476e-ae19-3562886a5a07 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 5321575c-f6c1-4500-adf7-285c22df2e73] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 18:32:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:02.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:03 compute-1 ceph-mon[75484]: pgmap v1602: 353 pgs: 353 active+clean; 200 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 560 B/s rd, 0 op/s
Sep 30 18:32:03 compute-1 nova_compute[238822]: 2025-09-30 18:32:03.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:03.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:32:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:04.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:05 compute-1 ceph-mon[75484]: pgmap v1603: 353 pgs: 353 active+clean; 163 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 13 KiB/s rd, 1.2 KiB/s wr, 18 op/s
Sep 30 18:32:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:05.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:05 compute-1 podman[249638]: time="2025-09-30T18:32:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:32:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:32:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:32:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:32:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8352 "" "Go-http-client/1.1"
Sep 30 18:32:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:06.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:06 compute-1 nova_compute[238822]: 2025-09-30 18:32:06.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:07 compute-1 ceph-mon[75484]: pgmap v1604: 353 pgs: 353 active+clean; 121 MiB data, 352 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:32:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:32:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:32:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:07.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:32:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:08 compute-1 nova_compute[238822]: 2025-09-30 18:32:08.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:08.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:32:09 compute-1 ceph-mon[75484]: pgmap v1605: 353 pgs: 353 active+clean; 121 MiB data, 352 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:32:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:09.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:10.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:11 compute-1 ceph-mon[75484]: pgmap v1606: 353 pgs: 353 active+clean; 41 MiB data, 304 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:32:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1081377242' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:32:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:11.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:11 compute-1 nova_compute[238822]: 2025-09-30 18:32:11.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:12.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:13 compute-1 nova_compute[238822]: 2025-09-30 18:32:13.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:13 compute-1 ceph-mon[75484]: pgmap v1607: 353 pgs: 353 active+clean; 41 MiB data, 304 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:32:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:13.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:32:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:14.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:14 compute-1 nova_compute[238822]: 2025-09-30 18:32:14.849 2 DEBUG nova.compute.manager [None req-f309e6cc-4ea3-4f92-9ef2-f3f3605de32d e33f9dc9fbb84319b00517567fe4b47e 4e2dde567e5c4b1c9802c64cfc281b6d - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Sep 30 18:32:14 compute-1 nova_compute[238822]: 2025-09-30 18:32:14.908 2 DEBUG nova.compute.provider_tree [None req-f309e6cc-4ea3-4f92-9ef2-f3f3605de32d e33f9dc9fbb84319b00517567fe4b47e 4e2dde567e5c4b1c9802c64cfc281b6d - - default default] Updating resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a generation from 29 to 32 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 18:32:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:15 compute-1 ceph-mon[75484]: pgmap v1608: 353 pgs: 353 active+clean; 41 MiB data, 304 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:32:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:15.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:16 compute-1 sudo[289487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:32:16 compute-1 sudo[289487]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:32:16 compute-1 sudo[289487]: pam_unix(sudo:session): session closed for user root
Sep 30 18:32:16 compute-1 podman[289512]: 2025-09-30 18:32:16.539168689 +0000 UTC m=+0.093959491 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:32:16 compute-1 ceph-mon[75484]: pgmap v1609: 353 pgs: 353 active+clean; 41 MiB data, 304 MiB used, 40 GiB / 40 GiB avail; 26 KiB/s rd, 1.2 KiB/s wr, 37 op/s
Sep 30 18:32:16 compute-1 podman[289511]: 2025-09-30 18:32:16.592782098 +0000 UTC m=+0.149762269 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Sep 30 18:32:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:16.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:16 compute-1 nova_compute[238822]: 2025-09-30 18:32:16.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:17.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:17 compute-1 unix_chkpwd[289565]: password check failed for user (root)
Sep 30 18:32:17 compute-1 sshd-session[289485]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:32:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:18 compute-1 sshd-session[289563]: Invalid user minecraft from 8.243.64.201 port 36270
Sep 30 18:32:18 compute-1 sshd-session[289563]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:32:18 compute-1 sshd-session[289563]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:32:18 compute-1 nova_compute[238822]: 2025-09-30 18:32:18.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:18 compute-1 ceph-mon[75484]: pgmap v1610: 353 pgs: 353 active+clean; 41 MiB data, 304 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:32:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:18.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:32:19 compute-1 openstack_network_exporter[251957]: ERROR   18:32:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:32:19 compute-1 openstack_network_exporter[251957]: ERROR   18:32:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:32:19 compute-1 openstack_network_exporter[251957]: ERROR   18:32:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:32:19 compute-1 openstack_network_exporter[251957]: ERROR   18:32:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:32:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:32:19 compute-1 openstack_network_exporter[251957]: ERROR   18:32:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:32:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:32:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:19.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:19 compute-1 sshd-session[289485]: Failed password for root from 192.210.160.141 port 54668 ssh2
Sep 30 18:32:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:20 compute-1 sshd-session[289563]: Failed password for invalid user minecraft from 8.243.64.201 port 36270 ssh2
Sep 30 18:32:20 compute-1 ceph-mon[75484]: pgmap v1611: 353 pgs: 353 active+clean; 41 MiB data, 304 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:32:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2254099041' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:32:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:32:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:20.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:32:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:20 compute-1 sshd-session[289485]: Connection closed by authenticating user root 192.210.160.141 port 54668 [preauth]
Sep 30 18:32:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:21.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:21 compute-1 nova_compute[238822]: 2025-09-30 18:32:21.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:22 compute-1 sshd-session[289563]: Received disconnect from 8.243.64.201 port 36270:11: Bye Bye [preauth]
Sep 30 18:32:22 compute-1 sshd-session[289563]: Disconnected from invalid user minecraft 8.243.64.201 port 36270 [preauth]
Sep 30 18:32:22 compute-1 ceph-mon[75484]: pgmap v1612: 353 pgs: 353 active+clean; 41 MiB data, 304 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:32:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:32:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:22.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:23 compute-1 nova_compute[238822]: 2025-09-30 18:32:23.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:23 compute-1 podman[289573]: 2025-09-30 18:32:23.551089875 +0000 UTC m=+0.087634279 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 18:32:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:23.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:32:24 compute-1 ceph-mon[75484]: pgmap v1613: 353 pgs: 353 active+clean; 41 MiB data, 304 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:32:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:24.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:24 compute-1 unix_chkpwd[289597]: password check failed for user (root)
Sep 30 18:32:24 compute-1 sshd-session[289593]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110  user=root
Sep 30 18:32:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:32:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:25.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:32:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:26.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:26 compute-1 ceph-mon[75484]: pgmap v1614: 353 pgs: 353 active+clean; 41 MiB data, 304 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:32:26 compute-1 unix_chkpwd[289602]: password check failed for user (root)
Sep 30 18:32:26 compute-1 sshd-session[289598]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161  user=root
Sep 30 18:32:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:26 compute-1 nova_compute[238822]: 2025-09-30 18:32:26.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:27 compute-1 sshd-session[289593]: Failed password for root from 14.225.167.110 port 60506 ssh2
Sep 30 18:32:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:27.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:27 compute-1 sshd-session[289593]: Received disconnect from 14.225.167.110 port 60506:11: Bye Bye [preauth]
Sep 30 18:32:27 compute-1 sshd-session[289593]: Disconnected from authenticating user root 14.225.167.110 port 60506 [preauth]
Sep 30 18:32:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:28 compute-1 nova_compute[238822]: 2025-09-30 18:32:28.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:28.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:28 compute-1 ceph-mon[75484]: pgmap v1615: 353 pgs: 353 active+clean; 41 MiB data, 304 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:32:28 compute-1 sshd-session[289598]: Failed password for root from 216.10.242.161 port 34604 ssh2
Sep 30 18:32:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:32:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:29.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:29 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/921557445' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:32:29 compute-1 sshd-session[289598]: Received disconnect from 216.10.242.161 port 34604:11: Bye Bye [preauth]
Sep 30 18:32:29 compute-1 sshd-session[289598]: Disconnected from authenticating user root 216.10.242.161 port 34604 [preauth]
Sep 30 18:32:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:30 compute-1 podman[289606]: 2025-09-30 18:32:30.564441242 +0000 UTC m=+0.101968537 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 18:32:30 compute-1 podman[289607]: 2025-09-30 18:32:30.57213002 +0000 UTC m=+0.106282234 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, config_id=edpm, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container)
Sep 30 18:32:30 compute-1 podman[289608]: 2025-09-30 18:32:30.587826574 +0000 UTC m=+0.113531199 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 18:32:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:30.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:30 compute-1 ceph-mon[75484]: pgmap v1616: 353 pgs: 353 active+clean; 88 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:32:30 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4270793738' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:32:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:31.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:32 compute-1 nova_compute[238822]: 2025-09-30 18:32:32.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:32.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:32 compute-1 ceph-mon[75484]: pgmap v1617: 353 pgs: 353 active+clean; 88 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:32:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:33 compute-1 nova_compute[238822]: 2025-09-30 18:32:33.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:32:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:33.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:32:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:32:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:34.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:34 compute-1 ceph-mon[75484]: pgmap v1618: 353 pgs: 353 active+clean; 88 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:32:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:35.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:35 compute-1 podman[249638]: time="2025-09-30T18:32:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:32:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:32:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:32:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:32:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8363 "" "Go-http-client/1.1"
Sep 30 18:32:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:36 compute-1 sudo[289670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:32:36 compute-1 sudo[289670]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:32:36 compute-1 sudo[289670]: pam_unix(sudo:session): session closed for user root
Sep 30 18:32:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:36.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:36 compute-1 ceph-mon[75484]: pgmap v1619: 353 pgs: 353 active+clean; 88 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Sep 30 18:32:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3187322812' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:32:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3187322812' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:32:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:37 compute-1 nova_compute[238822]: 2025-09-30 18:32:37.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:37.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:32:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:38 compute-1 nova_compute[238822]: 2025-09-30 18:32:38.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:38.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:38 compute-1 ceph-mon[75484]: pgmap v1620: 353 pgs: 353 active+clean; 88 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Sep 30 18:32:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:32:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:39.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:40 compute-1 nova_compute[238822]: 2025-09-30 18:32:40.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:32:40 compute-1 nova_compute[238822]: 2025-09-30 18:32:40.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:32:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:40.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:40 compute-1 ceph-mon[75484]: pgmap v1621: 353 pgs: 353 active+clean; 88 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Sep 30 18:32:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:41 compute-1 nova_compute[238822]: 2025-09-30 18:32:41.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:32:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:41.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:42 compute-1 nova_compute[238822]: 2025-09-30 18:32:42.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:42.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:42 compute-1 ceph-mon[75484]: pgmap v1622: 353 pgs: 353 active+clean; 88 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:32:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:43 compute-1 nova_compute[238822]: 2025-09-30 18:32:43.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:32:43 compute-1 nova_compute[238822]: 2025-09-30 18:32:43.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:43 compute-1 nova_compute[238822]: 2025-09-30 18:32:43.575 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:32:43 compute-1 nova_compute[238822]: 2025-09-30 18:32:43.576 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:32:43 compute-1 nova_compute[238822]: 2025-09-30 18:32:43.576 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:32:43 compute-1 nova_compute[238822]: 2025-09-30 18:32:43.577 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:32:43 compute-1 nova_compute[238822]: 2025-09-30 18:32:43.577 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:32:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:43.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/988617778' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:32:43 compute-1 sshd-session[289700]: Invalid user ryan from 192.210.160.141 port 56374
Sep 30 18:32:43 compute-1 sshd-session[289700]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:32:43 compute-1 sshd-session[289700]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:32:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:32:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:32:44 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3091147046' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:32:44 compute-1 nova_compute[238822]: 2025-09-30 18:32:44.060 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:32:44 compute-1 nova_compute[238822]: 2025-09-30 18:32:44.302 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:32:44 compute-1 nova_compute[238822]: 2025-09-30 18:32:44.304 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:32:44 compute-1 nova_compute[238822]: 2025-09-30 18:32:44.337 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:32:44 compute-1 nova_compute[238822]: 2025-09-30 18:32:44.339 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4732MB free_disk=39.971275329589844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:32:44 compute-1 nova_compute[238822]: 2025-09-30 18:32:44.339 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:32:44 compute-1 nova_compute[238822]: 2025-09-30 18:32:44.340 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:32:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:44.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:44 compute-1 ceph-mon[75484]: pgmap v1623: 353 pgs: 353 active+clean; 88 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:32:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3091147046' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:32:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2521701379' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:32:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:45 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:32:45.159 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:32:45 compute-1 nova_compute[238822]: 2025-09-30 18:32:45.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:45 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:32:45.164 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:32:45 compute-1 sshd-session[289700]: Failed password for invalid user ryan from 192.210.160.141 port 56374 ssh2
Sep 30 18:32:45 compute-1 nova_compute[238822]: 2025-09-30 18:32:45.391 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:32:45 compute-1 nova_compute[238822]: 2025-09-30 18:32:45.392 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:32:44 up  4:10,  0 user,  load average: 0.22, 0.35, 0.61\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:32:45 compute-1 nova_compute[238822]: 2025-09-30 18:32:45.407 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:32:45 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Sep 30 18:32:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:32:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:45.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:32:45 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:32:45 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1948820474' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:32:45 compute-1 nova_compute[238822]: 2025-09-30 18:32:45.915 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:32:45 compute-1 nova_compute[238822]: 2025-09-30 18:32:45.920 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:32:45 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1948820474' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:32:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:46 compute-1 nova_compute[238822]: 2025-09-30 18:32:46.429 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:32:46 compute-1 sshd-session[289700]: Connection closed by invalid user ryan 192.210.160.141 port 56374 [preauth]
Sep 30 18:32:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:46.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:46 compute-1 ceph-mon[75484]: pgmap v1624: 353 pgs: 353 active+clean; 88 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:32:46 compute-1 nova_compute[238822]: 2025-09-30 18:32:46.942 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:32:46 compute-1 nova_compute[238822]: 2025-09-30 18:32:46.943 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.604s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:32:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:47 compute-1 nova_compute[238822]: 2025-09-30 18:32:47.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:47 compute-1 podman[289755]: 2025-09-30 18:32:47.539444899 +0000 UTC m=+0.079563642 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:32:47 compute-1 podman[289754]: 2025-09-30 18:32:47.588312749 +0000 UTC m=+0.129274495 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 18:32:47 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:32:47 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/717575760' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:32:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:47.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:47 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/717575760' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:32:48 compute-1 nova_compute[238822]: 2025-09-30 18:32:48.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:48.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:48 compute-1 ceph-mon[75484]: pgmap v1625: 353 pgs: 353 active+clean; 88 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 72 op/s
Sep 30 18:32:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:32:49 compute-1 openstack_network_exporter[251957]: ERROR   18:32:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:32:49 compute-1 openstack_network_exporter[251957]: ERROR   18:32:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:32:49 compute-1 openstack_network_exporter[251957]: ERROR   18:32:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:32:49 compute-1 openstack_network_exporter[251957]: ERROR   18:32:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:32:49 compute-1 openstack_network_exporter[251957]: ERROR   18:32:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:32:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:49.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:50.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:50 compute-1 nova_compute[238822]: 2025-09-30 18:32:50.944 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:32:50 compute-1 nova_compute[238822]: 2025-09-30 18:32:50.945 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:32:50 compute-1 nova_compute[238822]: 2025-09-30 18:32:50.945 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:32:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:50 compute-1 ceph-mon[75484]: pgmap v1626: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 161 op/s
Sep 30 18:32:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:51.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:51 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Sep 30 18:32:52 compute-1 nova_compute[238822]: 2025-09-30 18:32:52.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:32:52 compute-1 nova_compute[238822]: 2025-09-30 18:32:52.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:52.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:53 compute-1 ceph-mon[75484]: pgmap v1627: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 323 KiB/s rd, 3.9 MiB/s wr, 89 op/s
Sep 30 18:32:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:32:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2430757105' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:32:53 compute-1 nova_compute[238822]: 2025-09-30 18:32:53.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:32:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:32:53.165 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:32:53 compute-1 nova_compute[238822]: 2025-09-30 18:32:53.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:53.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:32:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2036967863' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:32:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:32:54.396 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:32:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:32:54.396 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:32:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:32:54.396 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:32:54 compute-1 podman[289812]: 2025-09-30 18:32:54.551688764 +0000 UTC m=+0.083589990 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Sep 30 18:32:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:54.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:55 compute-1 ceph-mon[75484]: pgmap v1628: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 323 KiB/s rd, 3.9 MiB/s wr, 89 op/s
Sep 30 18:32:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:32:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:55.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:32:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:56 compute-1 sudo[289835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:32:56 compute-1 sudo[289835]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:32:56 compute-1 sudo[289835]: pam_unix(sudo:session): session closed for user root
Sep 30 18:32:56 compute-1 sudo[289858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:32:56 compute-1 sudo[289858]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:32:56 compute-1 sudo[289858]: pam_unix(sudo:session): session closed for user root
Sep 30 18:32:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:56.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:56 compute-1 sudo[289885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:32:56 compute-1 sudo[289885]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:32:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:57 compute-1 ceph-mon[75484]: pgmap v1629: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 323 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Sep 30 18:32:57 compute-1 nova_compute[238822]: 2025-09-30 18:32:57.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:57 compute-1 sudo[289885]: pam_unix(sudo:session): session closed for user root
Sep 30 18:32:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:57.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:32:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:32:58 compute-1 ceph-mon[75484]: pgmap v1630: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 325 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Sep 30 18:32:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:32:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:32:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:32:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:32:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:32:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3666147916' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:32:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3666147916' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:32:58 compute-1 nova_compute[238822]: 2025-09-30 18:32:58.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:32:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:32:58.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:32:59 compute-1 ceph-mon[75484]: pgmap v1631: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 335 KiB/s rd, 3.9 MiB/s wr, 100 op/s
Sep 30 18:32:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:32:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:32:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:32:59.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:32:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:32:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:32:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:32:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:00.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:01 compute-1 podman[289948]: 2025-09-30 18:33:01.553437006 +0000 UTC m=+0.089998884 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4)
Sep 30 18:33:01 compute-1 podman[289949]: 2025-09-30 18:33:01.572293515 +0000 UTC m=+0.102490621 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git)
Sep 30 18:33:01 compute-1 podman[289950]: 2025-09-30 18:33:01.576769416 +0000 UTC m=+0.097273200 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:33:01 compute-1 ceph-mon[75484]: pgmap v1632: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 11 KiB/s rd, 26 KiB/s wr, 11 op/s
Sep 30 18:33:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:01.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:02 compute-1 nova_compute[238822]: 2025-09-30 18:33:02.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:02 compute-1 sudo[290008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:33:02 compute-1 sudo[290008]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:33:02 compute-1 sudo[290008]: pam_unix(sudo:session): session closed for user root
Sep 30 18:33:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:02.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:33:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:33:03 compute-1 nova_compute[238822]: 2025-09-30 18:33:03.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:33:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:03.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:33:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:33:04 compute-1 ceph-mon[75484]: pgmap v1633: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 26 KiB/s wr, 73 op/s
Sep 30 18:33:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:04.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:05 compute-1 ceph-mon[75484]: pgmap v1634: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 75 op/s
Sep 30 18:33:05 compute-1 podman[249638]: time="2025-09-30T18:33:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:33:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:33:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:33:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:33:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8352 "" "Go-http-client/1.1"
Sep 30 18:33:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:05.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:06.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:07 compute-1 nova_compute[238822]: 2025-09-30 18:33:07.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:33:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:33:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:07.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:33:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:08 compute-1 ceph-mon[75484]: pgmap v1635: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:33:08 compute-1 nova_compute[238822]: 2025-09-30 18:33:08.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:08.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:33:09 compute-1 unix_chkpwd[290042]: password check failed for user (root)
Sep 30 18:33:09 compute-1 sshd-session[290038]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:33:09 compute-1 ceph-mon[75484]: pgmap v1636: 353 pgs: 353 active+clean; 186 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 96 op/s
Sep 30 18:33:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:09.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:10.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:11 compute-1 sshd-session[290038]: Failed password for root from 192.210.160.141 port 55142 ssh2
Sep 30 18:33:11 compute-1 ceph-mon[75484]: pgmap v1637: 353 pgs: 353 active+clean; 186 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.7 MiB/s wr, 86 op/s
Sep 30 18:33:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:11.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:12 compute-1 nova_compute[238822]: 2025-09-30 18:33:12.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:12 compute-1 sshd-session[290038]: Connection closed by authenticating user root 192.210.160.141 port 55142 [preauth]
Sep 30 18:33:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:12.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:13 compute-1 nova_compute[238822]: 2025-09-30 18:33:13.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:13 compute-1 ceph-mon[75484]: pgmap v1638: 353 pgs: 353 active+clean; 196 MiB data, 412 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 114 op/s
Sep 30 18:33:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:13.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:33:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:14.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:15 compute-1 ceph-mon[75484]: pgmap v1639: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 386 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Sep 30 18:33:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:15.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:16 compute-1 sudo[290050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:33:16 compute-1 sudo[290050]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:33:16 compute-1 sudo[290050]: pam_unix(sudo:session): session closed for user root
Sep 30 18:33:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:33:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:16.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:33:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:17 compute-1 nova_compute[238822]: 2025-09-30 18:33:17.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:17 compute-1 ceph-mon[75484]: pgmap v1640: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:33:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:17.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:18 compute-1 podman[290078]: 2025-09-30 18:33:18.550385769 +0000 UTC m=+0.083227111 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:33:18 compute-1 nova_compute[238822]: 2025-09-30 18:33:18.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:18 compute-1 podman[290077]: 2025-09-30 18:33:18.617333258 +0000 UTC m=+0.157942890 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:33:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:18.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:33:19 compute-1 openstack_network_exporter[251957]: ERROR   18:33:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:33:19 compute-1 openstack_network_exporter[251957]: ERROR   18:33:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:33:19 compute-1 openstack_network_exporter[251957]: ERROR   18:33:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:33:19 compute-1 openstack_network_exporter[251957]: ERROR   18:33:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:33:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:33:19 compute-1 openstack_network_exporter[251957]: ERROR   18:33:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:33:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:33:19 compute-1 ceph-mon[75484]: pgmap v1641: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Sep 30 18:33:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:19.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:20.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:21 compute-1 ceph-mon[75484]: pgmap v1642: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 296 KiB/s rd, 404 KiB/s wr, 42 op/s
Sep 30 18:33:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:21.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:22 compute-1 nova_compute[238822]: 2025-09-30 18:33:22.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:33:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:22.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:23 compute-1 nova_compute[238822]: 2025-09-30 18:33:23.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:23 compute-1 ceph-mon[75484]: pgmap v1643: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 296 KiB/s rd, 404 KiB/s wr, 42 op/s
Sep 30 18:33:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:23.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:33:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:24.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:25 compute-1 podman[290138]: 2025-09-30 18:33:25.53177407 +0000 UTC m=+0.078572315 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent)
Sep 30 18:33:25 compute-1 ceph-mon[75484]: pgmap v1644: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 48 KiB/s rd, 31 KiB/s wr, 15 op/s
Sep 30 18:33:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:25.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:25 compute-1 sshd-session[290136]: Invalid user superset from 8.243.64.201 port 59092
Sep 30 18:33:25 compute-1 sshd-session[290136]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:33:25 compute-1 sshd-session[290136]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:33:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:26.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:27 compute-1 ovn_controller[135204]: 2025-09-30T18:33:27Z|00198|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Sep 30 18:33:27 compute-1 sshd-session[290136]: Failed password for invalid user superset from 8.243.64.201 port 59092 ssh2
Sep 30 18:33:27 compute-1 nova_compute[238822]: 2025-09-30 18:33:27.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:27 compute-1 sshd-session[290136]: Received disconnect from 8.243.64.201 port 59092:11: Bye Bye [preauth]
Sep 30 18:33:27 compute-1 sshd-session[290136]: Disconnected from invalid user superset 8.243.64.201 port 59092 [preauth]
Sep 30 18:33:27 compute-1 ceph-mon[75484]: pgmap v1645: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:33:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:27.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:28 compute-1 nova_compute[238822]: 2025-09-30 18:33:28.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:28.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:33:29 compute-1 ceph-mon[75484]: pgmap v1646: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 16 KiB/s wr, 1 op/s
Sep 30 18:33:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:29.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:30 compute-1 sshd-session[290163]: Invalid user consulta1 from 216.10.242.161 port 40586
Sep 30 18:33:30 compute-1 sshd-session[290163]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:33:30 compute-1 sshd-session[290163]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:33:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:30.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:31 compute-1 ceph-mon[75484]: pgmap v1647: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:33:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:31.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:32 compute-1 sshd-session[290163]: Failed password for invalid user consulta1 from 216.10.242.161 port 40586 ssh2
Sep 30 18:33:32 compute-1 nova_compute[238822]: 2025-09-30 18:33:32.264 2 DEBUG nova.virt.libvirt.driver [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Creating tmpfile /var/lib/nova/instances/tmpwp_k9oqd to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:33:32 compute-1 nova_compute[238822]: 2025-09-30 18:33:32.265 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:33:32 compute-1 nova_compute[238822]: 2025-09-30 18:33:32.271 2 DEBUG nova.compute.manager [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwp_k9oqd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:33:32 compute-1 nova_compute[238822]: 2025-09-30 18:33:32.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:32 compute-1 nova_compute[238822]: 2025-09-30 18:33:32.462 2 DEBUG nova.virt.libvirt.driver [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Creating tmpfile /var/lib/nova/instances/tmp3f_adtk9 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:33:32 compute-1 nova_compute[238822]: 2025-09-30 18:33:32.463 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:33:32 compute-1 nova_compute[238822]: 2025-09-30 18:33:32.468 2 DEBUG nova.compute.manager [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3f_adtk9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:33:32 compute-1 podman[290168]: 2025-09-30 18:33:32.569212007 +0000 UTC m=+0.098012511 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Sep 30 18:33:32 compute-1 podman[290169]: 2025-09-30 18:33:32.580948494 +0000 UTC m=+0.103260672 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 18:33:32 compute-1 podman[290170]: 2025-09-30 18:33:32.612157087 +0000 UTC m=+0.125934305 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:33:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:32.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:33 compute-1 nova_compute[238822]: 2025-09-30 18:33:33.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:33 compute-1 ceph-mon[75484]: pgmap v1648: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:33:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:33.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:33:34 compute-1 nova_compute[238822]: 2025-09-30 18:33:34.334 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:33:34 compute-1 nova_compute[238822]: 2025-09-30 18:33:34.500 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:33:34 compute-1 sshd-session[290163]: Received disconnect from 216.10.242.161 port 40586:11: Bye Bye [preauth]
Sep 30 18:33:34 compute-1 sshd-session[290163]: Disconnected from invalid user consulta1 216.10.242.161 port 40586 [preauth]
Sep 30 18:33:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:34.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:35 compute-1 ceph-mon[75484]: pgmap v1649: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 9.1 KiB/s wr, 2 op/s
Sep 30 18:33:35 compute-1 podman[249638]: time="2025-09-30T18:33:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:33:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:33:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:33:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:33:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8353 "" "Go-http-client/1.1"
Sep 30 18:33:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:35.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:36 compute-1 unix_chkpwd[290235]: password check failed for user (root)
Sep 30 18:33:36 compute-1 sshd-session[290231]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:33:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3722427631' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:33:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3722427631' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:33:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:36.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:36 compute-1 sudo[290236]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:33:36 compute-1 sudo[290236]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:33:36 compute-1 sudo[290236]: pam_unix(sudo:session): session closed for user root
Sep 30 18:33:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:37 compute-1 nova_compute[238822]: 2025-09-30 18:33:37.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:33:37 compute-1 ceph-mon[75484]: pgmap v1650: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 9.1 KiB/s wr, 1 op/s
Sep 30 18:33:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:37.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:38 compute-1 nova_compute[238822]: 2025-09-30 18:33:38.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:38.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:38 compute-1 sshd-session[290231]: Failed password for root from 192.210.160.141 port 59102 ssh2
Sep 30 18:33:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:33:39 compute-1 ceph-mon[75484]: pgmap v1651: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 9.1 KiB/s wr, 2 op/s
Sep 30 18:33:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:39.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:39 compute-1 sshd-session[290231]: Connection closed by authenticating user root 192.210.160.141 port 59102 [preauth]
Sep 30 18:33:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:40 compute-1 nova_compute[238822]: 2025-09-30 18:33:40.051 2 DEBUG nova.compute.manager [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwp_k9oqd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='78e1566d-9c5e-49b1-a044-0c46cf002c66',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:33:40 compute-1 sshd-session[290264]: Invalid user halo from 14.225.167.110 port 39182
Sep 30 18:33:40 compute-1 sshd-session[290264]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:33:40 compute-1 sshd-session[290264]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:33:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:40.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:41 compute-1 nova_compute[238822]: 2025-09-30 18:33:41.066 2 DEBUG oslo_concurrency.lockutils [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-78e1566d-9c5e-49b1-a044-0c46cf002c66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:33:41 compute-1 nova_compute[238822]: 2025-09-30 18:33:41.067 2 DEBUG oslo_concurrency.lockutils [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-78e1566d-9c5e-49b1-a044-0c46cf002c66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:33:41 compute-1 nova_compute[238822]: 2025-09-30 18:33:41.067 2 DEBUG nova.network.neutron [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:33:41 compute-1 nova_compute[238822]: 2025-09-30 18:33:41.577 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:33:41 compute-1 ceph-mon[75484]: pgmap v1652: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 8.1 KiB/s wr, 1 op/s
Sep 30 18:33:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:41.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:42 compute-1 nova_compute[238822]: 2025-09-30 18:33:42.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:33:42 compute-1 nova_compute[238822]: 2025-09-30 18:33:42.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:33:42 compute-1 nova_compute[238822]: 2025-09-30 18:33:42.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:33:42 compute-1 sshd-session[290264]: Failed password for invalid user halo from 14.225.167.110 port 39182 ssh2
Sep 30 18:33:42 compute-1 nova_compute[238822]: 2025-09-30 18:33:42.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:42.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:43 compute-1 nova_compute[238822]: 2025-09-30 18:33:43.387 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:33:43 compute-1 sshd-session[290264]: Received disconnect from 14.225.167.110 port 39182:11: Bye Bye [preauth]
Sep 30 18:33:43 compute-1 sshd-session[290264]: Disconnected from invalid user halo 14.225.167.110 port 39182 [preauth]
Sep 30 18:33:43 compute-1 nova_compute[238822]: 2025-09-30 18:33:43.552 2 DEBUG nova.network.neutron [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Updating instance_info_cache with network_info: [{"id": "48ba4743-596d-47a6-a246-afe70e6e1fc6", "address": "fa:16:3e:47:e6:35", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48ba4743-59", "ovs_interfaceid": "48ba4743-596d-47a6-a246-afe70e6e1fc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:33:43 compute-1 ceph-mon[75484]: pgmap v1653: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 9.4 KiB/s wr, 2 op/s
Sep 30 18:33:43 compute-1 nova_compute[238822]: 2025-09-30 18:33:43.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:43.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:43 compute-1 sshd-session[290269]: Invalid user habib from 103.153.190.105 port 35717
Sep 30 18:33:43 compute-1 sshd-session[290269]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:33:43 compute-1 sshd-session[290269]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:33:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.054 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.059 2 DEBUG oslo_concurrency.lockutils [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-78e1566d-9c5e-49b1-a044-0c46cf002c66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.072 2 DEBUG nova.virt.libvirt.driver [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwp_k9oqd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='78e1566d-9c5e-49b1-a044-0c46cf002c66',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.073 2 DEBUG nova.virt.libvirt.driver [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Creating instance directory: /var/lib/nova/instances/78e1566d-9c5e-49b1-a044-0c46cf002c66 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.074 2 DEBUG nova.virt.libvirt.driver [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Ensure instance console log exists: /var/lib/nova/instances/78e1566d-9c5e-49b1-a044-0c46cf002c66/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.074 2 DEBUG nova.virt.libvirt.driver [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.076 2 DEBUG nova.virt.libvirt.vif [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:32:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-760306639',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-760306639',id=24,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:32:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-7ntnt7t9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:32:35Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=78e1566d-9c5e-49b1-a044-0c46cf002c66,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "48ba4743-596d-47a6-a246-afe70e6e1fc6", "address": "fa:16:3e:47:e6:35", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap48ba4743-59", "ovs_interfaceid": "48ba4743-596d-47a6-a246-afe70e6e1fc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.076 2 DEBUG nova.network.os_vif_util [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "48ba4743-596d-47a6-a246-afe70e6e1fc6", "address": "fa:16:3e:47:e6:35", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap48ba4743-59", "ovs_interfaceid": "48ba4743-596d-47a6-a246-afe70e6e1fc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.077 2 DEBUG nova.network.os_vif_util [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:e6:35,bridge_name='br-int',has_traffic_filtering=True,id=48ba4743-596d-47a6-a246-afe70e6e1fc6,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48ba4743-59') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.078 2 DEBUG os_vif [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:e6:35,bridge_name='br-int',has_traffic_filtering=True,id=48ba4743-596d-47a6-a246-afe70e6e1fc6,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48ba4743-59') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.080 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.080 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.082 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5094654b-b05d-5aeb-b948-dd78faf9fed3', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.090 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48ba4743-59, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.091 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap48ba4743-59, col_values=(('qos', UUID('5e451177-af5d-4ae5-8e22-e60110d9117c')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.091 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap48ba4743-59, col_values=(('external_ids', {'iface-id': '48ba4743-596d-47a6-a246-afe70e6e1fc6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:e6:35', 'vm-uuid': '78e1566d-9c5e-49b1-a044-0c46cf002c66'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:44 compute-1 NetworkManager[45549]: <info>  [1759257224.0945] manager: (tap48ba4743-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.103 2 INFO os_vif [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:e6:35,bridge_name='br-int',has_traffic_filtering=True,id=48ba4743-596d-47a6-a246-afe70e6e1fc6,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48ba4743-59')
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.104 2 DEBUG nova.virt.libvirt.driver [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.104 2 DEBUG nova.compute.manager [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwp_k9oqd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='78e1566d-9c5e-49b1-a044-0c46cf002c66',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.105 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.551 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:33:44 compute-1 nova_compute[238822]: 2025-09-30 18:33:44.566 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:33:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:44.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:45 compute-1 nova_compute[238822]: 2025-09-30 18:33:45.079 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:33:45 compute-1 nova_compute[238822]: 2025-09-30 18:33:45.079 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:33:45 compute-1 nova_compute[238822]: 2025-09-30 18:33:45.080 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:33:45 compute-1 nova_compute[238822]: 2025-09-30 18:33:45.080 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:33:45 compute-1 nova_compute[238822]: 2025-09-30 18:33:45.081 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:33:45 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2498694099' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:33:45 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:33:45 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/885221029' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:33:45 compute-1 nova_compute[238822]: 2025-09-30 18:33:45.529 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:33:45 compute-1 nova_compute[238822]: 2025-09-30 18:33:45.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:45 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:45.741 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:33:45 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:45.743 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:33:45 compute-1 nova_compute[238822]: 2025-09-30 18:33:45.747 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:33:45 compute-1 nova_compute[238822]: 2025-09-30 18:33:45.749 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:33:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:45.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:45 compute-1 nova_compute[238822]: 2025-09-30 18:33:45.783 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:33:45 compute-1 nova_compute[238822]: 2025-09-30 18:33:45.784 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4758MB free_disk=39.901153564453125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:33:45 compute-1 nova_compute[238822]: 2025-09-30 18:33:45.784 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:33:45 compute-1 nova_compute[238822]: 2025-09-30 18:33:45.785 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:33:45 compute-1 sshd-session[290269]: Failed password for invalid user habib from 103.153.190.105 port 35717 ssh2
Sep 30 18:33:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:46 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/885221029' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:33:46 compute-1 ceph-mon[75484]: pgmap v1654: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 10 KiB/s wr, 2 op/s
Sep 30 18:33:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:46.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:46 compute-1 nova_compute[238822]: 2025-09-30 18:33:46.966 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Migration for instance 29a2fe9a-add5-43c1-948a-9df854aa4261 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 18:33:46 compute-1 nova_compute[238822]: 2025-09-30 18:33:46.967 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Migration for instance 78e1566d-9c5e-49b1-a044-0c46cf002c66 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 18:33:47 compute-1 nova_compute[238822]: 2025-09-30 18:33:47.592 2 DEBUG nova.network.neutron [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Port 48ba4743-596d-47a6-a246-afe70e6e1fc6 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:33:47 compute-1 nova_compute[238822]: 2025-09-30 18:33:47.604 2 DEBUG nova.compute.manager [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwp_k9oqd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='78e1566d-9c5e-49b1-a044-0c46cf002c66',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:33:47 compute-1 ceph-mon[75484]: pgmap v1655: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:33:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:47.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:47 compute-1 nova_compute[238822]: 2025-09-30 18:33:47.982 2 INFO nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Updating resource usage from migration 3076aa2f-e697-4bdd-98f3-898820dd8d4b
Sep 30 18:33:47 compute-1 nova_compute[238822]: 2025-09-30 18:33:47.983 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Starting to track incoming migration 3076aa2f-e697-4bdd-98f3-898820dd8d4b with flavor c83dc7f1-0795-47db-adcb-fb90be11684a _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 18:33:48 compute-1 sshd-session[290269]: Received disconnect from 103.153.190.105 port 35717:11: Bye Bye [preauth]
Sep 30 18:33:48 compute-1 sshd-session[290269]: Disconnected from invalid user habib 103.153.190.105 port 35717 [preauth]
Sep 30 18:33:48 compute-1 nova_compute[238822]: 2025-09-30 18:33:48.493 2 INFO nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Updating resource usage from migration 6fbb2434-2a54-4726-8c0b-62fb0288d275
Sep 30 18:33:48 compute-1 nova_compute[238822]: 2025-09-30 18:33:48.495 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Starting to track incoming migration 6fbb2434-2a54-4726-8c0b-62fb0288d275 with flavor c83dc7f1-0795-47db-adcb-fb90be11684a _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 18:33:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2685991844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:33:48 compute-1 nova_compute[238822]: 2025-09-30 18:33:48.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:48.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:33:49 compute-1 nova_compute[238822]: 2025-09-30 18:33:49.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:49 compute-1 openstack_network_exporter[251957]: ERROR   18:33:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:33:49 compute-1 openstack_network_exporter[251957]: ERROR   18:33:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:33:49 compute-1 openstack_network_exporter[251957]: ERROR   18:33:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:33:49 compute-1 openstack_network_exporter[251957]: ERROR   18:33:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:33:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:33:49 compute-1 openstack_network_exporter[251957]: ERROR   18:33:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:33:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:33:49 compute-1 nova_compute[238822]: 2025-09-30 18:33:49.548 2 WARNING nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 78e1566d-9c5e-49b1-a044-0c46cf002c66 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 18:33:49 compute-1 podman[290306]: 2025-09-30 18:33:49.550379089 +0000 UTC m=+0.079096119 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:33:49 compute-1 systemd[1]: Starting libvirt proxy daemon...
Sep 30 18:33:49 compute-1 ceph-mon[75484]: pgmap v1656: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 2.3 KiB/s wr, 1 op/s
Sep 30 18:33:49 compute-1 podman[290304]: 2025-09-30 18:33:49.643493386 +0000 UTC m=+0.168884436 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 18:33:49 compute-1 systemd[1]: Started libvirt proxy daemon.
Sep 30 18:33:49 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:49.745 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:33:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:49.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:49 compute-1 systemd[1]: Starting libvirt secret daemon...
Sep 30 18:33:49 compute-1 systemd[1]: Started libvirt secret daemon.
Sep 30 18:33:49 compute-1 kernel: tap48ba4743-59: entered promiscuous mode
Sep 30 18:33:49 compute-1 NetworkManager[45549]: <info>  [1759257229.8929] manager: (tap48ba4743-59): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Sep 30 18:33:49 compute-1 nova_compute[238822]: 2025-09-30 18:33:49.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:49 compute-1 ovn_controller[135204]: 2025-09-30T18:33:49Z|00199|binding|INFO|Claiming lport 48ba4743-596d-47a6-a246-afe70e6e1fc6 for this additional chassis.
Sep 30 18:33:49 compute-1 ovn_controller[135204]: 2025-09-30T18:33:49Z|00200|binding|INFO|48ba4743-596d-47a6-a246-afe70e6e1fc6: Claiming fa:16:3e:47:e6:35 10.100.0.12
Sep 30 18:33:49 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:49.906 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:e6:35 10.100.0.12'], port_security=['fa:16:3e:47:e6:35 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '78e1566d-9c5e-49b1-a044-0c46cf002c66', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '10', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=48ba4743-596d-47a6-a246-afe70e6e1fc6) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:33:49 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:49.908 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 48ba4743-596d-47a6-a246-afe70e6e1fc6 in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:33:49 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:49.910 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:33:49 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:49.927 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[73aad9e3-1711-4c5e-8a63-118214a231b9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:49 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:49.928 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6901f664-31 in ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:33:49 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:49.931 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6901f664-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:33:49 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:49.931 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[1655fc55-68b8-405f-a208-3d2c96747112]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:49 compute-1 nova_compute[238822]: 2025-09-30 18:33:49.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:49 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:49.932 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a65f977c-057d-4b9b-8997-feb33421aaf5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:49 compute-1 ovn_controller[135204]: 2025-09-30T18:33:49Z|00201|binding|INFO|Setting lport 48ba4743-596d-47a6-a246-afe70e6e1fc6 ovn-installed in OVS
Sep 30 18:33:49 compute-1 nova_compute[238822]: 2025-09-30 18:33:49.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:49 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:49.956 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[dd5adc97-d595-4708-8766-fc8dd4a501be]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:49 compute-1 systemd-machined[195911]: New machine qemu-18-instance-00000018.
Sep 30 18:33:49 compute-1 systemd-udevd[290405]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:33:49 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:49.976 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[149bfd12-8446-4d71-bedc-21ef55e02471]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:49 compute-1 systemd[1]: Started Virtual Machine qemu-18-instance-00000018.
Sep 30 18:33:49 compute-1 NetworkManager[45549]: <info>  [1759257229.9921] device (tap48ba4743-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:33:49 compute-1 NetworkManager[45549]: <info>  [1759257229.9942] device (tap48ba4743-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.024 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[d58bba76-7aaa-40d1-b96f-c5f5d54137d0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.030 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[9493e2f8-650e-47aa-8773-f0efc3735a7c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:50 compute-1 systemd-udevd[290408]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:33:50 compute-1 NetworkManager[45549]: <info>  [1759257230.0324] manager: (tap6901f664-30): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Sep 30 18:33:50 compute-1 nova_compute[238822]: 2025-09-30 18:33:50.058 2 WARNING nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 29a2fe9a-add5-43c1-948a-9df854aa4261 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 18:33:50 compute-1 nova_compute[238822]: 2025-09-30 18:33:50.059 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:33:50 compute-1 nova_compute[238822]: 2025-09-30 18:33:50.059 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=39GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:33:45 up  4:11,  0 user,  load average: 0.08, 0.28, 0.57\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.087 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[c103863a-beb4-4e24-ac47-693cc86c5f5e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.092 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[91c62e6a-5e75-4ba2-99bf-92071ff21543]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:50 compute-1 nova_compute[238822]: 2025-09-30 18:33:50.112 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:33:50 compute-1 NetworkManager[45549]: <info>  [1759257230.1487] device (tap6901f664-30): carrier: link connected
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.164 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[22f350e3-9b00-4dfb-b69e-752a1c199830]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.205 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2c23831d-1239-429b-93f2-9a5c48d929b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1507172, 'reachable_time': 21460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290437, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.232 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa3bbf4-2bd1-4bfd-8158-165fe241e355]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:412a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1507172, 'tstamp': 1507172}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290438, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.265 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[582c5908-13fb-4065-8446-8ec589f3dd4d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1507172, 'reachable_time': 21460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 290439, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.329 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0b8b35f8-2f51-463b-b240-a789071fc6c1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.434 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[1c5d3de3-b57f-4aa4-8dcc-439461bd1a27]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.436 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.437 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.437 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:33:50 compute-1 nova_compute[238822]: 2025-09-30 18:33:50.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:50 compute-1 kernel: tap6901f664-30: entered promiscuous mode
Sep 30 18:33:50 compute-1 NetworkManager[45549]: <info>  [1759257230.4406] manager: (tap6901f664-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.450 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:33:50 compute-1 ovn_controller[135204]: 2025-09-30T18:33:50Z|00202|binding|INFO|Releasing lport 5b6cbf18-1826-41d0-920f-e9db4f1a1832 from this chassis (sb_readonly=0)
Sep 30 18:33:50 compute-1 nova_compute[238822]: 2025-09-30 18:33:50.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.453 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[25d029b4-9c57-4e0b-9b59-ba900c0fbcb3]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.455 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.455 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.455 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 6901f664-336b-42d2-bbf7-58951befc8d1 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.456 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.457 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[88568a23-f005-4bc8-bd3e-3c5fef28f847]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.463 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.464 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[cd8325c5-62f7-47c4-8242-53663156bfda]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.465 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:33:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:50.466 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'env', 'PROCESS_TAG=haproxy-6901f664-336b-42d2-bbf7-58951befc8d1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6901f664-336b-42d2-bbf7-58951befc8d1.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:33:50 compute-1 nova_compute[238822]: 2025-09-30 18:33:50.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:50 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:33:50 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3464279147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:33:50 compute-1 nova_compute[238822]: 2025-09-30 18:33:50.619 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:33:50 compute-1 nova_compute[238822]: 2025-09-30 18:33:50.626 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:33:50 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3464279147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:33:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:50.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:50 compute-1 podman[290535]: 2025-09-30 18:33:50.920997398 +0000 UTC m=+0.073423365 container create 23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 18:33:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:50 compute-1 podman[290535]: 2025-09-30 18:33:50.880871484 +0000 UTC m=+0.033297441 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:33:50 compute-1 systemd[1]: Started libpod-conmon-23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4.scope.
Sep 30 18:33:51 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:33:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccfd5cdc78e07f71e4fce4c1196b9bcbc4a5279dd3c4c929e9ff9c2d7197ced9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:33:51 compute-1 podman[290535]: 2025-09-30 18:33:51.044120236 +0000 UTC m=+0.196546203 container init 23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 18:33:51 compute-1 podman[290535]: 2025-09-30 18:33:51.055233097 +0000 UTC m=+0.207659054 container start 23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2)
Sep 30 18:33:51 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[290551]: [NOTICE]   (290555) : New worker (290557) forked
Sep 30 18:33:51 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[290551]: [NOTICE]   (290555) : Loading success.
Sep 30 18:33:51 compute-1 nova_compute[238822]: 2025-09-30 18:33:51.137 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:33:51 compute-1 nova_compute[238822]: 2025-09-30 18:33:51.650 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:33:51 compute-1 nova_compute[238822]: 2025-09-30 18:33:51.651 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.866s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:33:51 compute-1 ceph-mon[75484]: pgmap v1657: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:33:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:51.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:33:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:52.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:52 compute-1 ovn_controller[135204]: 2025-09-30T18:33:52Z|00203|binding|INFO|Claiming lport 48ba4743-596d-47a6-a246-afe70e6e1fc6 for this chassis.
Sep 30 18:33:52 compute-1 ovn_controller[135204]: 2025-09-30T18:33:52Z|00204|binding|INFO|48ba4743-596d-47a6-a246-afe70e6e1fc6: Claiming fa:16:3e:47:e6:35 10.100.0.12
Sep 30 18:33:52 compute-1 ovn_controller[135204]: 2025-09-30T18:33:52Z|00205|binding|INFO|Setting lport 48ba4743-596d-47a6-a246-afe70e6e1fc6 up in Southbound
Sep 30 18:33:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:53 compute-1 nova_compute[238822]: 2025-09-30 18:33:53.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:53 compute-1 ceph-mon[75484]: pgmap v1658: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 1023 B/s rd, 2.3 KiB/s wr, 1 op/s
Sep 30 18:33:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:53.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:53 compute-1 nova_compute[238822]: 2025-09-30 18:33:53.985 2 INFO nova.compute.manager [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Post operation of migration started
Sep 30 18:33:53 compute-1 nova_compute[238822]: 2025-09-30 18:33:53.986 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:33:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:33:54 compute-1 nova_compute[238822]: 2025-09-30 18:33:54.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:54 compute-1 nova_compute[238822]: 2025-09-30 18:33:54.142 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:33:54 compute-1 nova_compute[238822]: 2025-09-30 18:33:54.143 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:33:54 compute-1 nova_compute[238822]: 2025-09-30 18:33:54.143 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:33:54 compute-1 nova_compute[238822]: 2025-09-30 18:33:54.144 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:33:54 compute-1 nova_compute[238822]: 2025-09-30 18:33:54.144 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:33:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:54.397 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:33:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:54.397 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:33:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:33:54.398 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:33:54 compute-1 nova_compute[238822]: 2025-09-30 18:33:54.559 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:33:54 compute-1 nova_compute[238822]: 2025-09-30 18:33:54.560 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:33:54 compute-1 nova_compute[238822]: 2025-09-30 18:33:54.644 2 DEBUG oslo_concurrency.lockutils [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-78e1566d-9c5e-49b1-a044-0c46cf002c66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:33:54 compute-1 nova_compute[238822]: 2025-09-30 18:33:54.645 2 DEBUG oslo_concurrency.lockutils [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-78e1566d-9c5e-49b1-a044-0c46cf002c66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:33:54 compute-1 nova_compute[238822]: 2025-09-30 18:33:54.645 2 DEBUG nova.network.neutron [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:33:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:54.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:55 compute-1 nova_compute[238822]: 2025-09-30 18:33:55.155 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:33:55 compute-1 ceph-mon[75484]: pgmap v1659: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 1.2 KiB/s wr, 6 op/s
Sep 30 18:33:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:55.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:55 compute-1 nova_compute[238822]: 2025-09-30 18:33:55.945 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:33:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:56 compute-1 nova_compute[238822]: 2025-09-30 18:33:56.099 2 DEBUG nova.network.neutron [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Updating instance_info_cache with network_info: [{"id": "48ba4743-596d-47a6-a246-afe70e6e1fc6", "address": "fa:16:3e:47:e6:35", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48ba4743-59", "ovs_interfaceid": "48ba4743-596d-47a6-a246-afe70e6e1fc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:33:56 compute-1 podman[290572]: 2025-09-30 18:33:56.546071758 +0000 UTC m=+0.085141842 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Sep 30 18:33:56 compute-1 nova_compute[238822]: 2025-09-30 18:33:56.607 2 DEBUG oslo_concurrency.lockutils [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-78e1566d-9c5e-49b1-a044-0c46cf002c66" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:33:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:33:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:56.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:33:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:56 compute-1 sudo[290594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:33:56 compute-1 sudo[290594]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:33:56 compute-1 sudo[290594]: pam_unix(sudo:session): session closed for user root
Sep 30 18:33:57 compute-1 nova_compute[238822]: 2025-09-30 18:33:57.134 2 DEBUG oslo_concurrency.lockutils [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:33:57 compute-1 nova_compute[238822]: 2025-09-30 18:33:57.136 2 DEBUG oslo_concurrency.lockutils [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:33:57 compute-1 nova_compute[238822]: 2025-09-30 18:33:57.136 2 DEBUG oslo_concurrency.lockutils [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:33:57 compute-1 nova_compute[238822]: 2025-09-30 18:33:57.142 2 INFO nova.virt.libvirt.driver [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:33:57 compute-1 virtqemud[239124]: Domain id=18 name='instance-00000018' uuid=78e1566d-9c5e-49b1-a044-0c46cf002c66 is tainted: custom-monitor
Sep 30 18:33:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3502070360' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:33:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3502070360' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:33:57 compute-1 ceph-mon[75484]: pgmap v1660: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 170 B/s wr, 6 op/s
Sep 30 18:33:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:57.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:58 compute-1 nova_compute[238822]: 2025-09-30 18:33:58.152 2 INFO nova.virt.libvirt.driver [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:33:58 compute-1 nova_compute[238822]: 2025-09-30 18:33:58.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:33:58.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:33:59 compute-1 nova_compute[238822]: 2025-09-30 18:33:59.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:33:59 compute-1 nova_compute[238822]: 2025-09-30 18:33:59.160 2 INFO nova.virt.libvirt.driver [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:33:59 compute-1 nova_compute[238822]: 2025-09-30 18:33:59.166 2 DEBUG nova.compute.manager [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:33:59 compute-1 ceph-mon[75484]: pgmap v1661: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 170 B/s wr, 6 op/s
Sep 30 18:33:59 compute-1 nova_compute[238822]: 2025-09-30 18:33:59.678 2 DEBUG nova.objects.instance [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:33:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:33:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:33:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:33:59.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:33:59 compute-1 sshd-session[290305]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:33:59 compute-1 sshd-session[290305]: banner exchange: Connection from 110.42.70.108 port 59976: Connection timed out
Sep 30 18:33:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:33:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:33:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:33:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:00 compute-1 nova_compute[238822]: 2025-09-30 18:34:00.700 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:00.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:01 compute-1 nova_compute[238822]: 2025-09-30 18:34:01.528 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:01 compute-1 nova_compute[238822]: 2025-09-30 18:34:01.529 2 WARNING neutronclient.v2_0.client [None req-7087f7f3-1215-4d86-811d-00bd39ad11e9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:01 compute-1 ceph-mon[75484]: pgmap v1662: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 170 B/s wr, 6 op/s
Sep 30 18:34:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:01.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:02 compute-1 sudo[290626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:34:02 compute-1 sshd-session[290622]: Invalid user enomor from 192.210.160.141 port 43198
Sep 30 18:34:02 compute-1 sudo[290626]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:34:02 compute-1 sudo[290626]: pam_unix(sudo:session): session closed for user root
Sep 30 18:34:02 compute-1 sshd-session[290622]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:34:02 compute-1 sshd-session[290622]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:34:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:02.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:02 compute-1 sudo[290673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:34:02 compute-1 sudo[290673]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:34:02 compute-1 podman[290650]: 2025-09-30 18:34:02.913951584 +0000 UTC m=+0.106120990 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, container_name=iscsid, tcib_build_tag=watcher_latest)
Sep 30 18:34:02 compute-1 podman[290652]: 2025-09-30 18:34:02.915243419 +0000 UTC m=+0.095201935 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=multipathd)
Sep 30 18:34:02 compute-1 podman[290651]: 2025-09-30 18:34:02.929687339 +0000 UTC m=+0.121567717 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Sep 30 18:34:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2211126201' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:34:03 compute-1 sudo[290673]: pam_unix(sudo:session): session closed for user root
Sep 30 18:34:03 compute-1 nova_compute[238822]: 2025-09-30 18:34:03.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:03.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:34:04 compute-1 nova_compute[238822]: 2025-09-30 18:34:04.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:04 compute-1 ceph-mon[75484]: pgmap v1663: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 170 B/s wr, 6 op/s
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #88. Immutable memtables: 0.
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.530383) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 53] Flushing memtable with next log file: 88
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257244530493, "job": 53, "event": "flush_started", "num_memtables": 1, "num_entries": 2301, "num_deletes": 251, "total_data_size": 5675929, "memory_usage": 5759216, "flush_reason": "Manual Compaction"}
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 53] Level-0 flush table #89: started
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257244554400, "cf_name": "default", "job": 53, "event": "table_file_creation", "file_number": 89, "file_size": 3684699, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 44358, "largest_seqno": 46654, "table_properties": {"data_size": 3675513, "index_size": 5680, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19402, "raw_average_key_size": 20, "raw_value_size": 3657048, "raw_average_value_size": 3845, "num_data_blocks": 247, "num_entries": 951, "num_filter_entries": 951, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759257041, "oldest_key_time": 1759257041, "file_creation_time": 1759257244, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 89, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 53] Flush lasted 24299 microseconds, and 14003 cpu microseconds.
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.554688) [db/flush_job.cc:967] [default] [JOB 53] Level-0 flush table #89: 3684699 bytes OK
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.554787) [db/memtable_list.cc:519] [default] Level-0 commit table #89 started
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.557068) [db/memtable_list.cc:722] [default] Level-0 commit table #89: memtable #1 done
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.557092) EVENT_LOG_v1 {"time_micros": 1759257244557084, "job": 53, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.557123) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 53] Try to delete WAL files size 5665703, prev total WAL file size 5665703, number of live WAL files 2.
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000085.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.560267) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033373635' seq:72057594037927935, type:22 .. '7061786F730034303137' seq:0, type:0; will stop at (end)
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 54] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 53 Base level 0, inputs: [89(3598KB)], [87(11MB)]
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257244560322, "job": 54, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [89], "files_L6": [87], "score": -1, "input_data_size": 15442413, "oldest_snapshot_seqno": -1}
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 54] Generated table #90: 6942 keys, 13530870 bytes, temperature: kUnknown
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257244642047, "cf_name": "default", "job": 54, "event": "table_file_creation", "file_number": 90, "file_size": 13530870, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13486023, "index_size": 26367, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17413, "raw_key_size": 179901, "raw_average_key_size": 25, "raw_value_size": 13363054, "raw_average_value_size": 1924, "num_data_blocks": 1047, "num_entries": 6942, "num_filter_entries": 6942, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759257244, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 90, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.642490) [db/compaction/compaction_job.cc:1663] [default] [JOB 54] Compacted 1@0 + 1@6 files to L6 => 13530870 bytes
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.644473) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 188.6 rd, 165.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 11.2 +0.0 blob) out(12.9 +0.0 blob), read-write-amplify(7.9) write-amplify(3.7) OK, records in: 7461, records dropped: 519 output_compression: NoCompression
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.644515) EVENT_LOG_v1 {"time_micros": 1759257244644489, "job": 54, "event": "compaction_finished", "compaction_time_micros": 81862, "compaction_time_cpu_micros": 52099, "output_level": 6, "num_output_files": 1, "total_output_size": 13530870, "num_input_records": 7461, "num_output_records": 6942, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000089.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257244645881, "job": 54, "event": "table_file_deletion", "file_number": 89}
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000087.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257244650167, "job": 54, "event": "table_file_deletion", "file_number": 87}
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.560163) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.650330) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.650341) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.650344) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.650348) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:34:04 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:34:04.650351) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:34:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:04.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:05 compute-1 sshd-session[290622]: Failed password for invalid user enomor from 192.210.160.141 port 43198 ssh2
Sep 30 18:34:05 compute-1 ceph-mon[75484]: pgmap v1664: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 170 B/s wr, 5 op/s
Sep 30 18:34:05 compute-1 podman[249638]: time="2025-09-30T18:34:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:34:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:34:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:34:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:34:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8829 "" "Go-http-client/1.1"
Sep 30 18:34:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:05.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:06 compute-1 sshd-session[290622]: Connection closed by invalid user enomor 192.210.160.141 port 43198 [preauth]
Sep 30 18:34:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:06.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:34:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:34:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:34:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:34:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:34:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:34:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:34:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:34:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:34:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2320034758' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:34:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:07.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:07 compute-1 ceph-mon[75484]: pgmap v1665: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 556 B/s rd, 0 op/s
Sep 30 18:34:07 compute-1 ceph-mon[75484]: pgmap v1666: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 678 B/s rd, 0 op/s
Sep 30 18:34:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:34:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:08 compute-1 nova_compute[238822]: 2025-09-30 18:34:08.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:08.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:34:09 compute-1 nova_compute[238822]: 2025-09-30 18:34:09.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:09.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:09 compute-1 ceph-mon[75484]: pgmap v1667: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 339 B/s rd, 0 op/s
Sep 30 18:34:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:10 compute-1 nova_compute[238822]: 2025-09-30 18:34:10.439 2 DEBUG nova.compute.manager [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3f_adtk9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='29a2fe9a-add5-43c1-948a-9df854aa4261',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:34:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:10.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:11 compute-1 sudo[290776]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:34:11 compute-1 sudo[290776]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:34:11 compute-1 sudo[290776]: pam_unix(sudo:session): session closed for user root
Sep 30 18:34:11 compute-1 nova_compute[238822]: 2025-09-30 18:34:11.464 2 DEBUG oslo_concurrency.lockutils [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-29a2fe9a-add5-43c1-948a-9df854aa4261" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:34:11 compute-1 nova_compute[238822]: 2025-09-30 18:34:11.464 2 DEBUG oslo_concurrency.lockutils [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-29a2fe9a-add5-43c1-948a-9df854aa4261" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:34:11 compute-1 nova_compute[238822]: 2025-09-30 18:34:11.465 2 DEBUG nova.network.neutron [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:34:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:11.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:11 compute-1 nova_compute[238822]: 2025-09-30 18:34:11.978 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:12 compute-1 ceph-mon[75484]: pgmap v1668: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 679 B/s rd, 0 op/s
Sep 30 18:34:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:34:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:34:12 compute-1 nova_compute[238822]: 2025-09-30 18:34:12.449 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:12 compute-1 nova_compute[238822]: 2025-09-30 18:34:12.621 2 DEBUG nova.network.neutron [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Updating instance_info_cache with network_info: [{"id": "f942c9c9-85a4-47cf-9428-7e266b83b49b", "address": "fa:16:3e:2b:2d:3d", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf942c9c9-85", "ovs_interfaceid": "f942c9c9-85a4-47cf-9428-7e266b83b49b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:34:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:12.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.130 2 DEBUG oslo_concurrency.lockutils [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-29a2fe9a-add5-43c1-948a-9df854aa4261" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.159 2 DEBUG nova.virt.libvirt.driver [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3f_adtk9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='29a2fe9a-add5-43c1-948a-9df854aa4261',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.161 2 DEBUG nova.virt.libvirt.driver [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Creating instance directory: /var/lib/nova/instances/29a2fe9a-add5-43c1-948a-9df854aa4261 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.161 2 DEBUG nova.virt.libvirt.driver [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Ensure instance console log exists: /var/lib/nova/instances/29a2fe9a-add5-43c1-948a-9df854aa4261/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.162 2 DEBUG nova.virt.libvirt.driver [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.164 2 DEBUG nova.virt.libvirt.vif [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:32:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1847336220',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1847336220',id=25,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:32:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-my5o1s4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:32:59Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=29a2fe9a-add5-43c1-948a-9df854aa4261,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f942c9c9-85a4-47cf-9428-7e266b83b49b", "address": "fa:16:3e:2b:2d:3d", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf942c9c9-85", "ovs_interfaceid": "f942c9c9-85a4-47cf-9428-7e266b83b49b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.164 2 DEBUG nova.network.os_vif_util [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "f942c9c9-85a4-47cf-9428-7e266b83b49b", "address": "fa:16:3e:2b:2d:3d", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf942c9c9-85", "ovs_interfaceid": "f942c9c9-85a4-47cf-9428-7e266b83b49b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.165 2 DEBUG nova.network.os_vif_util [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:2d:3d,bridge_name='br-int',has_traffic_filtering=True,id=f942c9c9-85a4-47cf-9428-7e266b83b49b,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf942c9c9-85') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.166 2 DEBUG os_vif [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:2d:3d,bridge_name='br-int',has_traffic_filtering=True,id=f942c9c9-85a4-47cf-9428-7e266b83b49b,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf942c9c9-85') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.167 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.168 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.169 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1f8aa9c4-a74e-54b3-8225-6274dca4c305', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.177 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf942c9c9-85, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.178 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf942c9c9-85, col_values=(('qos', UUID('9f42b388-4cbf-45ea-a2cd-5b3a966c8ce6')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.179 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf942c9c9-85, col_values=(('external_ids', {'iface-id': 'f942c9c9-85a4-47cf-9428-7e266b83b49b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:2d:3d', 'vm-uuid': '29a2fe9a-add5-43c1-948a-9df854aa4261'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:13 compute-1 NetworkManager[45549]: <info>  [1759257253.1821] manager: (tapf942c9c9-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.194 2 INFO os_vif [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:2d:3d,bridge_name='br-int',has_traffic_filtering=True,id=f942c9c9-85a4-47cf-9428-7e266b83b49b,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf942c9c9-85')
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.194 2 DEBUG nova.virt.libvirt.driver [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.195 2 DEBUG nova.compute.manager [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3f_adtk9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='29a2fe9a-add5-43c1-948a-9df854aa4261',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.196 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.560 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:13.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:13 compute-1 nova_compute[238822]: 2025-09-30 18:34:13.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:34:14 compute-1 ceph-mon[75484]: pgmap v1669: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 678 B/s rd, 0 op/s
Sep 30 18:34:14 compute-1 nova_compute[238822]: 2025-09-30 18:34:14.542 2 DEBUG nova.network.neutron [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Port f942c9c9-85a4-47cf-9428-7e266b83b49b updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:34:14 compute-1 nova_compute[238822]: 2025-09-30 18:34:14.557 2 DEBUG nova.compute.manager [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3f_adtk9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='29a2fe9a-add5-43c1-948a-9df854aa4261',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:34:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:14.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:15.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:16 compute-1 ceph-mon[75484]: pgmap v1670: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 678 B/s rd, 11 KiB/s wr, 2 op/s
Sep 30 18:34:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:16.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:17 compute-1 sudo[290809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:34:17 compute-1 sudo[290809]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:34:17 compute-1 sudo[290809]: pam_unix(sudo:session): session closed for user root
Sep 30 18:34:17 compute-1 kernel: tapf942c9c9-85: entered promiscuous mode
Sep 30 18:34:17 compute-1 NetworkManager[45549]: <info>  [1759257257.2290] manager: (tapf942c9c9-85): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Sep 30 18:34:17 compute-1 ovn_controller[135204]: 2025-09-30T18:34:17Z|00206|binding|INFO|Claiming lport f942c9c9-85a4-47cf-9428-7e266b83b49b for this additional chassis.
Sep 30 18:34:17 compute-1 ovn_controller[135204]: 2025-09-30T18:34:17Z|00207|binding|INFO|f942c9c9-85a4-47cf-9428-7e266b83b49b: Claiming fa:16:3e:2b:2d:3d 10.100.0.5
Sep 30 18:34:17 compute-1 nova_compute[238822]: 2025-09-30 18:34:17.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:17 compute-1 systemd-udevd[290845]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.275 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:2d:3d 10.100.0.5'], port_security=['fa:16:3e:2b:2d:3d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '29a2fe9a-add5-43c1-948a-9df854aa4261', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '10', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=f942c9c9-85a4-47cf-9428-7e266b83b49b) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.276 144543 INFO neutron.agent.ovn.metadata.agent [-] Port f942c9c9-85a4-47cf-9428-7e266b83b49b in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.278 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:34:17 compute-1 NetworkManager[45549]: <info>  [1759257257.2976] device (tapf942c9c9-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.299 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f68fa48d-19b1-4cd4-966e-bb9ac86b6c4e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:17 compute-1 NetworkManager[45549]: <info>  [1759257257.3037] device (tapf942c9c9-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:34:17 compute-1 nova_compute[238822]: 2025-09-30 18:34:17.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:17 compute-1 nova_compute[238822]: 2025-09-30 18:34:17.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:17 compute-1 ovn_controller[135204]: 2025-09-30T18:34:17Z|00208|binding|INFO|Setting lport f942c9c9-85a4-47cf-9428-7e266b83b49b ovn-installed in OVS
Sep 30 18:34:17 compute-1 systemd-machined[195911]: New machine qemu-19-instance-00000019.
Sep 30 18:34:17 compute-1 systemd[1]: Started Virtual Machine qemu-19-instance-00000019.
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.364 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[e6863c53-b9ed-4fa4-8a07-b8b1515ec62a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.368 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[d403779a-bc64-461c-a6fd-0a41121afaef]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.415 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[08f1485b-2644-4348-805d-89a9aafcd890]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.443 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[91796daa-3037-44f7-ab92-c55cda2fbc56]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1507172, 'reachable_time': 21460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290861, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.473 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d6321f2d-3f75-429b-8fe6-0baeab5d73cc]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1507195, 'tstamp': 1507195}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290862, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1507200, 'tstamp': 1507200}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290862, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.475 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:17 compute-1 nova_compute[238822]: 2025-09-30 18:34:17.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:17 compute-1 nova_compute[238822]: 2025-09-30 18:34:17.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.478 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.479 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.480 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.480 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:34:17 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:17.482 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[cd94a8d0-404c-4956-862a-51708f33e3b7]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:17.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:18 compute-1 ceph-mon[75484]: pgmap v1671: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 614 B/s rd, 9.7 KiB/s wr, 2 op/s
Sep 30 18:34:18 compute-1 nova_compute[238822]: 2025-09-30 18:34:18.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:18 compute-1 nova_compute[238822]: 2025-09-30 18:34:18.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:18.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:34:19 compute-1 openstack_network_exporter[251957]: ERROR   18:34:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:34:19 compute-1 openstack_network_exporter[251957]: ERROR   18:34:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:34:19 compute-1 openstack_network_exporter[251957]: ERROR   18:34:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:34:19 compute-1 openstack_network_exporter[251957]: ERROR   18:34:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:34:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:34:19 compute-1 openstack_network_exporter[251957]: ERROR   18:34:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:34:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:34:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:19.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:20 compute-1 ovn_controller[135204]: 2025-09-30T18:34:20Z|00209|binding|INFO|Claiming lport f942c9c9-85a4-47cf-9428-7e266b83b49b for this chassis.
Sep 30 18:34:20 compute-1 ovn_controller[135204]: 2025-09-30T18:34:20Z|00210|binding|INFO|f942c9c9-85a4-47cf-9428-7e266b83b49b: Claiming fa:16:3e:2b:2d:3d 10.100.0.5
Sep 30 18:34:20 compute-1 ovn_controller[135204]: 2025-09-30T18:34:20Z|00211|binding|INFO|Setting lport f942c9c9-85a4-47cf-9428-7e266b83b49b up in Southbound
Sep 30 18:34:20 compute-1 ceph-mon[75484]: pgmap v1672: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 2.7 KiB/s rd, 8.2 KiB/s wr, 4 op/s
Sep 30 18:34:20 compute-1 podman[290911]: 2025-09-30 18:34:20.58219402 +0000 UTC m=+0.109009218 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:34:20 compute-1 podman[290910]: 2025-09-30 18:34:20.628841661 +0000 UTC m=+0.156795719 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:34:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:20.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:21 compute-1 nova_compute[238822]: 2025-09-30 18:34:21.406 2 INFO nova.compute.manager [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Post operation of migration started
Sep 30 18:34:21 compute-1 nova_compute[238822]: 2025-09-30 18:34:21.407 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:21 compute-1 nova_compute[238822]: 2025-09-30 18:34:21.549 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:21 compute-1 nova_compute[238822]: 2025-09-30 18:34:21.550 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:21 compute-1 nova_compute[238822]: 2025-09-30 18:34:21.654 2 DEBUG oslo_concurrency.lockutils [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-29a2fe9a-add5-43c1-948a-9df854aa4261" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:34:21 compute-1 nova_compute[238822]: 2025-09-30 18:34:21.655 2 DEBUG oslo_concurrency.lockutils [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-29a2fe9a-add5-43c1-948a-9df854aa4261" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:34:21 compute-1 nova_compute[238822]: 2025-09-30 18:34:21.656 2 DEBUG nova.network.neutron [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:34:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:21.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:22 compute-1 nova_compute[238822]: 2025-09-30 18:34:22.162 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:22 compute-1 ceph-mon[75484]: pgmap v1673: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 5.0 KiB/s rd, 8.2 KiB/s wr, 7 op/s
Sep 30 18:34:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:22.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:23 compute-1 nova_compute[238822]: 2025-09-30 18:34:23.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:34:23 compute-1 nova_compute[238822]: 2025-09-30 18:34:23.402 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:23.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:23 compute-1 nova_compute[238822]: 2025-09-30 18:34:23.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:34:24 compute-1 ceph-mon[75484]: pgmap v1674: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 8.2 KiB/s wr, 6 op/s
Sep 30 18:34:24 compute-1 nova_compute[238822]: 2025-09-30 18:34:24.567 2 DEBUG nova.network.neutron [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Updating instance_info_cache with network_info: [{"id": "f942c9c9-85a4-47cf-9428-7e266b83b49b", "address": "fa:16:3e:2b:2d:3d", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf942c9c9-85", "ovs_interfaceid": "f942c9c9-85a4-47cf-9428-7e266b83b49b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:34:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:24.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:25 compute-1 nova_compute[238822]: 2025-09-30 18:34:25.076 2 DEBUG oslo_concurrency.lockutils [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-29a2fe9a-add5-43c1-948a-9df854aa4261" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:34:25 compute-1 nova_compute[238822]: 2025-09-30 18:34:25.605 2 DEBUG oslo_concurrency.lockutils [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:25 compute-1 nova_compute[238822]: 2025-09-30 18:34:25.607 2 DEBUG oslo_concurrency.lockutils [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:25 compute-1 nova_compute[238822]: 2025-09-30 18:34:25.607 2 DEBUG oslo_concurrency.lockutils [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:25 compute-1 nova_compute[238822]: 2025-09-30 18:34:25.614 2 INFO nova.virt.libvirt.driver [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:34:25 compute-1 virtqemud[239124]: Domain id=19 name='instance-00000019' uuid=29a2fe9a-add5-43c1-948a-9df854aa4261 is tainted: custom-monitor
Sep 30 18:34:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:25.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:26 compute-1 ceph-mon[75484]: pgmap v1675: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 8.2 KiB/s wr, 7 op/s
Sep 30 18:34:26 compute-1 nova_compute[238822]: 2025-09-30 18:34:26.625 2 INFO nova.virt.libvirt.driver [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:34:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:34:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:26.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:34:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:27 compute-1 podman[290969]: 2025-09-30 18:34:27.601655901 +0000 UTC m=+0.131389653 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 18:34:27 compute-1 nova_compute[238822]: 2025-09-30 18:34:27.632 2 INFO nova.virt.libvirt.driver [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:34:27 compute-1 nova_compute[238822]: 2025-09-30 18:34:27.639 2 DEBUG nova.compute.manager [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:34:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:27.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:28 compute-1 nova_compute[238822]: 2025-09-30 18:34:28.153 2 DEBUG nova.objects.instance [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:34:28 compute-1 nova_compute[238822]: 2025-09-30 18:34:28.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:28 compute-1 ceph-mon[75484]: pgmap v1676: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 170 B/s wr, 6 op/s
Sep 30 18:34:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:28.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:28 compute-1 nova_compute[238822]: 2025-09-30 18:34:28.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:34:29 compute-1 nova_compute[238822]: 2025-09-30 18:34:29.176 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:29 compute-1 nova_compute[238822]: 2025-09-30 18:34:29.494 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:29 compute-1 nova_compute[238822]: 2025-09-30 18:34:29.495 2 WARNING neutronclient.v2_0.client [None req-5bbdd9fe-04ff-4dd1-80b6-ecbd7218c9ad 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:29 compute-1 unix_chkpwd[290992]: password check failed for user (root)
Sep 30 18:34:29 compute-1 sshd-session[290988]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:34:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:29.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:30 compute-1 ceph-mon[75484]: pgmap v1677: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 170 B/s wr, 6 op/s
Sep 30 18:34:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:30.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:31.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:32 compute-1 sshd-session[290988]: Failed password for root from 192.210.160.141 port 53642 ssh2
Sep 30 18:34:32 compute-1 ceph-mon[75484]: pgmap v1678: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 3.0 KiB/s rd, 85 B/s wr, 3 op/s
Sep 30 18:34:32 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1148247937' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:34:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:32.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:32 compute-1 sshd-session[290988]: Connection closed by authenticating user root 192.210.160.141 port 53642 [preauth]
Sep 30 18:34:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:33 compute-1 nova_compute[238822]: 2025-09-30 18:34:33.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:33 compute-1 podman[290997]: 2025-09-30 18:34:33.56492861 +0000 UTC m=+0.098324829 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:34:33 compute-1 podman[290999]: 2025-09-30 18:34:33.575139186 +0000 UTC m=+0.096732656 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd)
Sep 30 18:34:33 compute-1 podman[290998]: 2025-09-30 18:34:33.578354253 +0000 UTC m=+0.104527677 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, config_id=edpm, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 18:34:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:33.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:33 compute-1 nova_compute[238822]: 2025-09-30 18:34:33.918 2 DEBUG oslo_concurrency.lockutils [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "29a2fe9a-add5-43c1-948a-9df854aa4261" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:33 compute-1 nova_compute[238822]: 2025-09-30 18:34:33.919 2 DEBUG oslo_concurrency.lockutils [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "29a2fe9a-add5-43c1-948a-9df854aa4261" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:33 compute-1 nova_compute[238822]: 2025-09-30 18:34:33.920 2 DEBUG oslo_concurrency.lockutils [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "29a2fe9a-add5-43c1-948a-9df854aa4261-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:33 compute-1 nova_compute[238822]: 2025-09-30 18:34:33.920 2 DEBUG oslo_concurrency.lockutils [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "29a2fe9a-add5-43c1-948a-9df854aa4261-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:33 compute-1 nova_compute[238822]: 2025-09-30 18:34:33.920 2 DEBUG oslo_concurrency.lockutils [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "29a2fe9a-add5-43c1-948a-9df854aa4261-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:33 compute-1 nova_compute[238822]: 2025-09-30 18:34:33.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:33 compute-1 nova_compute[238822]: 2025-09-30 18:34:33.941 2 INFO nova.compute.manager [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Terminating instance
Sep 30 18:34:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:34:34 compute-1 ceph-mon[75484]: pgmap v1679: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:34:34 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/698006576' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:34:34 compute-1 nova_compute[238822]: 2025-09-30 18:34:34.461 2 DEBUG nova.compute.manager [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:34:34 compute-1 kernel: tapf942c9c9-85 (unregistering): left promiscuous mode
Sep 30 18:34:34 compute-1 NetworkManager[45549]: <info>  [1759257274.5302] device (tapf942c9c9-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:34:34 compute-1 ovn_controller[135204]: 2025-09-30T18:34:34Z|00212|binding|INFO|Releasing lport f942c9c9-85a4-47cf-9428-7e266b83b49b from this chassis (sb_readonly=0)
Sep 30 18:34:34 compute-1 ovn_controller[135204]: 2025-09-30T18:34:34Z|00213|binding|INFO|Setting lport f942c9c9-85a4-47cf-9428-7e266b83b49b down in Southbound
Sep 30 18:34:34 compute-1 ovn_controller[135204]: 2025-09-30T18:34:34Z|00214|binding|INFO|Removing iface tapf942c9c9-85 ovn-installed in OVS
Sep 30 18:34:34 compute-1 nova_compute[238822]: 2025-09-30 18:34:34.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:34 compute-1 nova_compute[238822]: 2025-09-30 18:34:34.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.557 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:2d:3d 10.100.0.5'], port_security=['fa:16:3e:2b:2d:3d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '29a2fe9a-add5-43c1-948a-9df854aa4261', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '15', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=f942c9c9-85a4-47cf-9428-7e266b83b49b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.559 144543 INFO neutron.agent.ovn.metadata.agent [-] Port f942c9c9-85a4-47cf-9428-7e266b83b49b in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.561 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:34:34 compute-1 nova_compute[238822]: 2025-09-30 18:34:34.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.592 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[caf9ca82-749a-418f-b940-c229c3f8e3f8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:34 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Deactivated successfully.
Sep 30 18:34:34 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Consumed 2.333s CPU time.
Sep 30 18:34:34 compute-1 systemd-machined[195911]: Machine qemu-19-instance-00000019 terminated.
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.639 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[ce462cea-b825-41fb-997a-cbdfebc0292d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.643 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[2c307613-ab45-4427-9a9b-abe9847dc69b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.690 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[a6f30e15-ff97-4c7c-b17e-6726fdf82f77]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:34 compute-1 nova_compute[238822]: 2025-09-30 18:34:34.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:34 compute-1 nova_compute[238822]: 2025-09-30 18:34:34.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:34 compute-1 nova_compute[238822]: 2025-09-30 18:34:34.717 2 INFO nova.virt.libvirt.driver [-] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Instance destroyed successfully.
Sep 30 18:34:34 compute-1 nova_compute[238822]: 2025-09-30 18:34:34.717 2 DEBUG nova.objects.instance [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lazy-loading 'resources' on Instance uuid 29a2fe9a-add5-43c1-948a-9df854aa4261 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.723 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[63cae3be-7a2f-49ac-91ea-bafd7de67ad4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1507172, 'reachable_time': 21460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291075, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.751 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[eec0fec8-a21d-4f37-a718-749fa66a955b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1507195, 'tstamp': 1507195}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291083, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1507200, 'tstamp': 1507200}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291083, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.753 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:34 compute-1 nova_compute[238822]: 2025-09-30 18:34:34.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:34 compute-1 nova_compute[238822]: 2025-09-30 18:34:34.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.764 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.765 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.765 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.766 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:34:34 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:34.768 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c766b9-6181-4afa-a12b-68e8f1bdfa91]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:34 compute-1 sshd-session[291057]: Invalid user administrator from 216.10.242.161 port 57952
Sep 30 18:34:34 compute-1 sshd-session[291057]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:34:34 compute-1 sshd-session[291057]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161
Sep 30 18:34:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:34.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.226 2 DEBUG nova.virt.libvirt.vif [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:32:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1847336220',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1847336220',id=25,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:32:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-my5o1s4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:34:28Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=29a2fe9a-add5-43c1-948a-9df854aa4261,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f942c9c9-85a4-47cf-9428-7e266b83b49b", "address": "fa:16:3e:2b:2d:3d", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf942c9c9-85", "ovs_interfaceid": "f942c9c9-85a4-47cf-9428-7e266b83b49b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.227 2 DEBUG nova.network.os_vif_util [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "f942c9c9-85a4-47cf-9428-7e266b83b49b", "address": "fa:16:3e:2b:2d:3d", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf942c9c9-85", "ovs_interfaceid": "f942c9c9-85a4-47cf-9428-7e266b83b49b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.227 2 DEBUG nova.network.os_vif_util [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2b:2d:3d,bridge_name='br-int',has_traffic_filtering=True,id=f942c9c9-85a4-47cf-9428-7e266b83b49b,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf942c9c9-85') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.228 2 DEBUG os_vif [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:2d:3d,bridge_name='br-int',has_traffic_filtering=True,id=f942c9c9-85a4-47cf-9428-7e266b83b49b,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf942c9c9-85') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.231 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf942c9c9-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.236 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=9f42b388-4cbf-45ea-a2cd-5b3a966c8ce6) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.240 2 INFO os_vif [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:2d:3d,bridge_name='br-int',has_traffic_filtering=True,id=f942c9c9-85a4-47cf-9428-7e266b83b49b,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf942c9c9-85')
Sep 30 18:34:35 compute-1 podman[249638]: time="2025-09-30T18:34:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:34:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:34:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:34:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:34:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8821 "" "Go-http-client/1.1"
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.814 2 INFO nova.virt.libvirt.driver [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Deleting instance files /var/lib/nova/instances/29a2fe9a-add5-43c1-948a-9df854aa4261_del
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.815 2 INFO nova.virt.libvirt.driver [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Deletion of /var/lib/nova/instances/29a2fe9a-add5-43c1-948a-9df854aa4261_del complete
Sep 30 18:34:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:35.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:35 compute-1 unix_chkpwd[291107]: password check failed for user (root)
Sep 30 18:34:35 compute-1 sshd-session[291101]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201  user=root
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.947 2 DEBUG nova.compute.manager [req-47aeefc6-967d-4ad2-9678-fde123aac990 req-5e7940e5-569c-421f-898a-708ce6ffa36c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Received event network-vif-unplugged-f942c9c9-85a4-47cf-9428-7e266b83b49b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.947 2 DEBUG oslo_concurrency.lockutils [req-47aeefc6-967d-4ad2-9678-fde123aac990 req-5e7940e5-569c-421f-898a-708ce6ffa36c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "29a2fe9a-add5-43c1-948a-9df854aa4261-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.947 2 DEBUG oslo_concurrency.lockutils [req-47aeefc6-967d-4ad2-9678-fde123aac990 req-5e7940e5-569c-421f-898a-708ce6ffa36c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "29a2fe9a-add5-43c1-948a-9df854aa4261-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.948 2 DEBUG oslo_concurrency.lockutils [req-47aeefc6-967d-4ad2-9678-fde123aac990 req-5e7940e5-569c-421f-898a-708ce6ffa36c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "29a2fe9a-add5-43c1-948a-9df854aa4261-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.948 2 DEBUG nova.compute.manager [req-47aeefc6-967d-4ad2-9678-fde123aac990 req-5e7940e5-569c-421f-898a-708ce6ffa36c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] No waiting events found dispatching network-vif-unplugged-f942c9c9-85a4-47cf-9428-7e266b83b49b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:34:35 compute-1 nova_compute[238822]: 2025-09-30 18:34:35.948 2 DEBUG nova.compute.manager [req-47aeefc6-967d-4ad2-9678-fde123aac990 req-5e7940e5-569c-421f-898a-708ce6ffa36c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Received event network-vif-unplugged-f942c9c9-85a4-47cf-9428-7e266b83b49b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:34:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:36 compute-1 nova_compute[238822]: 2025-09-30 18:34:36.330 2 INFO nova.compute.manager [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Took 1.87 seconds to destroy the instance on the hypervisor.
Sep 30 18:34:36 compute-1 nova_compute[238822]: 2025-09-30 18:34:36.331 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:34:36 compute-1 nova_compute[238822]: 2025-09-30 18:34:36.332 2 DEBUG nova.compute.manager [-] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:34:36 compute-1 nova_compute[238822]: 2025-09-30 18:34:36.332 2 DEBUG nova.network.neutron [-] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:34:36 compute-1 nova_compute[238822]: 2025-09-30 18:34:36.332 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:36 compute-1 ceph-mon[75484]: pgmap v1680: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 0 B/s wr, 1 op/s
Sep 30 18:34:36 compute-1 nova_compute[238822]: 2025-09-30 18:34:36.476 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:36 compute-1 sshd-session[291057]: Failed password for invalid user administrator from 216.10.242.161 port 57952 ssh2
Sep 30 18:34:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:36.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:37 compute-1 sudo[291110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:34:37 compute-1 sudo[291110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:34:37 compute-1 sudo[291110]: pam_unix(sudo:session): session closed for user root
Sep 30 18:34:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2476786179' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:34:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2476786179' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:34:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:34:37 compute-1 nova_compute[238822]: 2025-09-30 18:34:37.718 2 DEBUG nova.compute.manager [req-bee26010-ad0e-4d8c-929e-f6f696af3a53 req-5285afd3-b3b7-4fc4-8cf8-53b02e682978 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Received event network-vif-deleted-f942c9c9-85a4-47cf-9428-7e266b83b49b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:34:37 compute-1 nova_compute[238822]: 2025-09-30 18:34:37.718 2 INFO nova.compute.manager [req-bee26010-ad0e-4d8c-929e-f6f696af3a53 req-5285afd3-b3b7-4fc4-8cf8-53b02e682978 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Neutron deleted interface f942c9c9-85a4-47cf-9428-7e266b83b49b; detaching it from the instance and deleting it from the info cache
Sep 30 18:34:37 compute-1 nova_compute[238822]: 2025-09-30 18:34:37.718 2 DEBUG nova.network.neutron [req-bee26010-ad0e-4d8c-929e-f6f696af3a53 req-5285afd3-b3b7-4fc4-8cf8-53b02e682978 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:34:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:37.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:38 compute-1 sshd-session[291101]: Failed password for root from 8.243.64.201 port 55570 ssh2
Sep 30 18:34:38 compute-1 nova_compute[238822]: 2025-09-30 18:34:38.047 2 DEBUG nova.compute.manager [req-4f85084e-ab94-4927-b2b6-bd977827f6b3 req-fe3897f7-2443-4d00-bc1f-cd66acd228da 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Received event network-vif-unplugged-f942c9c9-85a4-47cf-9428-7e266b83b49b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:34:38 compute-1 nova_compute[238822]: 2025-09-30 18:34:38.048 2 DEBUG oslo_concurrency.lockutils [req-4f85084e-ab94-4927-b2b6-bd977827f6b3 req-fe3897f7-2443-4d00-bc1f-cd66acd228da 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "29a2fe9a-add5-43c1-948a-9df854aa4261-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:38 compute-1 nova_compute[238822]: 2025-09-30 18:34:38.048 2 DEBUG oslo_concurrency.lockutils [req-4f85084e-ab94-4927-b2b6-bd977827f6b3 req-fe3897f7-2443-4d00-bc1f-cd66acd228da 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "29a2fe9a-add5-43c1-948a-9df854aa4261-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:38 compute-1 nova_compute[238822]: 2025-09-30 18:34:38.048 2 DEBUG oslo_concurrency.lockutils [req-4f85084e-ab94-4927-b2b6-bd977827f6b3 req-fe3897f7-2443-4d00-bc1f-cd66acd228da 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "29a2fe9a-add5-43c1-948a-9df854aa4261-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:38 compute-1 nova_compute[238822]: 2025-09-30 18:34:38.049 2 DEBUG nova.compute.manager [req-4f85084e-ab94-4927-b2b6-bd977827f6b3 req-fe3897f7-2443-4d00-bc1f-cd66acd228da 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] No waiting events found dispatching network-vif-unplugged-f942c9c9-85a4-47cf-9428-7e266b83b49b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:34:38 compute-1 nova_compute[238822]: 2025-09-30 18:34:38.049 2 DEBUG nova.compute.manager [req-4f85084e-ab94-4927-b2b6-bd977827f6b3 req-fe3897f7-2443-4d00-bc1f-cd66acd228da 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Received event network-vif-unplugged-f942c9c9-85a4-47cf-9428-7e266b83b49b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:34:38 compute-1 nova_compute[238822]: 2025-09-30 18:34:38.109 2 DEBUG nova.network.neutron [-] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:34:38 compute-1 nova_compute[238822]: 2025-09-30 18:34:38.230 2 DEBUG nova.compute.manager [req-bee26010-ad0e-4d8c-929e-f6f696af3a53 req-5285afd3-b3b7-4fc4-8cf8-53b02e682978 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Detach interface failed, port_id=f942c9c9-85a4-47cf-9428-7e266b83b49b, reason: Instance 29a2fe9a-add5-43c1-948a-9df854aa4261 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:34:38 compute-1 ceph-mon[75484]: pgmap v1681: 353 pgs: 353 active+clean; 200 MiB data, 418 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:34:38 compute-1 sshd-session[291057]: Received disconnect from 216.10.242.161 port 57952:11: Bye Bye [preauth]
Sep 30 18:34:38 compute-1 sshd-session[291057]: Disconnected from invalid user administrator 216.10.242.161 port 57952 [preauth]
Sep 30 18:34:38 compute-1 nova_compute[238822]: 2025-09-30 18:34:38.616 2 INFO nova.compute.manager [-] [instance: 29a2fe9a-add5-43c1-948a-9df854aa4261] Took 2.28 seconds to deallocate network for instance.
Sep 30 18:34:38 compute-1 sshd-session[291101]: Received disconnect from 8.243.64.201 port 55570:11: Bye Bye [preauth]
Sep 30 18:34:38 compute-1 sshd-session[291101]: Disconnected from authenticating user root 8.243.64.201 port 55570 [preauth]
Sep 30 18:34:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:38.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:38 compute-1 nova_compute[238822]: 2025-09-30 18:34:38.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:34:39 compute-1 nova_compute[238822]: 2025-09-30 18:34:39.146 2 DEBUG oslo_concurrency.lockutils [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:39 compute-1 nova_compute[238822]: 2025-09-30 18:34:39.147 2 DEBUG oslo_concurrency.lockutils [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:39 compute-1 nova_compute[238822]: 2025-09-30 18:34:39.153 2 DEBUG oslo_concurrency.lockutils [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:39 compute-1 nova_compute[238822]: 2025-09-30 18:34:39.193 2 INFO nova.scheduler.client.report [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Deleted allocations for instance 29a2fe9a-add5-43c1-948a-9df854aa4261
Sep 30 18:34:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:39.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:40 compute-1 nova_compute[238822]: 2025-09-30 18:34:40.222 2 DEBUG oslo_concurrency.lockutils [None req-e1f49dad-68ed-4688-9a31-86b007eefba9 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "29a2fe9a-add5-43c1-948a-9df854aa4261" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.303s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:40 compute-1 nova_compute[238822]: 2025-09-30 18:34:40.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:40 compute-1 ceph-mon[75484]: pgmap v1682: 353 pgs: 353 active+clean; 140 MiB data, 380 MiB used, 40 GiB / 40 GiB avail; 13 KiB/s rd, 1.2 KiB/s wr, 19 op/s
Sep 30 18:34:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:40.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:41 compute-1 nova_compute[238822]: 2025-09-30 18:34:41.008 2 DEBUG oslo_concurrency.lockutils [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "78e1566d-9c5e-49b1-a044-0c46cf002c66" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:41 compute-1 nova_compute[238822]: 2025-09-30 18:34:41.009 2 DEBUG oslo_concurrency.lockutils [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "78e1566d-9c5e-49b1-a044-0c46cf002c66" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:41 compute-1 nova_compute[238822]: 2025-09-30 18:34:41.010 2 DEBUG oslo_concurrency.lockutils [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "78e1566d-9c5e-49b1-a044-0c46cf002c66-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:41 compute-1 nova_compute[238822]: 2025-09-30 18:34:41.010 2 DEBUG oslo_concurrency.lockutils [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "78e1566d-9c5e-49b1-a044-0c46cf002c66-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:41 compute-1 nova_compute[238822]: 2025-09-30 18:34:41.010 2 DEBUG oslo_concurrency.lockutils [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "78e1566d-9c5e-49b1-a044-0c46cf002c66-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:41 compute-1 nova_compute[238822]: 2025-09-30 18:34:41.025 2 INFO nova.compute.manager [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Terminating instance
Sep 30 18:34:41 compute-1 nova_compute[238822]: 2025-09-30 18:34:41.545 2 DEBUG nova.compute.manager [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:34:41 compute-1 kernel: tap48ba4743-59 (unregistering): left promiscuous mode
Sep 30 18:34:41 compute-1 NetworkManager[45549]: <info>  [1759257281.6018] device (tap48ba4743-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:34:41 compute-1 ovn_controller[135204]: 2025-09-30T18:34:41Z|00215|binding|INFO|Releasing lport 48ba4743-596d-47a6-a246-afe70e6e1fc6 from this chassis (sb_readonly=0)
Sep 30 18:34:41 compute-1 ovn_controller[135204]: 2025-09-30T18:34:41Z|00216|binding|INFO|Setting lport 48ba4743-596d-47a6-a246-afe70e6e1fc6 down in Southbound
Sep 30 18:34:41 compute-1 ovn_controller[135204]: 2025-09-30T18:34:41Z|00217|binding|INFO|Removing iface tap48ba4743-59 ovn-installed in OVS
Sep 30 18:34:41 compute-1 nova_compute[238822]: 2025-09-30 18:34:41.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:41 compute-1 nova_compute[238822]: 2025-09-30 18:34:41.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:41 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:41.622 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:e6:35 10.100.0.12'], port_security=['fa:16:3e:47:e6:35 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '78e1566d-9c5e-49b1-a044-0c46cf002c66', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '15', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=48ba4743-596d-47a6-a246-afe70e6e1fc6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:34:41 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:41.623 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 48ba4743-596d-47a6-a246-afe70e6e1fc6 in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:34:41 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:41.625 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6901f664-336b-42d2-bbf7-58951befc8d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:34:41 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:41.626 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[47017e7d-5021-4656-b591-0fd756a66098]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:41 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:41.627 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 namespace which is not needed anymore
Sep 30 18:34:41 compute-1 nova_compute[238822]: 2025-09-30 18:34:41.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:41 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000018.scope: Deactivated successfully.
Sep 30 18:34:41 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000018.scope: Consumed 4.028s CPU time.
Sep 30 18:34:41 compute-1 systemd-machined[195911]: Machine qemu-18-instance-00000018 terminated.
Sep 30 18:34:41 compute-1 nova_compute[238822]: 2025-09-30 18:34:41.800 2 INFO nova.virt.libvirt.driver [-] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Instance destroyed successfully.
Sep 30 18:34:41 compute-1 nova_compute[238822]: 2025-09-30 18:34:41.803 2 DEBUG nova.objects.instance [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lazy-loading 'resources' on Instance uuid 78e1566d-9c5e-49b1-a044-0c46cf002c66 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:34:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:41.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:41 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[290551]: [NOTICE]   (290555) : haproxy version is 3.0.5-8e879a5
Sep 30 18:34:41 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[290551]: [NOTICE]   (290555) : path to executable is /usr/sbin/haproxy
Sep 30 18:34:41 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[290551]: [WARNING]  (290555) : Exiting Master process...
Sep 30 18:34:41 compute-1 podman[291168]: 2025-09-30 18:34:41.868340097 +0000 UTC m=+0.059433667 container kill 23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:34:41 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[290551]: [ALERT]    (290555) : Current worker (290557) exited with code 143 (Terminated)
Sep 30 18:34:41 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[290551]: [WARNING]  (290555) : All workers exited. Exiting... (0)
Sep 30 18:34:41 compute-1 systemd[1]: libpod-23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4.scope: Deactivated successfully.
Sep 30 18:34:41 compute-1 podman[291190]: 2025-09-30 18:34:41.943652542 +0000 UTC m=+0.050713511 container died 23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:34:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:41 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4-userdata-shm.mount: Deactivated successfully.
Sep 30 18:34:41 compute-1 systemd[1]: var-lib-containers-storage-overlay-ccfd5cdc78e07f71e4fce4c1196b9bcbc4a5279dd3c4c929e9ff9c2d7197ced9-merged.mount: Deactivated successfully.
Sep 30 18:34:42 compute-1 podman[291190]: 2025-09-30 18:34:42.000844118 +0000 UTC m=+0.107905037 container cleanup 23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:34:42 compute-1 systemd[1]: libpod-conmon-23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4.scope: Deactivated successfully.
Sep 30 18:34:42 compute-1 podman[291193]: 2025-09-30 18:34:42.045214677 +0000 UTC m=+0.127485726 container remove 23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Sep 30 18:34:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:42.054 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[fef9ebe8-6da4-4d7a-9938-549882babc64]: (4, ("Tue Sep 30 06:34:41 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 (23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4)\n23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4\nTue Sep 30 06:34:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 (23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4)\n23d3d599a0179717a400ff91d9ca97ff7f6813917c5a525ad8722fe24345f0a4\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:34:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:42.057 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2ffce3f6-cac1-41a0-a0c2-a5abcd9ff5dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:42.058 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:34:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:42.059 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a6af8c1a-b49f-487c-b1a9-3e7ee4c2e0aa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:42.060 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:42 compute-1 kernel: tap6901f664-30: left promiscuous mode
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:42.103 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[de0c1cd4-7b33-4ba6-842f-35c4c8877366]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:42.134 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[01b902e6-43af-444d-a3f9-cdc85f2fec8e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:42.135 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0f70fb-4e9a-40e7-93b0-2acbb36bd7c4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:42.164 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[76528197-75cb-445c-90ab-0f8efbc6dad6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1507159, 'reachable_time': 23777, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291228, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:42.167 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:34:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:42.167 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[c17eb988-7df2-47d6-95fa-71146fafc125]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:34:42 compute-1 systemd[1]: run-netns-ovnmeta\x2d6901f664\x2d336b\x2d42d2\x2dbbf7\x2d58951befc8d1.mount: Deactivated successfully.
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.310 2 DEBUG nova.virt.libvirt.vif [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:32:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-760306639',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-760306639',id=24,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:32:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-7ntnt7t9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:34:00Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=78e1566d-9c5e-49b1-a044-0c46cf002c66,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "48ba4743-596d-47a6-a246-afe70e6e1fc6", "address": "fa:16:3e:47:e6:35", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48ba4743-59", "ovs_interfaceid": "48ba4743-596d-47a6-a246-afe70e6e1fc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.311 2 DEBUG nova.network.os_vif_util [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "48ba4743-596d-47a6-a246-afe70e6e1fc6", "address": "fa:16:3e:47:e6:35", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48ba4743-59", "ovs_interfaceid": "48ba4743-596d-47a6-a246-afe70e6e1fc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.312 2 DEBUG nova.network.os_vif_util [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:47:e6:35,bridge_name='br-int',has_traffic_filtering=True,id=48ba4743-596d-47a6-a246-afe70e6e1fc6,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48ba4743-59') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.312 2 DEBUG os_vif [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:e6:35,bridge_name='br-int',has_traffic_filtering=True,id=48ba4743-596d-47a6-a246-afe70e6e1fc6,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48ba4743-59') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.314 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48ba4743-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.320 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=5e451177-af5d-4ae5-8e22-e60110d9117c) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.325 2 INFO os_vif [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:e6:35,bridge_name='br-int',has_traffic_filtering=True,id=48ba4743-596d-47a6-a246-afe70e6e1fc6,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48ba4743-59')
Sep 30 18:34:42 compute-1 ceph-mon[75484]: pgmap v1683: 353 pgs: 353 active+clean; 121 MiB data, 372 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.672 2 DEBUG nova.compute.manager [req-edd166f7-2a5c-45fe-856f-17eeccda9d31 req-0efc28a9-c6ad-456a-afeb-e81f19a44f1f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Received event network-vif-unplugged-48ba4743-596d-47a6-a246-afe70e6e1fc6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.673 2 DEBUG oslo_concurrency.lockutils [req-edd166f7-2a5c-45fe-856f-17eeccda9d31 req-0efc28a9-c6ad-456a-afeb-e81f19a44f1f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "78e1566d-9c5e-49b1-a044-0c46cf002c66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.673 2 DEBUG oslo_concurrency.lockutils [req-edd166f7-2a5c-45fe-856f-17eeccda9d31 req-0efc28a9-c6ad-456a-afeb-e81f19a44f1f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "78e1566d-9c5e-49b1-a044-0c46cf002c66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.674 2 DEBUG oslo_concurrency.lockutils [req-edd166f7-2a5c-45fe-856f-17eeccda9d31 req-0efc28a9-c6ad-456a-afeb-e81f19a44f1f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "78e1566d-9c5e-49b1-a044-0c46cf002c66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.674 2 DEBUG nova.compute.manager [req-edd166f7-2a5c-45fe-856f-17eeccda9d31 req-0efc28a9-c6ad-456a-afeb-e81f19a44f1f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] No waiting events found dispatching network-vif-unplugged-48ba4743-596d-47a6-a246-afe70e6e1fc6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.675 2 DEBUG nova.compute.manager [req-edd166f7-2a5c-45fe-856f-17eeccda9d31 req-0efc28a9-c6ad-456a-afeb-e81f19a44f1f 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Received event network-vif-unplugged-48ba4743-596d-47a6-a246-afe70e6e1fc6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.797 2 INFO nova.virt.libvirt.driver [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Deleting instance files /var/lib/nova/instances/78e1566d-9c5e-49b1-a044-0c46cf002c66_del
Sep 30 18:34:42 compute-1 nova_compute[238822]: 2025-09-30 18:34:42.798 2 INFO nova.virt.libvirt.driver [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Deletion of /var/lib/nova/instances/78e1566d-9c5e-49b1-a044-0c46cf002c66_del complete
Sep 30 18:34:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:42.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:43 compute-1 nova_compute[238822]: 2025-09-30 18:34:43.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:34:43 compute-1 nova_compute[238822]: 2025-09-30 18:34:43.312 2 INFO nova.compute.manager [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Took 1.77 seconds to destroy the instance on the hypervisor.
Sep 30 18:34:43 compute-1 nova_compute[238822]: 2025-09-30 18:34:43.313 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:34:43 compute-1 nova_compute[238822]: 2025-09-30 18:34:43.314 2 DEBUG nova.compute.manager [-] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:34:43 compute-1 nova_compute[238822]: 2025-09-30 18:34:43.314 2 DEBUG nova.network.neutron [-] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:34:43 compute-1 nova_compute[238822]: 2025-09-30 18:34:43.314 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:43 compute-1 nova_compute[238822]: 2025-09-30 18:34:43.444 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:34:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:43.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:43 compute-1 nova_compute[238822]: 2025-09-30 18:34:43.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:34:44 compute-1 ceph-mon[75484]: pgmap v1684: 353 pgs: 353 active+clean; 121 MiB data, 372 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.571 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.571 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.572 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.572 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.660 2 DEBUG nova.compute.manager [req-36468597-f467-468f-8fe5-504feaf403a7 req-f12f7c35-0e96-41cd-8edd-e37e195aab4a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Received event network-vif-deleted-48ba4743-596d-47a6-a246-afe70e6e1fc6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.660 2 INFO nova.compute.manager [req-36468597-f467-468f-8fe5-504feaf403a7 req-f12f7c35-0e96-41cd-8edd-e37e195aab4a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Neutron deleted interface 48ba4743-596d-47a6-a246-afe70e6e1fc6; detaching it from the instance and deleting it from the info cache
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.661 2 DEBUG nova.network.neutron [req-36468597-f467-468f-8fe5-504feaf403a7 req-f12f7c35-0e96-41cd-8edd-e37e195aab4a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.764 2 DEBUG nova.compute.manager [req-9a73626c-88cc-4064-9a61-c87ef43d1e67 req-bdba3388-4016-417b-9204-0534ee8ba878 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Received event network-vif-unplugged-48ba4743-596d-47a6-a246-afe70e6e1fc6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.764 2 DEBUG oslo_concurrency.lockutils [req-9a73626c-88cc-4064-9a61-c87ef43d1e67 req-bdba3388-4016-417b-9204-0534ee8ba878 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "78e1566d-9c5e-49b1-a044-0c46cf002c66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.765 2 DEBUG oslo_concurrency.lockutils [req-9a73626c-88cc-4064-9a61-c87ef43d1e67 req-bdba3388-4016-417b-9204-0534ee8ba878 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "78e1566d-9c5e-49b1-a044-0c46cf002c66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.765 2 DEBUG oslo_concurrency.lockutils [req-9a73626c-88cc-4064-9a61-c87ef43d1e67 req-bdba3388-4016-417b-9204-0534ee8ba878 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "78e1566d-9c5e-49b1-a044-0c46cf002c66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.766 2 DEBUG nova.compute.manager [req-9a73626c-88cc-4064-9a61-c87ef43d1e67 req-bdba3388-4016-417b-9204-0534ee8ba878 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] No waiting events found dispatching network-vif-unplugged-48ba4743-596d-47a6-a246-afe70e6e1fc6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:34:44 compute-1 nova_compute[238822]: 2025-09-30 18:34:44.766 2 DEBUG nova.compute.manager [req-9a73626c-88cc-4064-9a61-c87ef43d1e67 req-bdba3388-4016-417b-9204-0534ee8ba878 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Received event network-vif-unplugged-48ba4743-596d-47a6-a246-afe70e6e1fc6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:34:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:44.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:45 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:34:45 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1617100929' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:34:45 compute-1 nova_compute[238822]: 2025-09-30 18:34:45.064 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:34:45 compute-1 nova_compute[238822]: 2025-09-30 18:34:45.106 2 DEBUG nova.network.neutron [-] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:34:45 compute-1 nova_compute[238822]: 2025-09-30 18:34:45.173 2 DEBUG nova.compute.manager [req-36468597-f467-468f-8fe5-504feaf403a7 req-f12f7c35-0e96-41cd-8edd-e37e195aab4a 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Detach interface failed, port_id=48ba4743-596d-47a6-a246-afe70e6e1fc6, reason: Instance 78e1566d-9c5e-49b1-a044-0c46cf002c66 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:34:45 compute-1 nova_compute[238822]: 2025-09-30 18:34:45.305 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:34:45 compute-1 nova_compute[238822]: 2025-09-30 18:34:45.307 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:34:45 compute-1 nova_compute[238822]: 2025-09-30 18:34:45.331 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:34:45 compute-1 nova_compute[238822]: 2025-09-30 18:34:45.331 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4747MB free_disk=39.9466667175293GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:34:45 compute-1 nova_compute[238822]: 2025-09-30 18:34:45.332 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:45 compute-1 nova_compute[238822]: 2025-09-30 18:34:45.332 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:45 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1617100929' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:34:45 compute-1 nova_compute[238822]: 2025-09-30 18:34:45.612 2 INFO nova.compute.manager [-] [instance: 78e1566d-9c5e-49b1-a044-0c46cf002c66] Took 2.30 seconds to deallocate network for instance.
Sep 30 18:34:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:45.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:46 compute-1 nova_compute[238822]: 2025-09-30 18:34:46.137 2 DEBUG oslo_concurrency.lockutils [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:46 compute-1 nova_compute[238822]: 2025-09-30 18:34:46.378 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 78e1566d-9c5e-49b1-a044-0c46cf002c66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:34:46 compute-1 nova_compute[238822]: 2025-09-30 18:34:46.379 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:34:46 compute-1 nova_compute[238822]: 2025-09-30 18:34:46.379 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:34:45 up  4:12,  0 user,  load average: 0.25, 0.30, 0.56\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_c634e1c17ed54907969576a0eb8eff50': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:34:46 compute-1 nova_compute[238822]: 2025-09-30 18:34:46.412 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:34:46 compute-1 ceph-mon[75484]: pgmap v1685: 353 pgs: 353 active+clean; 41 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:34:46 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:34:46 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/331097220' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:34:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:46.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:46 compute-1 nova_compute[238822]: 2025-09-30 18:34:46.928 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:34:46 compute-1 nova_compute[238822]: 2025-09-30 18:34:46.936 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:34:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:47 compute-1 nova_compute[238822]: 2025-09-30 18:34:47.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:47 compute-1 nova_compute[238822]: 2025-09-30 18:34:47.449 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:34:47 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/331097220' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:34:47 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/24173303' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:34:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:47.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:47 compute-1 nova_compute[238822]: 2025-09-30 18:34:47.964 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:34:47 compute-1 nova_compute[238822]: 2025-09-30 18:34:47.964 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.632s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:47 compute-1 nova_compute[238822]: 2025-09-30 18:34:47.965 2 DEBUG oslo_concurrency.lockutils [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 1.828s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:48 compute-1 nova_compute[238822]: 2025-09-30 18:34:48.014 2 DEBUG oslo_concurrency.processutils [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:34:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:34:48 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2946357604' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:34:48 compute-1 nova_compute[238822]: 2025-09-30 18:34:48.486 2 DEBUG oslo_concurrency.processutils [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:34:48 compute-1 ceph-mon[75484]: pgmap v1686: 353 pgs: 353 active+clean; 41 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:34:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2946357604' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:34:48 compute-1 nova_compute[238822]: 2025-09-30 18:34:48.495 2 DEBUG nova.compute.provider_tree [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:34:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:48.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:48 compute-1 nova_compute[238822]: 2025-09-30 18:34:48.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:49 compute-1 nova_compute[238822]: 2025-09-30 18:34:49.006 2 DEBUG nova.scheduler.client.report [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:34:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:34:49 compute-1 openstack_network_exporter[251957]: ERROR   18:34:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:34:49 compute-1 openstack_network_exporter[251957]: ERROR   18:34:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:34:49 compute-1 openstack_network_exporter[251957]: ERROR   18:34:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:34:49 compute-1 openstack_network_exporter[251957]: ERROR   18:34:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:34:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:34:49 compute-1 openstack_network_exporter[251957]: ERROR   18:34:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:34:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:34:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/591739360' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:34:49 compute-1 nova_compute[238822]: 2025-09-30 18:34:49.519 2 DEBUG oslo_concurrency.lockutils [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.555s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:49 compute-1 nova_compute[238822]: 2025-09-30 18:34:49.543 2 INFO nova.scheduler.client.report [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Deleted allocations for instance 78e1566d-9c5e-49b1-a044-0c46cf002c66
Sep 30 18:34:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:49.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:50 compute-1 ceph-mon[75484]: pgmap v1687: 353 pgs: 353 active+clean; 41 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:34:50 compute-1 nova_compute[238822]: 2025-09-30 18:34:50.582 2 DEBUG oslo_concurrency.lockutils [None req-29a4e07f-47cc-4d26-ac0a-d23cad725165 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "78e1566d-9c5e-49b1-a044-0c46cf002c66" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.573s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:50.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:51 compute-1 podman[291326]: 2025-09-30 18:34:51.540121411 +0000 UTC m=+0.080137647 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:34:51 compute-1 podman[291325]: 2025-09-30 18:34:51.564419338 +0000 UTC m=+0.110286862 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250930, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:34:51 compute-1 ceph-mon[75484]: pgmap v1688: 353 pgs: 353 active+clean; 41 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 26 KiB/s rd, 1.2 KiB/s wr, 36 op/s
Sep 30 18:34:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:51.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:51 compute-1 nova_compute[238822]: 2025-09-30 18:34:51.965 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:34:51 compute-1 nova_compute[238822]: 2025-09-30 18:34:51.966 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:34:51 compute-1 nova_compute[238822]: 2025-09-30 18:34:51.966 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:34:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:52 compute-1 nova_compute[238822]: 2025-09-30 18:34:52.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:34:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:52.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:53 compute-1 ceph-mon[75484]: pgmap v1689: 353 pgs: 353 active+clean; 41 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:34:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:53.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:53 compute-1 nova_compute[238822]: 2025-09-30 18:34:53.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:34:54 compute-1 nova_compute[238822]: 2025-09-30 18:34:54.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:34:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:54.399 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:34:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:54.400 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:34:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:34:54.400 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:34:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:54.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:55 compute-1 nova_compute[238822]: 2025-09-30 18:34:55.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:34:55 compute-1 sshd-session[291377]: Invalid user administrator from 14.225.167.110 port 55880
Sep 30 18:34:55 compute-1 sshd-session[291377]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:34:55 compute-1 sshd-session[291377]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:34:55 compute-1 ceph-mon[75484]: pgmap v1690: 353 pgs: 353 active+clean; 41 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:34:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:55.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:56 compute-1 nova_compute[238822]: 2025-09-30 18:34:56.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:34:56 compute-1 nova_compute[238822]: 2025-09-30 18:34:56.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 18:34:56 compute-1 unix_chkpwd[291385]: password check failed for user (root)
Sep 30 18:34:56 compute-1 sshd-session[291380]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:34:56 compute-1 sshd-session[291377]: Failed password for invalid user administrator from 14.225.167.110 port 55880 ssh2
Sep 30 18:34:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:34:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:56.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:34:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:57 compute-1 nova_compute[238822]: 2025-09-30 18:34:57.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:57 compute-1 sudo[291387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:34:57 compute-1 sudo[291387]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:34:57 compute-1 sudo[291387]: pam_unix(sudo:session): session closed for user root
Sep 30 18:34:57 compute-1 sshd-session[291377]: Received disconnect from 14.225.167.110 port 55880:11: Bye Bye [preauth]
Sep 30 18:34:57 compute-1 sshd-session[291377]: Disconnected from invalid user administrator 14.225.167.110 port 55880 [preauth]
Sep 30 18:34:57 compute-1 ceph-mon[75484]: pgmap v1691: 353 pgs: 353 active+clean; 41 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:34:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3353939610' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:34:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3353939610' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:34:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:57.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:58 compute-1 sshd-session[291380]: Failed password for root from 192.210.160.141 port 50780 ssh2
Sep 30 18:34:58 compute-1 podman[291413]: 2025-09-30 18:34:58.559990783 +0000 UTC m=+0.091758172 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 18:34:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4158659323' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:34:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:34:58.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:58 compute-1 nova_compute[238822]: 2025-09-30 18:34:58.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:34:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:34:59 compute-1 sshd-session[291430]: Invalid user wikijs from 167.172.43.167 port 37200
Sep 30 18:34:59 compute-1 sshd-session[291430]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:34:59 compute-1 sshd-session[291430]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167
Sep 30 18:34:59 compute-1 sshd-session[291380]: Connection closed by authenticating user root 192.210.160.141 port 50780 [preauth]
Sep 30 18:34:59 compute-1 ceph-mon[75484]: pgmap v1692: 353 pgs: 353 active+clean; 41 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:34:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:34:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:34:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:34:59.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:34:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:34:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:34:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:34:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:00 compute-1 nova_compute[238822]: 2025-09-30 18:35:00.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:35:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:00.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:01 compute-1 sshd-session[291430]: Failed password for invalid user wikijs from 167.172.43.167 port 37200 ssh2
Sep 30 18:35:01 compute-1 sshd-session[291430]: Received disconnect from 167.172.43.167 port 37200:11: Bye Bye [preauth]
Sep 30 18:35:01 compute-1 sshd-session[291430]: Disconnected from invalid user wikijs 167.172.43.167 port 37200 [preauth]
Sep 30 18:35:01 compute-1 ceph-mon[75484]: pgmap v1693: 353 pgs: 353 active+clean; 41 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:35:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:01.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:02 compute-1 nova_compute[238822]: 2025-09-30 18:35:02.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:02.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:03 compute-1 ceph-mon[75484]: pgmap v1694: 353 pgs: 353 active+clean; 41 MiB data, 325 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:35:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:03.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:03 compute-1 nova_compute[238822]: 2025-09-30 18:35:03.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:35:04 compute-1 podman[291442]: 2025-09-30 18:35:04.56927511 +0000 UTC m=+0.092227204 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930)
Sep 30 18:35:04 compute-1 podman[291441]: 2025-09-30 18:35:04.57002508 +0000 UTC m=+0.105922544 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 18:35:04 compute-1 podman[291440]: 2025-09-30 18:35:04.586594318 +0000 UTC m=+0.122255576 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930)
Sep 30 18:35:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:04.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:05 compute-1 podman[249638]: time="2025-09-30T18:35:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:35:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:35:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:35:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:35:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8361 "" "Go-http-client/1.1"
Sep 30 18:35:05 compute-1 ceph-mon[75484]: pgmap v1695: 353 pgs: 353 active+clean; 88 MiB data, 346 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:35:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:05.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:06.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:07 compute-1 nova_compute[238822]: 2025-09-30 18:35:07.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:07 compute-1 ceph-mon[75484]: pgmap v1696: 353 pgs: 353 active+clean; 88 MiB data, 346 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:35:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:35:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:07.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:08 compute-1 nova_compute[238822]: 2025-09-30 18:35:08.564 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:35:08 compute-1 nova_compute[238822]: 2025-09-30 18:35:08.565 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 18:35:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:08.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:08 compute-1 nova_compute[238822]: 2025-09-30 18:35:08.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:35:09 compute-1 nova_compute[238822]: 2025-09-30 18:35:09.075 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 18:35:09 compute-1 ceph-mon[75484]: pgmap v1697: 353 pgs: 353 active+clean; 88 MiB data, 346 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:35:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2763612461' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:35:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1596084137' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:35:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:09.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:10.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:11 compute-1 sudo[291507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:35:11 compute-1 sudo[291507]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:35:11 compute-1 sudo[291507]: pam_unix(sudo:session): session closed for user root
Sep 30 18:35:11 compute-1 sudo[291532]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:35:11 compute-1 sudo[291532]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:35:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:11.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:11 compute-1 ceph-mon[75484]: pgmap v1698: 353 pgs: 353 active+clean; 88 MiB data, 346 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Sep 30 18:35:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:12 compute-1 sudo[291532]: pam_unix(sudo:session): session closed for user root
Sep 30 18:35:12 compute-1 nova_compute[238822]: 2025-09-30 18:35:12.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:12.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:35:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:35:12 compute-1 ceph-mon[75484]: pgmap v1699: 353 pgs: 353 active+clean; 88 MiB data, 346 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Sep 30 18:35:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:35:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:35:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:35:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:35:12 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:35:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:13.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:13 compute-1 nova_compute[238822]: 2025-09-30 18:35:13.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:35:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:14.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:15 compute-1 ceph-mon[75484]: pgmap v1700: 353 pgs: 353 active+clean; 88 MiB data, 347 MiB used, 40 GiB / 40 GiB avail; 22 KiB/s rd, 1.8 MiB/s wr, 34 op/s
Sep 30 18:35:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:15.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:16.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:17 compute-1 sudo[291596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:35:17 compute-1 sudo[291596]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:35:17 compute-1 sudo[291596]: pam_unix(sudo:session): session closed for user root
Sep 30 18:35:17 compute-1 nova_compute[238822]: 2025-09-30 18:35:17.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:17 compute-1 sudo[291621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:35:17 compute-1 sudo[291621]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:35:17 compute-1 sudo[291621]: pam_unix(sudo:session): session closed for user root
Sep 30 18:35:17 compute-1 ceph-mon[75484]: pgmap v1701: 353 pgs: 353 active+clean; 88 MiB data, 347 MiB used, 40 GiB / 40 GiB avail; 4.1 KiB/s rd, 14 KiB/s wr, 6 op/s
Sep 30 18:35:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:35:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:35:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:17.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:18.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:18 compute-1 nova_compute[238822]: 2025-09-30 18:35:18.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:35:19 compute-1 openstack_network_exporter[251957]: ERROR   18:35:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:35:19 compute-1 openstack_network_exporter[251957]: ERROR   18:35:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:35:19 compute-1 openstack_network_exporter[251957]: ERROR   18:35:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:35:19 compute-1 openstack_network_exporter[251957]: ERROR   18:35:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:35:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:35:19 compute-1 openstack_network_exporter[251957]: ERROR   18:35:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:35:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:35:19 compute-1 ceph-mon[75484]: pgmap v1702: 353 pgs: 353 active+clean; 88 MiB data, 347 MiB used, 40 GiB / 40 GiB avail; 4.1 KiB/s rd, 14 KiB/s wr, 6 op/s
Sep 30 18:35:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:19.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:20.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:21 compute-1 ceph-mon[75484]: pgmap v1703: 353 pgs: 353 active+clean; 88 MiB data, 347 MiB used, 40 GiB / 40 GiB avail; 2.0 MiB/s rd, 14 KiB/s wr, 77 op/s
Sep 30 18:35:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:21.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:22 compute-1 nova_compute[238822]: 2025-09-30 18:35:22.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:35:22 compute-1 podman[291655]: 2025-09-30 18:35:22.558219645 +0000 UTC m=+0.089772128 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:35:22 compute-1 podman[291654]: 2025-09-30 18:35:22.603756675 +0000 UTC m=+0.134603449 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:35:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:22.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:23 compute-1 unix_chkpwd[291707]: password check failed for user (root)
Sep 30 18:35:23 compute-1 sshd-session[291651]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:35:23 compute-1 ceph-mon[75484]: pgmap v1704: 353 pgs: 353 active+clean; 88 MiB data, 347 MiB used, 40 GiB / 40 GiB avail; 2.0 MiB/s rd, 14 KiB/s wr, 76 op/s
Sep 30 18:35:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:23.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:23 compute-1 nova_compute[238822]: 2025-09-30 18:35:23.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:35:24 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3008404670' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:35:24 compute-1 sshd-session[291651]: Failed password for root from 192.210.160.141 port 51636 ssh2
Sep 30 18:35:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:24.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:25 compute-1 ceph-mon[75484]: pgmap v1705: 353 pgs: 353 active+clean; 88 MiB data, 347 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:35:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:25.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:25 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:35:25.939 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:35:25 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:35:25.940 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:35:25 compute-1 nova_compute[238822]: 2025-09-30 18:35:25.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:26 compute-1 sshd-session[291651]: Connection closed by authenticating user root 192.210.160.141 port 51636 [preauth]
Sep 30 18:35:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:26.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:27 compute-1 ceph-mon[75484]: pgmap v1706: 353 pgs: 353 active+clean; 88 MiB data, 347 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 69 op/s
Sep 30 18:35:27 compute-1 nova_compute[238822]: 2025-09-30 18:35:27.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:27.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:35:27.942 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:35:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:35:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:28.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:35:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:28 compute-1 nova_compute[238822]: 2025-09-30 18:35:28.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:35:29 compute-1 podman[291715]: 2025-09-30 18:35:29.541687091 +0000 UTC m=+0.080987951 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 18:35:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:29.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:30 compute-1 ceph-mon[75484]: pgmap v1707: 353 pgs: 353 active+clean; 88 MiB data, 347 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 69 op/s
Sep 30 18:35:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:30.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:31 compute-1 ceph-mon[75484]: pgmap v1708: 353 pgs: 353 active+clean; 161 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 138 op/s
Sep 30 18:35:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:31.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:32 compute-1 nova_compute[238822]: 2025-09-30 18:35:32.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:32 compute-1 ceph-mon[75484]: pgmap v1709: 353 pgs: 353 active+clean; 161 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 338 KiB/s rd, 3.9 MiB/s wr, 69 op/s
Sep 30 18:35:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:32.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:33.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:33 compute-1 nova_compute[238822]: 2025-09-30 18:35:33.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:35:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:34.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:35 compute-1 podman[291743]: 2025-09-30 18:35:35.554217083 +0000 UTC m=+0.084440443 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Sep 30 18:35:35 compute-1 podman[291744]: 2025-09-30 18:35:35.556362031 +0000 UTC m=+0.085271766 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 18:35:35 compute-1 podman[291742]: 2025-09-30 18:35:35.562812656 +0000 UTC m=+0.087909158 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 18:35:35 compute-1 podman[249638]: time="2025-09-30T18:35:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:35:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:35:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:35:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:35:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8361 "" "Go-http-client/1.1"
Sep 30 18:35:35 compute-1 ceph-mon[75484]: pgmap v1710: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 408 KiB/s rd, 3.9 MiB/s wr, 92 op/s
Sep 30 18:35:35 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2597361341' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:35:35 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2903216319' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:35:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:35.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1599935679' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:35:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1599935679' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:35:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:36.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:37 compute-1 nova_compute[238822]: 2025-09-30 18:35:37.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:37 compute-1 sudo[291805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:35:37 compute-1 sudo[291805]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:35:37 compute-1 sudo[291805]: pam_unix(sudo:session): session closed for user root
Sep 30 18:35:37 compute-1 ceph-mon[75484]: pgmap v1711: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 407 KiB/s rd, 3.9 MiB/s wr, 92 op/s
Sep 30 18:35:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:35:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:37.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:38 compute-1 ceph-mon[75484]: pgmap v1712: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 407 KiB/s rd, 3.9 MiB/s wr, 92 op/s
Sep 30 18:35:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:38.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:38 compute-1 nova_compute[238822]: 2025-09-30 18:35:38.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:35:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:39.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:40.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:41 compute-1 ceph-mon[75484]: pgmap v1713: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 408 KiB/s rd, 3.9 MiB/s wr, 93 op/s
Sep 30 18:35:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:41.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:42 compute-1 unix_chkpwd[291836]: password check failed for user (root)
Sep 30 18:35:42 compute-1 sshd-session[291834]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=216.10.242.161  user=root
Sep 30 18:35:42 compute-1 nova_compute[238822]: 2025-09-30 18:35:42.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:42.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:43 compute-1 ceph-mon[75484]: pgmap v1714: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 70 KiB/s rd, 48 KiB/s wr, 23 op/s
Sep 30 18:35:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:43.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:43 compute-1 nova_compute[238822]: 2025-09-30 18:35:43.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:44 compute-1 sshd-session[291834]: Failed password for root from 216.10.242.161 port 36796 ssh2
Sep 30 18:35:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:35:44 compute-1 nova_compute[238822]: 2025-09-30 18:35:44.564 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:35:44 compute-1 sshd-session[291834]: Received disconnect from 216.10.242.161 port 36796:11: Bye Bye [preauth]
Sep 30 18:35:44 compute-1 sshd-session[291834]: Disconnected from authenticating user root 216.10.242.161 port 36796 [preauth]
Sep 30 18:35:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:44.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:45 compute-1 nova_compute[238822]: 2025-09-30 18:35:45.078 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:35:45 compute-1 nova_compute[238822]: 2025-09-30 18:35:45.079 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:35:45 compute-1 nova_compute[238822]: 2025-09-30 18:35:45.079 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:35:45 compute-1 nova_compute[238822]: 2025-09-30 18:35:45.079 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:35:45 compute-1 ceph-mon[75484]: pgmap v1715: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 2.0 MiB/s rd, 61 KiB/s wr, 97 op/s
Sep 30 18:35:45 compute-1 nova_compute[238822]: 2025-09-30 18:35:45.623 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:35:45 compute-1 nova_compute[238822]: 2025-09-30 18:35:45.624 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:35:45 compute-1 nova_compute[238822]: 2025-09-30 18:35:45.624 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:35:45 compute-1 nova_compute[238822]: 2025-09-30 18:35:45.625 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:35:45 compute-1 nova_compute[238822]: 2025-09-30 18:35:45.625 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:35:45 compute-1 sshd-session[291841]: Invalid user cristi from 8.243.64.201 port 39178
Sep 30 18:35:45 compute-1 sshd-session[291841]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:35:45 compute-1 sshd-session[291841]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:35:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:45.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:46 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:35:46 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1171474475' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:35:46 compute-1 nova_compute[238822]: 2025-09-30 18:35:46.060 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:35:46 compute-1 nova_compute[238822]: 2025-09-30 18:35:46.260 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:35:46 compute-1 nova_compute[238822]: 2025-09-30 18:35:46.263 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:35:46 compute-1 nova_compute[238822]: 2025-09-30 18:35:46.286 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:35:46 compute-1 nova_compute[238822]: 2025-09-30 18:35:46.287 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4736MB free_disk=39.92577362060547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:35:46 compute-1 nova_compute[238822]: 2025-09-30 18:35:46.287 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:35:46 compute-1 nova_compute[238822]: 2025-09-30 18:35:46.288 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:35:46 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1171474475' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:35:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:46.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:47 compute-1 nova_compute[238822]: 2025-09-30 18:35:47.421 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:35:47 compute-1 nova_compute[238822]: 2025-09-30 18:35:47.421 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:35:46 up  4:13,  0 user,  load average: 0.66, 0.43, 0.59\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:35:47 compute-1 sshd-session[291841]: Failed password for invalid user cristi from 8.243.64.201 port 39178 ssh2
Sep 30 18:35:47 compute-1 ceph-mon[75484]: pgmap v1716: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 26 KiB/s wr, 74 op/s
Sep 30 18:35:47 compute-1 nova_compute[238822]: 2025-09-30 18:35:47.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:47 compute-1 nova_compute[238822]: 2025-09-30 18:35:47.503 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing inventories for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 18:35:47 compute-1 nova_compute[238822]: 2025-09-30 18:35:47.568 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating ProviderTree inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 18:35:47 compute-1 nova_compute[238822]: 2025-09-30 18:35:47.569 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 18:35:47 compute-1 nova_compute[238822]: 2025-09-30 18:35:47.587 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing aggregate associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 18:35:47 compute-1 nova_compute[238822]: 2025-09-30 18:35:47.606 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing trait associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE41,HW_CPU_X86_ABM,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_TIS,HW_ARCH_X86_64,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 18:35:47 compute-1 nova_compute[238822]: 2025-09-30 18:35:47.627 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:35:47 compute-1 sshd-session[291841]: Received disconnect from 8.243.64.201 port 39178:11: Bye Bye [preauth]
Sep 30 18:35:47 compute-1 sshd-session[291841]: Disconnected from invalid user cristi 8.243.64.201 port 39178 [preauth]
Sep 30 18:35:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:47.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:35:48 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3128617664' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:35:48 compute-1 nova_compute[238822]: 2025-09-30 18:35:48.143 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:35:48 compute-1 nova_compute[238822]: 2025-09-30 18:35:48.151 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:35:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3128617664' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:35:48 compute-1 nova_compute[238822]: 2025-09-30 18:35:48.661 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:35:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:48.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:49 compute-1 nova_compute[238822]: 2025-09-30 18:35:49.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:35:49 compute-1 sshd-session[291868]: Invalid user student from 192.210.160.141 port 51308
Sep 30 18:35:49 compute-1 sshd-session[291868]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:35:49 compute-1 sshd-session[291868]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:35:49 compute-1 nova_compute[238822]: 2025-09-30 18:35:49.176 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:35:49 compute-1 nova_compute[238822]: 2025-09-30 18:35:49.176 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.888s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:35:49 compute-1 openstack_network_exporter[251957]: ERROR   18:35:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:35:49 compute-1 openstack_network_exporter[251957]: ERROR   18:35:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:35:49 compute-1 openstack_network_exporter[251957]: ERROR   18:35:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:35:49 compute-1 openstack_network_exporter[251957]: ERROR   18:35:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:35:49 compute-1 openstack_network_exporter[251957]: ERROR   18:35:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:35:49 compute-1 ceph-mon[75484]: pgmap v1717: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 26 KiB/s wr, 74 op/s
Sep 30 18:35:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/306343805' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:35:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:49.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:50.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:51 compute-1 sshd-session[291868]: Failed password for invalid user student from 192.210.160.141 port 51308 ssh2
Sep 30 18:35:51 compute-1 ceph-mon[75484]: pgmap v1718: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 27 KiB/s wr, 75 op/s
Sep 30 18:35:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:51.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:52 compute-1 nova_compute[238822]: 2025-09-30 18:35:52.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3988588815' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:35:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:35:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:52.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:53 compute-1 nova_compute[238822]: 2025-09-30 18:35:53.155 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:35:53 compute-1 nova_compute[238822]: 2025-09-30 18:35:53.155 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:35:53 compute-1 nova_compute[238822]: 2025-09-30 18:35:53.155 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:35:53 compute-1 podman[291899]: 2025-09-30 18:35:53.551905623 +0000 UTC m=+0.085389018 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:35:53 compute-1 podman[291898]: 2025-09-30 18:35:53.601013271 +0000 UTC m=+0.136832279 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:35:53 compute-1 ceph-mon[75484]: pgmap v1719: 353 pgs: 353 active+clean; 167 MiB data, 393 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:35:53 compute-1 sshd-session[291868]: Connection closed by invalid user student 192.210.160.141 port 51308 [preauth]
Sep 30 18:35:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:53.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:54 compute-1 nova_compute[238822]: 2025-09-30 18:35:54.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:35:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:35:54.401 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:35:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:35:54.401 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:35:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:35:54.401 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:35:54 compute-1 ceph-mon[75484]: pgmap v1720: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 138 op/s
Sep 30 18:35:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:55.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:55 compute-1 nova_compute[238822]: 2025-09-30 18:35:55.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:35:55 compute-1 nova_compute[238822]: 2025-09-30 18:35:55.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:35:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:55.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:57.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:57 compute-1 ceph-mon[75484]: pgmap v1721: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 324 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:35:57 compute-1 nova_compute[238822]: 2025-09-30 18:35:57.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:57 compute-1 sudo[291953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:35:57 compute-1 sudo[291953]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:35:57 compute-1 sudo[291953]: pam_unix(sudo:session): session closed for user root
Sep 30 18:35:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:57.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4216399660' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:35:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4216399660' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:35:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:35:59 compute-1 nova_compute[238822]: 2025-09-30 18:35:59.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:35:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:35:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:35:59.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:35:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:35:59 compute-1 ceph-mon[75484]: pgmap v1722: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 324 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:35:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:35:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:35:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:35:59.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:35:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:35:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:35:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:00 compute-1 podman[291981]: 2025-09-30 18:36:00.548427955 +0000 UTC m=+0.086329635 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 18:36:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:01.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:01 compute-1 ceph-mon[75484]: pgmap v1723: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:36:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:01.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:02 compute-1 nova_compute[238822]: 2025-09-30 18:36:02.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:03.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:03 compute-1 ceph-mon[75484]: pgmap v1724: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:36:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:03.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:04 compute-1 nova_compute[238822]: 2025-09-30 18:36:04.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:36:04 compute-1 unix_chkpwd[292007]: password check failed for user (root)
Sep 30 18:36:04 compute-1 sshd-session[292004]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105  user=root
Sep 30 18:36:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:05.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:05 compute-1 ceph-mon[75484]: pgmap v1725: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Sep 30 18:36:05 compute-1 podman[249638]: time="2025-09-30T18:36:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:36:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:36:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:36:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:36:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8356 "" "Go-http-client/1.1"
Sep 30 18:36:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:05.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:06 compute-1 sshd-session[292004]: Failed password for root from 103.153.190.105 port 60277 ssh2
Sep 30 18:36:06 compute-1 podman[292011]: 2025-09-30 18:36:06.537483653 +0000 UTC m=+0.077615749 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public)
Sep 30 18:36:06 compute-1 podman[292012]: 2025-09-30 18:36:06.547029481 +0000 UTC m=+0.083700413 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:36:06 compute-1 podman[292010]: 2025-09-30 18:36:06.578136662 +0000 UTC m=+0.121589377 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:36:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:07.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:07 compute-1 sshd-session[292004]: Received disconnect from 103.153.190.105 port 60277:11: Bye Bye [preauth]
Sep 30 18:36:07 compute-1 sshd-session[292004]: Disconnected from authenticating user root 103.153.190.105 port 60277 [preauth]
Sep 30 18:36:07 compute-1 ceph-mon[75484]: pgmap v1726: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 1.8 KiB/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:36:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:36:07 compute-1 nova_compute[238822]: 2025-09-30 18:36:07.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:07.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:09 compute-1 nova_compute[238822]: 2025-09-30 18:36:09.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:09.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:36:09 compute-1 ovn_controller[135204]: 2025-09-30T18:36:09Z|00218|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Sep 30 18:36:09 compute-1 ceph-mon[75484]: pgmap v1727: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 1.8 KiB/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:36:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:09.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:11.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:11 compute-1 ceph-mon[75484]: pgmap v1728: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 2.1 KiB/s rd, 16 KiB/s wr, 2 op/s
Sep 30 18:36:11 compute-1 sshd-session[292073]: Invalid user seekcy from 14.225.167.110 port 41720
Sep 30 18:36:11 compute-1 sshd-session[292073]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:36:11 compute-1 sshd-session[292073]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:36:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:11.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:12 compute-1 nova_compute[238822]: 2025-09-30 18:36:12.356 2 DEBUG nova.virt.libvirt.driver [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Creating tmpfile /var/lib/nova/instances/tmpufrp_vl0 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:36:12 compute-1 nova_compute[238822]: 2025-09-30 18:36:12.358 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:12 compute-1 nova_compute[238822]: 2025-09-30 18:36:12.364 2 DEBUG nova.compute.manager [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpufrp_vl0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:36:12 compute-1 nova_compute[238822]: 2025-09-30 18:36:12.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:12 compute-1 nova_compute[238822]: 2025-09-30 18:36:12.577 2 DEBUG nova.virt.libvirt.driver [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Creating tmpfile /var/lib/nova/instances/tmpjbxj26bw to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:36:12 compute-1 nova_compute[238822]: 2025-09-30 18:36:12.578 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:12 compute-1 nova_compute[238822]: 2025-09-30 18:36:12.582 2 DEBUG nova.compute.manager [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjbxj26bw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:36:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:13.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:13 compute-1 ceph-mon[75484]: pgmap v1729: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 16 KiB/s wr, 1 op/s
Sep 30 18:36:13 compute-1 sshd-session[292073]: Failed password for invalid user seekcy from 14.225.167.110 port 41720 ssh2
Sep 30 18:36:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:13.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:14 compute-1 nova_compute[238822]: 2025-09-30 18:36:14.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:36:14 compute-1 nova_compute[238822]: 2025-09-30 18:36:14.411 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:14 compute-1 nova_compute[238822]: 2025-09-30 18:36:14.612 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:15.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:15 compute-1 ceph-mon[75484]: pgmap v1730: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 17 KiB/s wr, 1 op/s
Sep 30 18:36:15 compute-1 sshd-session[292073]: Received disconnect from 14.225.167.110 port 41720:11: Bye Bye [preauth]
Sep 30 18:36:15 compute-1 sshd-session[292073]: Disconnected from invalid user seekcy 14.225.167.110 port 41720 [preauth]
Sep 30 18:36:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:15.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:17.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:17 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Sep 30 18:36:17 compute-1 sudo[292085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:36:17 compute-1 sudo[292085]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:36:17 compute-1 sudo[292085]: pam_unix(sudo:session): session closed for user root
Sep 30 18:36:17 compute-1 sudo[292110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:36:17 compute-1 sudo[292110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:36:17 compute-1 nova_compute[238822]: 2025-09-30 18:36:17.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:17 compute-1 ceph-mon[75484]: pgmap v1731: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 2.0 KiB/s wr, 0 op/s
Sep 30 18:36:17 compute-1 sudo[292135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:36:17 compute-1 sudo[292135]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:36:17 compute-1 sudo[292135]: pam_unix(sudo:session): session closed for user root
Sep 30 18:36:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:36:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:17.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:36:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:18 compute-1 nova_compute[238822]: 2025-09-30 18:36:18.237 2 DEBUG nova.compute.manager [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpufrp_vl0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='741d9cb1-7a49-4d89-8b1a-78ae947f2c49',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:36:18 compute-1 sudo[292110]: pam_unix(sudo:session): session closed for user root
Sep 30 18:36:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:36:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:36:18 compute-1 ceph-mon[75484]: pgmap v1732: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 2.0 KiB/s wr, 0 op/s
Sep 30 18:36:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:36:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:36:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:36:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:36:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:36:18 compute-1 unix_chkpwd[292194]: password check failed for user (root)
Sep 30 18:36:18 compute-1 sshd-session[292081]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:36:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:19.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:36:19 compute-1 nova_compute[238822]: 2025-09-30 18:36:19.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:19 compute-1 nova_compute[238822]: 2025-09-30 18:36:19.256 2 DEBUG oslo_concurrency.lockutils [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-741d9cb1-7a49-4d89-8b1a-78ae947f2c49" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:36:19 compute-1 nova_compute[238822]: 2025-09-30 18:36:19.256 2 DEBUG oslo_concurrency.lockutils [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-741d9cb1-7a49-4d89-8b1a-78ae947f2c49" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:36:19 compute-1 nova_compute[238822]: 2025-09-30 18:36:19.257 2 DEBUG nova.network.neutron [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:36:19 compute-1 openstack_network_exporter[251957]: ERROR   18:36:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:36:19 compute-1 openstack_network_exporter[251957]: ERROR   18:36:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:36:19 compute-1 openstack_network_exporter[251957]: ERROR   18:36:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:36:19 compute-1 openstack_network_exporter[251957]: ERROR   18:36:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:36:19 compute-1 openstack_network_exporter[251957]: ERROR   18:36:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:36:19 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 18:36:19 compute-1 nova_compute[238822]: 2025-09-30 18:36:19.765 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:20.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:20 compute-1 sshd-session[292081]: Failed password for root from 192.210.160.141 port 59864 ssh2
Sep 30 18:36:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:21.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #91. Immutable memtables: 0.
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.072193) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 55] Flushing memtable with next log file: 91
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257381072297, "job": 55, "event": "flush_started", "num_memtables": 1, "num_entries": 1634, "num_deletes": 251, "total_data_size": 3879429, "memory_usage": 3931120, "flush_reason": "Manual Compaction"}
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 55] Level-0 flush table #92: started
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257381087276, "cf_name": "default", "job": 55, "event": "table_file_creation", "file_number": 92, "file_size": 2504292, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 46659, "largest_seqno": 48288, "table_properties": {"data_size": 2497533, "index_size": 3830, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 13635, "raw_average_key_size": 18, "raw_value_size": 2483952, "raw_average_value_size": 3416, "num_data_blocks": 169, "num_entries": 727, "num_filter_entries": 727, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759257245, "oldest_key_time": 1759257245, "file_creation_time": 1759257381, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 92, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 55] Flush lasted 15143 microseconds, and 8198 cpu microseconds.
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.087347) [db/flush_job.cc:967] [default] [JOB 55] Level-0 flush table #92: 2504292 bytes OK
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.087376) [db/memtable_list.cc:519] [default] Level-0 commit table #92 started
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.089744) [db/memtable_list.cc:722] [default] Level-0 commit table #92: memtable #1 done
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.089757) EVENT_LOG_v1 {"time_micros": 1759257381089752, "job": 55, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.089777) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 55] Try to delete WAL files size 3871856, prev total WAL file size 3871856, number of live WAL files 2.
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000088.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.090989) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323530' seq:72057594037927935, type:22 .. '6B7600353032' seq:0, type:0; will stop at (end)
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 56] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 55 Base level 0, inputs: [92(2445KB)], [90(12MB)]
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257381091107, "job": 56, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [92], "files_L6": [90], "score": -1, "input_data_size": 16035162, "oldest_snapshot_seqno": -1}
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 56] Generated table #93: 7153 keys, 14628692 bytes, temperature: kUnknown
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257381201499, "cf_name": "default", "job": 56, "event": "table_file_creation", "file_number": 93, "file_size": 14628692, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14581153, "index_size": 28547, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17925, "raw_key_size": 186035, "raw_average_key_size": 26, "raw_value_size": 14453032, "raw_average_value_size": 2020, "num_data_blocks": 1132, "num_entries": 7153, "num_filter_entries": 7153, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759257381, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 93, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.202022) [db/compaction/compaction_job.cc:1663] [default] [JOB 56] Compacted 1@0 + 1@6 files to L6 => 14628692 bytes
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.203587) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 145.3 rd, 132.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 12.9 +0.0 blob) out(14.0 +0.0 blob), read-write-amplify(12.2) write-amplify(5.8) OK, records in: 7669, records dropped: 516 output_compression: NoCompression
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.203655) EVENT_LOG_v1 {"time_micros": 1759257381203604, "job": 56, "event": "compaction_finished", "compaction_time_micros": 110365, "compaction_time_cpu_micros": 58947, "output_level": 6, "num_output_files": 1, "total_output_size": 14628692, "num_input_records": 7669, "num_output_records": 7153, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000092.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257381204651, "job": 56, "event": "table_file_deletion", "file_number": 92}
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000090.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257381209286, "job": 56, "event": "table_file_deletion", "file_number": 90}
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.090788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.209402) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.209412) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.209415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.209418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:36:21 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:21.209421) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:36:21 compute-1 ceph-mon[75484]: pgmap v1733: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 10 KiB/s wr, 2 op/s
Sep 30 18:36:21 compute-1 nova_compute[238822]: 2025-09-30 18:36:21.738 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:21 compute-1 sshd-session[292081]: Connection closed by authenticating user root 192.210.160.141 port 59864 [preauth]
Sep 30 18:36:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:22.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:36:22 compute-1 nova_compute[238822]: 2025-09-30 18:36:22.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:22 compute-1 nova_compute[238822]: 2025-09-30 18:36:22.743 2 DEBUG nova.network.neutron [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Updating instance_info_cache with network_info: [{"id": "23538fed-fc3c-4080-bbea-55e12668af3b", "address": "fa:16:3e:1a:4f:5c", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23538fed-fc", "ovs_interfaceid": "23538fed-fc3c-4080-bbea-55e12668af3b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:36:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:23.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.251 2 DEBUG oslo_concurrency.lockutils [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-741d9cb1-7a49-4d89-8b1a-78ae947f2c49" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.272 2 DEBUG nova.virt.libvirt.driver [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpufrp_vl0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='741d9cb1-7a49-4d89-8b1a-78ae947f2c49',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.273 2 DEBUG nova.virt.libvirt.driver [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Creating instance directory: /var/lib/nova/instances/741d9cb1-7a49-4d89-8b1a-78ae947f2c49 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.273 2 DEBUG nova.virt.libvirt.driver [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Ensure instance console log exists: /var/lib/nova/instances/741d9cb1-7a49-4d89-8b1a-78ae947f2c49/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.274 2 DEBUG nova.virt.libvirt.driver [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.274 2 DEBUG nova.virt.libvirt.vif [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:35:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1259001093',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1259001093',id=27,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:35:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-9g37rry3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:35:42Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=741d9cb1-7a49-4d89-8b1a-78ae947f2c49,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "23538fed-fc3c-4080-bbea-55e12668af3b", "address": "fa:16:3e:1a:4f:5c", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap23538fed-fc", "ovs_interfaceid": "23538fed-fc3c-4080-bbea-55e12668af3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.275 2 DEBUG nova.network.os_vif_util [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "23538fed-fc3c-4080-bbea-55e12668af3b", "address": "fa:16:3e:1a:4f:5c", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap23538fed-fc", "ovs_interfaceid": "23538fed-fc3c-4080-bbea-55e12668af3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.275 2 DEBUG nova.network.os_vif_util [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:4f:5c,bridge_name='br-int',has_traffic_filtering=True,id=23538fed-fc3c-4080-bbea-55e12668af3b,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23538fed-fc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.276 2 DEBUG os_vif [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:4f:5c,bridge_name='br-int',has_traffic_filtering=True,id=23538fed-fc3c-4080-bbea-55e12668af3b,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23538fed-fc') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.278 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.278 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.280 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '92c5bf17-6cf2-5fdf-90b2-77d036c7342b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.288 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23538fed-fc, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.289 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap23538fed-fc, col_values=(('qos', UUID('a31b2041-b777-4968-af2b-d4d0a8f2cf1b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.290 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap23538fed-fc, col_values=(('external_ids', {'iface-id': '23538fed-fc3c-4080-bbea-55e12668af3b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1a:4f:5c', 'vm-uuid': '741d9cb1-7a49-4d89-8b1a-78ae947f2c49'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:23 compute-1 NetworkManager[45549]: <info>  [1759257383.2934] manager: (tap23538fed-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.301 2 INFO os_vif [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:4f:5c,bridge_name='br-int',has_traffic_filtering=True,id=23538fed-fc3c-4080-bbea-55e12668af3b,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23538fed-fc')
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.302 2 DEBUG nova.virt.libvirt.driver [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.302 2 DEBUG nova.compute.manager [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpufrp_vl0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='741d9cb1-7a49-4d89-8b1a-78ae947f2c49',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.303 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:23 compute-1 nova_compute[238822]: 2025-09-30 18:36:23.384 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:23 compute-1 ceph-mon[75484]: pgmap v1734: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 9.1 KiB/s wr, 1 op/s
Sep 30 18:36:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:36:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:36:23 compute-1 sudo[292204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:36:23 compute-1 sudo[292204]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:36:23 compute-1 sudo[292204]: pam_unix(sudo:session): session closed for user root
Sep 30 18:36:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:24.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:36:24 compute-1 nova_compute[238822]: 2025-09-30 18:36:24.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:24 compute-1 podman[292231]: 2025-09-30 18:36:24.547484613 +0000 UTC m=+0.082357228 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:36:24 compute-1 podman[292230]: 2025-09-30 18:36:24.597089543 +0000 UTC m=+0.142245456 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Sep 30 18:36:24 compute-1 nova_compute[238822]: 2025-09-30 18:36:24.698 2 DEBUG nova.network.neutron [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Port 23538fed-fc3c-4080-bbea-55e12668af3b updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:36:24 compute-1 nova_compute[238822]: 2025-09-30 18:36:24.714 2 DEBUG nova.compute.manager [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpufrp_vl0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='741d9cb1-7a49-4d89-8b1a-78ae947f2c49',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:36:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:25.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:25 compute-1 ceph-mon[75484]: pgmap v1735: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 9.1 KiB/s wr, 2 op/s
Sep 30 18:36:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:26.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:26 compute-1 systemd[1]: Starting libvirt proxy daemon...
Sep 30 18:36:26 compute-1 systemd[1]: Started libvirt proxy daemon.
Sep 30 18:36:26 compute-1 systemd[1]: Starting libvirt secret daemon...
Sep 30 18:36:26 compute-1 systemd[1]: Started libvirt secret daemon.
Sep 30 18:36:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:27.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:27 compute-1 kernel: tap23538fed-fc: entered promiscuous mode
Sep 30 18:36:27 compute-1 ovn_controller[135204]: 2025-09-30T18:36:27Z|00219|binding|INFO|Claiming lport 23538fed-fc3c-4080-bbea-55e12668af3b for this additional chassis.
Sep 30 18:36:27 compute-1 ovn_controller[135204]: 2025-09-30T18:36:27Z|00220|binding|INFO|23538fed-fc3c-4080-bbea-55e12668af3b: Claiming fa:16:3e:1a:4f:5c 10.100.0.10
Sep 30 18:36:27 compute-1 nova_compute[238822]: 2025-09-30 18:36:27.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:27 compute-1 NetworkManager[45549]: <info>  [1759257387.0678] manager: (tap23538fed-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.075 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:4f:5c 10.100.0.10'], port_security=['fa:16:3e:1a:4f:5c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '741d9cb1-7a49-4d89-8b1a-78ae947f2c49', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '10', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=23538fed-fc3c-4080-bbea-55e12668af3b) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.076 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 23538fed-fc3c-4080-bbea-55e12668af3b in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.079 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:36:27 compute-1 ovn_controller[135204]: 2025-09-30T18:36:27Z|00221|binding|INFO|Setting lport 23538fed-fc3c-4080-bbea-55e12668af3b ovn-installed in OVS
Sep 30 18:36:27 compute-1 nova_compute[238822]: 2025-09-30 18:36:27.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:27 compute-1 nova_compute[238822]: 2025-09-30 18:36:27.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.104 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f6553b4c-8756-4418-96d0-1a25991edb7c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.105 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6901f664-31 in ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.108 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6901f664-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.108 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7a82e23b-ee02-43cc-8331-3e4f21485514]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.110 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5d60fde7-96a7-4ff7-9c6d-9ae9013f798c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 systemd-udevd[292333]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:36:27 compute-1 systemd-machined[195911]: New machine qemu-20-instance-0000001b.
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.138 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[e244efb3-c018-4ee9-a9a6-eed17242d76b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 NetworkManager[45549]: <info>  [1759257387.1423] device (tap23538fed-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:36:27 compute-1 NetworkManager[45549]: <info>  [1759257387.1431] device (tap23538fed-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:36:27 compute-1 systemd[1]: Started Virtual Machine qemu-20-instance-0000001b.
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.160 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[018e25f4-f0e1-4c59-b5e9-40bbea6db54d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.210 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[322ffe1d-cf5a-4237-b964-edc7d0da9b32]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.217 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[76189862-e1d8-4310-bdda-15bca33ae4b0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 systemd-udevd[292337]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:36:27 compute-1 NetworkManager[45549]: <info>  [1759257387.2193] manager: (tap6901f664-30): new Veth device (/org/freedesktop/NetworkManager/Devices/85)
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.270 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[7cd40888-ce6d-4166-a7f1-38d69e121b2d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.276 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[897adb36-c5c0-4dea-be67-4fc8ba5cdf75]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 NetworkManager[45549]: <info>  [1759257387.3107] device (tap6901f664-30): carrier: link connected
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.323 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[fbce1ec7-3f67-41c1-a17b-bfc5e4c4952b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.351 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4174749e-42fa-40be-8e11-2bc1c2aaf8ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1522889, 'reachable_time': 17840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 292365, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.379 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e62ba867-e216-4d17-b732-5730f7909d9d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:412a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1522889, 'tstamp': 1522889}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 292366, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.403 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a7deaefa-cd1a-4311-9ef0-bbefda979779]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1522889, 'reachable_time': 17840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 292367, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.454 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3eb032-5276-4acd-b17d-0ae04f0693ed]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ceph-mon[75484]: pgmap v1736: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 8.1 KiB/s wr, 1 op/s
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.537 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[786e468b-5ea8-449a-9086-7937e5e98350]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.539 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.540 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.540 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:27 compute-1 kernel: tap6901f664-30: entered promiscuous mode
Sep 30 18:36:27 compute-1 nova_compute[238822]: 2025-09-30 18:36:27.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:27 compute-1 NetworkManager[45549]: <info>  [1759257387.5445] manager: (tap6901f664-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Sep 30 18:36:27 compute-1 nova_compute[238822]: 2025-09-30 18:36:27.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.549 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:27 compute-1 nova_compute[238822]: 2025-09-30 18:36:27.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:27 compute-1 ovn_controller[135204]: 2025-09-30T18:36:27Z|00222|binding|INFO|Releasing lport 5b6cbf18-1826-41d0-920f-e9db4f1a1832 from this chassis (sb_readonly=0)
Sep 30 18:36:27 compute-1 nova_compute[238822]: 2025-09-30 18:36:27.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.580 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[42135084-6b03-4f40-ba8f-d258c141d874]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.581 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.581 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.582 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 6901f664-336b-42d2-bbf7-58951befc8d1 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.582 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.583 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[66971b82-4cc5-40e6-95db-da32fccaaf52]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.584 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.585 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ec7b3dbe-ec56-41ba-9e5d-2622fe36dc62]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.585 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:36:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:27.587 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'env', 'PROCESS_TAG=haproxy-6901f664-336b-42d2-bbf7-58951befc8d1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6901f664-336b-42d2-bbf7-58951befc8d1.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:36:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:28.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:28 compute-1 podman[292441]: 2025-09-30 18:36:28.047822699 +0000 UTC m=+0.030693290 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:36:28 compute-1 nova_compute[238822]: 2025-09-30 18:36:28.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:28 compute-1 podman[292441]: 2025-09-30 18:36:28.454841681 +0000 UTC m=+0.437712272 container create 1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 18:36:28 compute-1 systemd[1]: Started libpod-conmon-1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757.scope.
Sep 30 18:36:28 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:36:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0772e2b18e3975d0f1277cc80688bf4685bbe30af3143d69f4936ba656426ca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:36:28 compute-1 podman[292441]: 2025-09-30 18:36:28.680131161 +0000 UTC m=+0.663001802 container init 1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 18:36:28 compute-1 podman[292441]: 2025-09-30 18:36:28.691163659 +0000 UTC m=+0.674034260 container start 1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 18:36:28 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[292457]: [NOTICE]   (292461) : New worker (292463) forked
Sep 30 18:36:28 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[292457]: [NOTICE]   (292461) : Loading success.
Sep 30 18:36:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:36:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:29.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:29 compute-1 nova_compute[238822]: 2025-09-30 18:36:29.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:29 compute-1 ceph-mon[75484]: pgmap v1737: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 8.1 KiB/s wr, 1 op/s
Sep 30 18:36:29 compute-1 nova_compute[238822]: 2025-09-30 18:36:29.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:29.743 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:36:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:29.745 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:36:29 compute-1 ovn_controller[135204]: 2025-09-30T18:36:29Z|00223|binding|INFO|Claiming lport 23538fed-fc3c-4080-bbea-55e12668af3b for this chassis.
Sep 30 18:36:29 compute-1 ovn_controller[135204]: 2025-09-30T18:36:29Z|00224|binding|INFO|23538fed-fc3c-4080-bbea-55e12668af3b: Claiming fa:16:3e:1a:4f:5c 10.100.0.10
Sep 30 18:36:29 compute-1 ovn_controller[135204]: 2025-09-30T18:36:29Z|00225|binding|INFO|Setting lport 23538fed-fc3c-4080-bbea-55e12668af3b up in Southbound
Sep 30 18:36:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:30.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:30 compute-1 nova_compute[238822]: 2025-09-30 18:36:30.881 2 INFO nova.compute.manager [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Post operation of migration started
Sep 30 18:36:30 compute-1 nova_compute[238822]: 2025-09-30 18:36:30.882 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:31.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:31 compute-1 nova_compute[238822]: 2025-09-30 18:36:31.550 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:31 compute-1 nova_compute[238822]: 2025-09-30 18:36:31.551 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:31 compute-1 ceph-mon[75484]: pgmap v1738: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 5.0 KiB/s rd, 10 KiB/s wr, 7 op/s
Sep 30 18:36:31 compute-1 podman[292476]: 2025-09-30 18:36:31.576495072 +0000 UTC m=+0.102188933 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Sep 30 18:36:31 compute-1 nova_compute[238822]: 2025-09-30 18:36:31.648 2 DEBUG oslo_concurrency.lockutils [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-741d9cb1-7a49-4d89-8b1a-78ae947f2c49" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:36:31 compute-1 nova_compute[238822]: 2025-09-30 18:36:31.649 2 DEBUG oslo_concurrency.lockutils [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-741d9cb1-7a49-4d89-8b1a-78ae947f2c49" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:36:31 compute-1 nova_compute[238822]: 2025-09-30 18:36:31.649 2 DEBUG nova.network.neutron [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:36:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:32.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:32 compute-1 nova_compute[238822]: 2025-09-30 18:36:32.157 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:32 compute-1 nova_compute[238822]: 2025-09-30 18:36:32.951 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:36:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:33.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:36:33 compute-1 nova_compute[238822]: 2025-09-30 18:36:33.119 2 DEBUG nova.network.neutron [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Updating instance_info_cache with network_info: [{"id": "23538fed-fc3c-4080-bbea-55e12668af3b", "address": "fa:16:3e:1a:4f:5c", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23538fed-fc", "ovs_interfaceid": "23538fed-fc3c-4080-bbea-55e12668af3b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:36:33 compute-1 nova_compute[238822]: 2025-09-30 18:36:33.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:33 compute-1 ceph-mon[75484]: pgmap v1739: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 2.3 KiB/s wr, 5 op/s
Sep 30 18:36:33 compute-1 nova_compute[238822]: 2025-09-30 18:36:33.626 2 DEBUG oslo_concurrency.lockutils [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-741d9cb1-7a49-4d89-8b1a-78ae947f2c49" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:36:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:34.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:36:34 compute-1 nova_compute[238822]: 2025-09-30 18:36:34.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:34 compute-1 nova_compute[238822]: 2025-09-30 18:36:34.145 2 DEBUG oslo_concurrency.lockutils [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:36:34 compute-1 nova_compute[238822]: 2025-09-30 18:36:34.145 2 DEBUG oslo_concurrency.lockutils [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:36:34 compute-1 nova_compute[238822]: 2025-09-30 18:36:34.146 2 DEBUG oslo_concurrency.lockutils [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:36:34 compute-1 nova_compute[238822]: 2025-09-30 18:36:34.152 2 INFO nova.virt.libvirt.driver [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:36:34 compute-1 virtqemud[239124]: Domain id=20 name='instance-0000001b' uuid=741d9cb1-7a49-4d89-8b1a-78ae947f2c49 is tainted: custom-monitor
Sep 30 18:36:34 compute-1 ceph-mon[75484]: pgmap v1740: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 2.3 KiB/s wr, 6 op/s
Sep 30 18:36:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:35.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:35 compute-1 nova_compute[238822]: 2025-09-30 18:36:35.163 2 INFO nova.virt.libvirt.driver [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:36:35 compute-1 podman[249638]: time="2025-09-30T18:36:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:36:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:36:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:36:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:36:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8822 "" "Go-http-client/1.1"
Sep 30 18:36:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:36.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #94. Immutable memtables: 0.
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.083455) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 57] Flushing memtable with next log file: 94
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257396083541, "job": 57, "event": "flush_started", "num_memtables": 1, "num_entries": 405, "num_deletes": 251, "total_data_size": 488630, "memory_usage": 496256, "flush_reason": "Manual Compaction"}
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 57] Level-0 flush table #95: started
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257396088690, "cf_name": "default", "job": 57, "event": "table_file_creation", "file_number": 95, "file_size": 305175, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 48293, "largest_seqno": 48693, "table_properties": {"data_size": 302795, "index_size": 480, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6329, "raw_average_key_size": 20, "raw_value_size": 298082, "raw_average_value_size": 964, "num_data_blocks": 20, "num_entries": 309, "num_filter_entries": 309, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759257381, "oldest_key_time": 1759257381, "file_creation_time": 1759257396, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 95, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 57] Flush lasted 5277 microseconds, and 2735 cpu microseconds.
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.088745) [db/flush_job.cc:967] [default] [JOB 57] Level-0 flush table #95: 305175 bytes OK
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.088770) [db/memtable_list.cc:519] [default] Level-0 commit table #95 started
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.090516) [db/memtable_list.cc:722] [default] Level-0 commit table #95: memtable #1 done
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.090537) EVENT_LOG_v1 {"time_micros": 1759257396090530, "job": 57, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.090561) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 57] Try to delete WAL files size 486024, prev total WAL file size 486024, number of live WAL files 2.
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000091.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.091403) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031353034' seq:72057594037927935, type:22 .. '6D6772737461740031373536' seq:0, type:0; will stop at (end)
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 58] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 57 Base level 0, inputs: [95(298KB)], [93(13MB)]
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257396091502, "job": 58, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [95], "files_L6": [93], "score": -1, "input_data_size": 14933867, "oldest_snapshot_seqno": -1}
Sep 30 18:36:36 compute-1 nova_compute[238822]: 2025-09-30 18:36:36.172 2 INFO nova.virt.libvirt.driver [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:36:36 compute-1 nova_compute[238822]: 2025-09-30 18:36:36.181 2 DEBUG nova.compute.manager [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 58] Generated table #96: 6950 keys, 11092359 bytes, temperature: kUnknown
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257396183412, "cf_name": "default", "job": 58, "event": "table_file_creation", "file_number": 96, "file_size": 11092359, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11050905, "index_size": 22974, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17413, "raw_key_size": 181991, "raw_average_key_size": 26, "raw_value_size": 10931092, "raw_average_value_size": 1572, "num_data_blocks": 899, "num_entries": 6950, "num_filter_entries": 6950, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759257396, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 96, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.183827) [db/compaction/compaction_job.cc:1663] [default] [JOB 58] Compacted 1@0 + 1@6 files to L6 => 11092359 bytes
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.185619) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.3 rd, 120.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 14.0 +0.0 blob) out(10.6 +0.0 blob), read-write-amplify(85.3) write-amplify(36.3) OK, records in: 7462, records dropped: 512 output_compression: NoCompression
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.185675) EVENT_LOG_v1 {"time_micros": 1759257396185658, "job": 58, "event": "compaction_finished", "compaction_time_micros": 92011, "compaction_time_cpu_micros": 53607, "output_level": 6, "num_output_files": 1, "total_output_size": 11092359, "num_input_records": 7462, "num_output_records": 6950, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000095.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257396185945, "job": 58, "event": "table_file_deletion", "file_number": 95}
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000093.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257396190377, "job": 58, "event": "table_file_deletion", "file_number": 93}
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.091236) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.190443) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.190453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.190456) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.190459) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:36:36 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:36:36.190462) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:36:36 compute-1 nova_compute[238822]: 2025-09-30 18:36:36.697 2 DEBUG nova.objects.instance [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:36:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:37.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:37 compute-1 ceph-mon[75484]: pgmap v1741: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 2.3 KiB/s wr, 6 op/s
Sep 30 18:36:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1701818751' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:36:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1701818751' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:36:37 compute-1 podman[292502]: 2025-09-30 18:36:37.549672031 +0000 UTC m=+0.087688851 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:36:37 compute-1 podman[292503]: 2025-09-30 18:36:37.592384596 +0000 UTC m=+0.124886747 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Sep 30 18:36:37 compute-1 podman[292504]: 2025-09-30 18:36:37.605933942 +0000 UTC m=+0.133133420 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20250930)
Sep 30 18:36:37 compute-1 nova_compute[238822]: 2025-09-30 18:36:37.720 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:37 compute-1 sudo[292560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:36:37 compute-1 sudo[292560]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:36:37 compute-1 sudo[292560]: pam_unix(sudo:session): session closed for user root
Sep 30 18:36:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:38.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:36:38 compute-1 nova_compute[238822]: 2025-09-30 18:36:38.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:38 compute-1 nova_compute[238822]: 2025-09-30 18:36:38.597 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:38 compute-1 nova_compute[238822]: 2025-09-30 18:36:38.598 2 WARNING neutronclient.v2_0.client [None req-1f1ec937-23ba-4f7b-a3e8-b3b26a415b42 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:38.750 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:36:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:39.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:39 compute-1 nova_compute[238822]: 2025-09-30 18:36:39.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:39 compute-1 ceph-mon[75484]: pgmap v1742: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 2.3 KiB/s wr, 6 op/s
Sep 30 18:36:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:40.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/940849169' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:36:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:41.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:41 compute-1 ceph-mon[75484]: pgmap v1743: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 2.3 KiB/s wr, 6 op/s
Sep 30 18:36:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:42.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:43.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:43 compute-1 nova_compute[238822]: 2025-09-30 18:36:43.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:43 compute-1 ceph-mon[75484]: pgmap v1744: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:36:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:36:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:44.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:44 compute-1 nova_compute[238822]: 2025-09-30 18:36:44.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:36:44 compute-1 nova_compute[238822]: 2025-09-30 18:36:44.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:36:44 compute-1 nova_compute[238822]: 2025-09-30 18:36:44.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1619482483' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:36:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:45.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:45 compute-1 ceph-mon[75484]: pgmap v1745: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 8.1 KiB/s wr, 2 op/s
Sep 30 18:36:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:46.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:46 compute-1 unix_chkpwd[292596]: password check failed for user (root)
Sep 30 18:36:46 compute-1 sshd-session[292592]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:36:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:47 compute-1 nova_compute[238822]: 2025-09-30 18:36:47.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:36:47 compute-1 nova_compute[238822]: 2025-09-30 18:36:47.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:36:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:47.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:47 compute-1 nova_compute[238822]: 2025-09-30 18:36:47.410 2 DEBUG nova.compute.manager [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjbxj26bw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='342a3981-de33-491a-974b-5566045fba97',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:36:47 compute-1 ceph-mon[75484]: pgmap v1746: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 8.1 KiB/s wr, 2 op/s
Sep 30 18:36:47 compute-1 nova_compute[238822]: 2025-09-30 18:36:47.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:36:47 compute-1 nova_compute[238822]: 2025-09-30 18:36:47.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:36:47 compute-1 nova_compute[238822]: 2025-09-30 18:36:47.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:36:47 compute-1 nova_compute[238822]: 2025-09-30 18:36:47.573 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:36:47 compute-1 nova_compute[238822]: 2025-09-30 18:36:47.573 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:36:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:36:48 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4262819077' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:36:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:48.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:48 compute-1 nova_compute[238822]: 2025-09-30 18:36:48.056 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:36:48 compute-1 nova_compute[238822]: 2025-09-30 18:36:48.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:48 compute-1 sshd-session[292592]: Failed password for root from 192.210.160.141 port 47432 ssh2
Sep 30 18:36:48 compute-1 nova_compute[238822]: 2025-09-30 18:36:48.428 2 DEBUG oslo_concurrency.lockutils [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-342a3981-de33-491a-974b-5566045fba97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:36:48 compute-1 nova_compute[238822]: 2025-09-30 18:36:48.428 2 DEBUG oslo_concurrency.lockutils [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-342a3981-de33-491a-974b-5566045fba97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:36:48 compute-1 nova_compute[238822]: 2025-09-30 18:36:48.429 2 DEBUG nova.network.neutron [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:36:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4262819077' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:36:48 compute-1 nova_compute[238822]: 2025-09-30 18:36:48.937 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:36:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:49.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:49 compute-1 nova_compute[238822]: 2025-09-30 18:36:49.102 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000001b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:36:49 compute-1 nova_compute[238822]: 2025-09-30 18:36:49.103 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000001b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:36:49 compute-1 nova_compute[238822]: 2025-09-30 18:36:49.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:49 compute-1 nova_compute[238822]: 2025-09-30 18:36:49.321 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:36:49 compute-1 nova_compute[238822]: 2025-09-30 18:36:49.323 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:36:49 compute-1 nova_compute[238822]: 2025-09-30 18:36:49.358 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:36:49 compute-1 nova_compute[238822]: 2025-09-30 18:36:49.359 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4502MB free_disk=39.9011344909668GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:36:49 compute-1 nova_compute[238822]: 2025-09-30 18:36:49.359 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:36:49 compute-1 nova_compute[238822]: 2025-09-30 18:36:49.360 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:36:49 compute-1 sshd-session[292592]: Connection closed by authenticating user root 192.210.160.141 port 47432 [preauth]
Sep 30 18:36:49 compute-1 openstack_network_exporter[251957]: ERROR   18:36:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:36:49 compute-1 openstack_network_exporter[251957]: ERROR   18:36:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:36:49 compute-1 openstack_network_exporter[251957]: ERROR   18:36:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:36:49 compute-1 openstack_network_exporter[251957]: ERROR   18:36:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:36:49 compute-1 openstack_network_exporter[251957]: ERROR   18:36:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:36:49 compute-1 ceph-mon[75484]: pgmap v1747: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 8.1 KiB/s wr, 2 op/s
Sep 30 18:36:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1350924747' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:36:49 compute-1 nova_compute[238822]: 2025-09-30 18:36:49.764 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:49 compute-1 nova_compute[238822]: 2025-09-30 18:36:49.977 2 DEBUG nova.network.neutron [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Updating instance_info_cache with network_info: [{"id": "f05039eb-b7e1-4072-bc17-63c6787538a1", "address": "fa:16:3e:84:d4:b5", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf05039eb-b7", "ovs_interfaceid": "f05039eb-b7e1-4072-bc17-63c6787538a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:36:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:50.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.484 2 DEBUG oslo_concurrency.lockutils [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-342a3981-de33-491a-974b-5566045fba97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.501 2 DEBUG nova.virt.libvirt.driver [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjbxj26bw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='342a3981-de33-491a-974b-5566045fba97',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.502 2 DEBUG nova.virt.libvirt.driver [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Creating instance directory: /var/lib/nova/instances/342a3981-de33-491a-974b-5566045fba97 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.502 2 DEBUG nova.virt.libvirt.driver [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Ensure instance console log exists: /var/lib/nova/instances/342a3981-de33-491a-974b-5566045fba97/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.503 2 DEBUG nova.virt.libvirt.driver [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.505 2 DEBUG nova.virt.libvirt.vif [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:34:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2049754641',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2049754641',id=26,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:35:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-ryu3h0vp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:35:15Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=342a3981-de33-491a-974b-5566045fba97,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f05039eb-b7e1-4072-bc17-63c6787538a1", "address": "fa:16:3e:84:d4:b5", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf05039eb-b7", "ovs_interfaceid": "f05039eb-b7e1-4072-bc17-63c6787538a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.505 2 DEBUG nova.network.os_vif_util [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "f05039eb-b7e1-4072-bc17-63c6787538a1", "address": "fa:16:3e:84:d4:b5", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf05039eb-b7", "ovs_interfaceid": "f05039eb-b7e1-4072-bc17-63c6787538a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.506 2 DEBUG nova.network.os_vif_util [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:d4:b5,bridge_name='br-int',has_traffic_filtering=True,id=f05039eb-b7e1-4072-bc17-63c6787538a1,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf05039eb-b7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.507 2 DEBUG os_vif [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:d4:b5,bridge_name='br-int',has_traffic_filtering=True,id=f05039eb-b7e1-4072-bc17-63c6787538a1,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf05039eb-b7') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.508 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.509 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.511 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '4fd2d94c-8564-5ce6-b996-1a987f922b83', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.519 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf05039eb-b7, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.520 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf05039eb-b7, col_values=(('qos', UUID('168b4cbe-8585-4f7f-948d-b951417a6be9')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.521 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf05039eb-b7, col_values=(('external_ids', {'iface-id': 'f05039eb-b7e1-4072-bc17-63c6787538a1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:d4:b5', 'vm-uuid': '342a3981-de33-491a-974b-5566045fba97'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:50 compute-1 NetworkManager[45549]: <info>  [1759257410.5256] manager: (tapf05039eb-b7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.535 2 INFO os_vif [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:d4:b5,bridge_name='br-int',has_traffic_filtering=True,id=f05039eb-b7e1-4072-bc17-63c6787538a1,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf05039eb-b7')
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.536 2 DEBUG nova.virt.libvirt.driver [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.536 2 DEBUG nova.compute.manager [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjbxj26bw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='342a3981-de33-491a-974b-5566045fba97',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.537 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:50 compute-1 nova_compute[238822]: 2025-09-30 18:36:50.892 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Migration for instance 342a3981-de33-491a-974b-5566045fba97 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 18:36:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:51.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:51 compute-1 nova_compute[238822]: 2025-09-30 18:36:51.404 2 INFO nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 342a3981-de33-491a-974b-5566045fba97] Updating resource usage from migration 3b051e9f-cf53-41ea-a9f1-d01148892c63
Sep 30 18:36:51 compute-1 nova_compute[238822]: 2025-09-30 18:36:51.404 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 342a3981-de33-491a-974b-5566045fba97] Starting to track incoming migration 3b051e9f-cf53-41ea-a9f1-d01148892c63 with flavor c83dc7f1-0795-47db-adcb-fb90be11684a _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 18:36:51 compute-1 ceph-mon[75484]: pgmap v1748: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 1.1 KiB/s rd, 8.1 KiB/s wr, 2 op/s
Sep 30 18:36:51 compute-1 nova_compute[238822]: 2025-09-30 18:36:51.583 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:36:51 compute-1 nova_compute[238822]: 2025-09-30 18:36:51.951 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 741d9cb1-7a49-4d89-8b1a-78ae947f2c49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:36:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:52.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:52 compute-1 nova_compute[238822]: 2025-09-30 18:36:52.461 2 WARNING nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 342a3981-de33-491a-974b-5566045fba97 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 18:36:52 compute-1 nova_compute[238822]: 2025-09-30 18:36:52.462 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:36:52 compute-1 nova_compute[238822]: 2025-09-30 18:36:52.462 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=39GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:36:49 up  4:14,  0 user,  load average: 0.47, 0.42, 0.57\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_c634e1c17ed54907969576a0eb8eff50': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:36:52 compute-1 nova_compute[238822]: 2025-09-30 18:36:52.519 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:36:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2468757023' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:36:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:36:52 compute-1 unix_chkpwd[292633]: password check failed for user (root)
Sep 30 18:36:52 compute-1 sshd-session[292629]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201  user=root
Sep 30 18:36:52 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:36:52 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2102236554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:36:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:53 compute-1 nova_compute[238822]: 2025-09-30 18:36:53.002 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:36:53 compute-1 nova_compute[238822]: 2025-09-30 18:36:53.010 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:36:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:53.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:53 compute-1 nova_compute[238822]: 2025-09-30 18:36:53.522 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:36:53 compute-1 ceph-mon[75484]: pgmap v1749: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 852 B/s rd, 8.1 KiB/s wr, 2 op/s
Sep 30 18:36:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2102236554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:36:53 compute-1 nova_compute[238822]: 2025-09-30 18:36:53.570 2 DEBUG nova.network.neutron [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Port f05039eb-b7e1-4072-bc17-63c6787538a1 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:36:53 compute-1 nova_compute[238822]: 2025-09-30 18:36:53.584 2 DEBUG nova.compute.manager [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjbxj26bw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='342a3981-de33-491a-974b-5566045fba97',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:36:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:54 compute-1 nova_compute[238822]: 2025-09-30 18:36:54.034 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:36:54 compute-1 nova_compute[238822]: 2025-09-30 18:36:54.035 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.675s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:36:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:36:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:54.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:54 compute-1 nova_compute[238822]: 2025-09-30 18:36:54.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:54.403 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:36:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:54.403 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:36:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:54.404 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:36:54 compute-1 sshd-session[292629]: Failed password for root from 8.243.64.201 port 35584 ssh2
Sep 30 18:36:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:55.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:55 compute-1 sshd-session[292629]: Received disconnect from 8.243.64.201 port 35584:11: Bye Bye [preauth]
Sep 30 18:36:55 compute-1 sshd-session[292629]: Disconnected from authenticating user root 8.243.64.201 port 35584 [preauth]
Sep 30 18:36:55 compute-1 nova_compute[238822]: 2025-09-30 18:36:55.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:55 compute-1 podman[292660]: 2025-09-30 18:36:55.555555404 +0000 UTC m=+0.089668104 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:36:55 compute-1 ceph-mon[75484]: pgmap v1750: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 1.1 KiB/s rd, 8.1 KiB/s wr, 2 op/s
Sep 30 18:36:55 compute-1 podman[292659]: 2025-09-30 18:36:55.599944574 +0000 UTC m=+0.137445886 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20250930, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:36:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:56.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:56 compute-1 kernel: tapf05039eb-b7: entered promiscuous mode
Sep 30 18:36:56 compute-1 NetworkManager[45549]: <info>  [1759257416.1904] manager: (tapf05039eb-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Sep 30 18:36:56 compute-1 ovn_controller[135204]: 2025-09-30T18:36:56Z|00226|binding|INFO|Claiming lport f05039eb-b7e1-4072-bc17-63c6787538a1 for this additional chassis.
Sep 30 18:36:56 compute-1 ovn_controller[135204]: 2025-09-30T18:36:56Z|00227|binding|INFO|f05039eb-b7e1-4072-bc17-63c6787538a1: Claiming fa:16:3e:84:d4:b5 10.100.0.7
Sep 30 18:36:56 compute-1 nova_compute[238822]: 2025-09-30 18:36:56.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.205 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:d4:b5 10.100.0.7'], port_security=['fa:16:3e:84:d4:b5 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '342a3981-de33-491a-974b-5566045fba97', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '10', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=f05039eb-b7e1-4072-bc17-63c6787538a1) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.207 144543 INFO neutron.agent.ovn.metadata.agent [-] Port f05039eb-b7e1-4072-bc17-63c6787538a1 in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.209 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:36:56 compute-1 ovn_controller[135204]: 2025-09-30T18:36:56Z|00228|binding|INFO|Setting lport f05039eb-b7e1-4072-bc17-63c6787538a1 ovn-installed in OVS
Sep 30 18:36:56 compute-1 nova_compute[238822]: 2025-09-30 18:36:56.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:56 compute-1 nova_compute[238822]: 2025-09-30 18:36:56.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.233 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a215fda4-c93e-4178-bf11-7d8aa834c149]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:56 compute-1 systemd-machined[195911]: New machine qemu-21-instance-0000001a.
Sep 30 18:36:56 compute-1 systemd[1]: Started Virtual Machine qemu-21-instance-0000001a.
Sep 30 18:36:56 compute-1 systemd-udevd[292726]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.282 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[f39988b9-1f26-45d8-b928-6fe1997acb50]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.288 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[95c01af2-16f8-40f9-8be2-28dd14534c60]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:56 compute-1 NetworkManager[45549]: <info>  [1759257416.2979] device (tapf05039eb-b7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:36:56 compute-1 NetworkManager[45549]: <info>  [1759257416.3004] device (tapf05039eb-b7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.336 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[7892be01-a075-42c9-9824-14ca3b68adb9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.361 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7f95c7f7-d9ff-44ba-9dc1-4cb135a4b7e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1522889, 'reachable_time': 17840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 292734, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.394 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[75806a40-d139-48fb-b61a-ee081fa00d60]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1522906, 'tstamp': 1522906}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 292737, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1522910, 'tstamp': 1522910}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 292737, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.396 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:56 compute-1 nova_compute[238822]: 2025-09-30 18:36:56.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.445 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.446 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.446 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.446 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:36:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:36:56.448 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[79bb5f41-9ca6-4d96-9e33-f1dd8de4f9bc]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:36:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:57 compute-1 nova_compute[238822]: 2025-09-30 18:36:57.036 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:36:57 compute-1 nova_compute[238822]: 2025-09-30 18:36:57.037 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:36:57 compute-1 nova_compute[238822]: 2025-09-30 18:36:57.037 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:36:57 compute-1 nova_compute[238822]: 2025-09-30 18:36:57.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:36:57 compute-1 nova_compute[238822]: 2025-09-30 18:36:57.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:36:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:57.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:57 compute-1 ceph-mon[75484]: pgmap v1751: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:36:57 compute-1 sudo[292782]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:36:57 compute-1 sudo[292782]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:36:57 compute-1 sudo[292782]: pam_unix(sudo:session): session closed for user root
Sep 30 18:36:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:36:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:36:58.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:36:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1471880056' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:36:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1471880056' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:36:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:36:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:36:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:36:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:36:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:36:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:36:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:36:59.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:36:59 compute-1 nova_compute[238822]: 2025-09-30 18:36:59.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:36:59 compute-1 ceph-mon[75484]: pgmap v1752: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:37:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:00.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:00 compute-1 nova_compute[238822]: 2025-09-30 18:37:00.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:00 compute-1 ovn_controller[135204]: 2025-09-30T18:37:00Z|00229|binding|INFO|Claiming lport f05039eb-b7e1-4072-bc17-63c6787538a1 for this chassis.
Sep 30 18:37:00 compute-1 ovn_controller[135204]: 2025-09-30T18:37:00Z|00230|binding|INFO|f05039eb-b7e1-4072-bc17-63c6787538a1: Claiming fa:16:3e:84:d4:b5 10.100.0.7
Sep 30 18:37:00 compute-1 ovn_controller[135204]: 2025-09-30T18:37:00Z|00231|binding|INFO|Setting lport f05039eb-b7e1-4072-bc17-63c6787538a1 up in Southbound
Sep 30 18:37:00 compute-1 ceph-mon[75484]: pgmap v1753: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 5.0 KiB/s rd, 170 B/s wr, 5 op/s
Sep 30 18:37:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:01.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:01 compute-1 nova_compute[238822]: 2025-09-30 18:37:01.709 2 INFO nova.compute.manager [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Post operation of migration started
Sep 30 18:37:01 compute-1 nova_compute[238822]: 2025-09-30 18:37:01.710 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:37:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:02.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:02 compute-1 podman[292812]: 2025-09-30 18:37:02.572238159 +0000 UTC m=+0.099454619 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 18:37:02 compute-1 nova_compute[238822]: 2025-09-30 18:37:02.609 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:37:02 compute-1 nova_compute[238822]: 2025-09-30 18:37:02.610 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:37:02 compute-1 nova_compute[238822]: 2025-09-30 18:37:02.714 2 DEBUG oslo_concurrency.lockutils [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-342a3981-de33-491a-974b-5566045fba97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:37:02 compute-1 nova_compute[238822]: 2025-09-30 18:37:02.714 2 DEBUG oslo_concurrency.lockutils [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-342a3981-de33-491a-974b-5566045fba97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:37:02 compute-1 nova_compute[238822]: 2025-09-30 18:37:02.715 2 DEBUG nova.network.neutron [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:37:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:37:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:03.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:37:03 compute-1 nova_compute[238822]: 2025-09-30 18:37:03.223 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:37:03 compute-1 ceph-mon[75484]: pgmap v1754: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 170 B/s wr, 5 op/s
Sep 30 18:37:03 compute-1 nova_compute[238822]: 2025-09-30 18:37:03.994 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:37:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:37:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:04.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:04 compute-1 nova_compute[238822]: 2025-09-30 18:37:04.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:04 compute-1 nova_compute[238822]: 2025-09-30 18:37:04.133 2 DEBUG nova.network.neutron [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Updating instance_info_cache with network_info: [{"id": "f05039eb-b7e1-4072-bc17-63c6787538a1", "address": "fa:16:3e:84:d4:b5", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf05039eb-b7", "ovs_interfaceid": "f05039eb-b7e1-4072-bc17-63c6787538a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:37:04 compute-1 nova_compute[238822]: 2025-09-30 18:37:04.642 2 DEBUG oslo_concurrency.lockutils [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-342a3981-de33-491a-974b-5566045fba97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:37:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:05.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:05 compute-1 nova_compute[238822]: 2025-09-30 18:37:05.170 2 DEBUG oslo_concurrency.lockutils [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:05 compute-1 nova_compute[238822]: 2025-09-30 18:37:05.171 2 DEBUG oslo_concurrency.lockutils [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:05 compute-1 nova_compute[238822]: 2025-09-30 18:37:05.171 2 DEBUG oslo_concurrency.lockutils [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:05 compute-1 nova_compute[238822]: 2025-09-30 18:37:05.178 2 INFO nova.virt.libvirt.driver [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:37:05 compute-1 virtqemud[239124]: Domain id=21 name='instance-0000001a' uuid=342a3981-de33-491a-974b-5566045fba97 is tainted: custom-monitor
Sep 30 18:37:05 compute-1 ceph-mon[75484]: pgmap v1755: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 170 B/s wr, 6 op/s
Sep 30 18:37:05 compute-1 nova_compute[238822]: 2025-09-30 18:37:05.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:05 compute-1 podman[249638]: time="2025-09-30T18:37:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:37:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:37:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:37:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:37:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8823 "" "Go-http-client/1.1"
Sep 30 18:37:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:06.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:06 compute-1 nova_compute[238822]: 2025-09-30 18:37:06.192 2 INFO nova.virt.libvirt.driver [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:37:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:07.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:07 compute-1 nova_compute[238822]: 2025-09-30 18:37:07.200 2 INFO nova.virt.libvirt.driver [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:37:07 compute-1 nova_compute[238822]: 2025-09-30 18:37:07.208 2 DEBUG nova.compute.manager [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:37:07 compute-1 ceph-mon[75484]: pgmap v1756: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 170 B/s wr, 6 op/s
Sep 30 18:37:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:37:07 compute-1 nova_compute[238822]: 2025-09-30 18:37:07.721 2 DEBUG nova.objects.instance [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:37:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:08.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:08 compute-1 podman[292839]: 2025-09-30 18:37:08.55594957 +0000 UTC m=+0.085121992 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid)
Sep 30 18:37:08 compute-1 podman[292846]: 2025-09-30 18:37:08.570920135 +0000 UTC m=+0.080529518 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Sep 30 18:37:08 compute-1 podman[292840]: 2025-09-30 18:37:08.595114029 +0000 UTC m=+0.117424595 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 18:37:08 compute-1 nova_compute[238822]: 2025-09-30 18:37:08.746 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:37:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:37:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:09.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:09 compute-1 nova_compute[238822]: 2025-09-30 18:37:09.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:09 compute-1 nova_compute[238822]: 2025-09-30 18:37:09.484 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:37:09 compute-1 nova_compute[238822]: 2025-09-30 18:37:09.485 2 WARNING neutronclient.v2_0.client [None req-1f634a2c-f1ad-4186-836f-397e55ef424a 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:37:09 compute-1 ceph-mon[75484]: pgmap v1757: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 170 B/s wr, 6 op/s
Sep 30 18:37:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:10.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:10 compute-1 nova_compute[238822]: 2025-09-30 18:37:10.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:11.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:11 compute-1 ceph-mon[75484]: pgmap v1758: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 170 B/s wr, 6 op/s
Sep 30 18:37:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/633928403' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:37:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:37:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:12.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:37:12 compute-1 sshd-session[292898]: Invalid user ftpuser1 from 192.210.160.141 port 59946
Sep 30 18:37:12 compute-1 sshd-session[292898]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:37:12 compute-1 sshd-session[292898]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:37:12 compute-1 nova_compute[238822]: 2025-09-30 18:37:12.528 2 DEBUG oslo_concurrency.lockutils [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "741d9cb1-7a49-4d89-8b1a-78ae947f2c49" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:12 compute-1 nova_compute[238822]: 2025-09-30 18:37:12.529 2 DEBUG oslo_concurrency.lockutils [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "741d9cb1-7a49-4d89-8b1a-78ae947f2c49" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:12 compute-1 nova_compute[238822]: 2025-09-30 18:37:12.529 2 DEBUG oslo_concurrency.lockutils [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "741d9cb1-7a49-4d89-8b1a-78ae947f2c49-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:12 compute-1 nova_compute[238822]: 2025-09-30 18:37:12.529 2 DEBUG oslo_concurrency.lockutils [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "741d9cb1-7a49-4d89-8b1a-78ae947f2c49-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:12 compute-1 nova_compute[238822]: 2025-09-30 18:37:12.530 2 DEBUG oslo_concurrency.lockutils [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "741d9cb1-7a49-4d89-8b1a-78ae947f2c49-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:12 compute-1 nova_compute[238822]: 2025-09-30 18:37:12.544 2 INFO nova.compute.manager [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Terminating instance
Sep 30 18:37:12 compute-1 ceph-mon[75484]: pgmap v1759: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:37:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.064 2 DEBUG nova.compute.manager [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:37:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:13.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:13 compute-1 kernel: tap23538fed-fc (unregistering): left promiscuous mode
Sep 30 18:37:13 compute-1 NetworkManager[45549]: <info>  [1759257433.1573] device (tap23538fed-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:37:13 compute-1 ovn_controller[135204]: 2025-09-30T18:37:13Z|00232|binding|INFO|Releasing lport 23538fed-fc3c-4080-bbea-55e12668af3b from this chassis (sb_readonly=0)
Sep 30 18:37:13 compute-1 ovn_controller[135204]: 2025-09-30T18:37:13Z|00233|binding|INFO|Setting lport 23538fed-fc3c-4080-bbea-55e12668af3b down in Southbound
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:13 compute-1 ovn_controller[135204]: 2025-09-30T18:37:13Z|00234|binding|INFO|Removing iface tap23538fed-fc ovn-installed in OVS
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.182 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:4f:5c 10.100.0.10'], port_security=['fa:16:3e:1a:4f:5c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '741d9cb1-7a49-4d89-8b1a-78ae947f2c49', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '15', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=23538fed-fc3c-4080-bbea-55e12668af3b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.184 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 23538fed-fc3c-4080-bbea-55e12668af3b in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.187 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6901f664-336b-42d2-bbf7-58951befc8d1
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.218 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[58294592-8623-4444-8f4b-f0c988306ac0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:13 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Sep 30 18:37:13 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001b.scope: Consumed 3.802s CPU time.
Sep 30 18:37:13 compute-1 systemd-machined[195911]: Machine qemu-20-instance-0000001b terminated.
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.276 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[046eb7e0-30e4-4043-8385-73434cd943df]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.283 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[a92d4122-6a05-4230-b4ec-4e562c4d6fe8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.324 2 INFO nova.virt.libvirt.driver [-] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Instance destroyed successfully.
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.325 2 DEBUG nova.objects.instance [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lazy-loading 'resources' on Instance uuid 741d9cb1-7a49-4d89-8b1a-78ae947f2c49 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.344 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[97f4c61d-74fd-4c8e-8738-f23566df8251]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.375 2 DEBUG nova.compute.manager [req-aef29d65-8401-4a8a-ac76-952a80a56a9d req-c55cc8fb-2c2e-40bf-9a8c-a3b363b7be05 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Received event network-vif-unplugged-23538fed-fc3c-4080-bbea-55e12668af3b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.376 2 DEBUG oslo_concurrency.lockutils [req-aef29d65-8401-4a8a-ac76-952a80a56a9d req-c55cc8fb-2c2e-40bf-9a8c-a3b363b7be05 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "741d9cb1-7a49-4d89-8b1a-78ae947f2c49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.376 2 DEBUG oslo_concurrency.lockutils [req-aef29d65-8401-4a8a-ac76-952a80a56a9d req-c55cc8fb-2c2e-40bf-9a8c-a3b363b7be05 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "741d9cb1-7a49-4d89-8b1a-78ae947f2c49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.377 2 DEBUG oslo_concurrency.lockutils [req-aef29d65-8401-4a8a-ac76-952a80a56a9d req-c55cc8fb-2c2e-40bf-9a8c-a3b363b7be05 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "741d9cb1-7a49-4d89-8b1a-78ae947f2c49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.377 2 DEBUG nova.compute.manager [req-aef29d65-8401-4a8a-ac76-952a80a56a9d req-c55cc8fb-2c2e-40bf-9a8c-a3b363b7be05 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] No waiting events found dispatching network-vif-unplugged-23538fed-fc3c-4080-bbea-55e12668af3b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.377 2 DEBUG nova.compute.manager [req-aef29d65-8401-4a8a-ac76-952a80a56a9d req-c55cc8fb-2c2e-40bf-9a8c-a3b363b7be05 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Received event network-vif-unplugged-23538fed-fc3c-4080-bbea-55e12668af3b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.383 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[062ec2c6-5034-47ed-b3cb-3af44b599f92]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6901f664-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:41:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1522889, 'reachable_time': 17840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 292926, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.415 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2f46368d-0815-4fa6-841d-1fbf2f241570]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1522906, 'tstamp': 1522906}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 292927, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6901f664-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1522910, 'tstamp': 1522910}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 292927, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.417 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.427 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6901f664-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.428 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.428 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6901f664-30, col_values=(('external_ids', {'iface-id': '5b6cbf18-1826-41d0-920f-e9db4f1a1832'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.429 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:37:13 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:13.431 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b4e92d5d-9e94-4a8a-b017-5eb48be3aaea]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-6901f664-336b-42d2-bbf7-58951befc8d1\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 6901f664-336b-42d2-bbf7-58951befc8d1\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:13 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1208745743' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.833 2 DEBUG nova.virt.libvirt.vif [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:35:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1259001093',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1259001093',id=27,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:35:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-9g37rry3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:36:37Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=741d9cb1-7a49-4d89-8b1a-78ae947f2c49,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "23538fed-fc3c-4080-bbea-55e12668af3b", "address": "fa:16:3e:1a:4f:5c", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23538fed-fc", "ovs_interfaceid": "23538fed-fc3c-4080-bbea-55e12668af3b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.834 2 DEBUG nova.network.os_vif_util [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "23538fed-fc3c-4080-bbea-55e12668af3b", "address": "fa:16:3e:1a:4f:5c", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23538fed-fc", "ovs_interfaceid": "23538fed-fc3c-4080-bbea-55e12668af3b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.836 2 DEBUG nova.network.os_vif_util [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1a:4f:5c,bridge_name='br-int',has_traffic_filtering=True,id=23538fed-fc3c-4080-bbea-55e12668af3b,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23538fed-fc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.836 2 DEBUG os_vif [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1a:4f:5c,bridge_name='br-int',has_traffic_filtering=True,id=23538fed-fc3c-4080-bbea-55e12668af3b,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23538fed-fc') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.840 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23538fed-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.887 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a31b2041-b777-4968-af2b-d4d0a8f2cf1b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:13 compute-1 nova_compute[238822]: 2025-09-30 18:37:13.893 2 INFO os_vif [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1a:4f:5c,bridge_name='br-int',has_traffic_filtering=True,id=23538fed-fc3c-4080-bbea-55e12668af3b,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23538fed-fc')
Sep 30 18:37:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:37:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:14.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:14 compute-1 nova_compute[238822]: 2025-09-30 18:37:14.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:14 compute-1 sshd-session[292898]: Failed password for invalid user ftpuser1 from 192.210.160.141 port 59946 ssh2
Sep 30 18:37:14 compute-1 nova_compute[238822]: 2025-09-30 18:37:14.422 2 INFO nova.virt.libvirt.driver [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Deleting instance files /var/lib/nova/instances/741d9cb1-7a49-4d89-8b1a-78ae947f2c49_del
Sep 30 18:37:14 compute-1 nova_compute[238822]: 2025-09-30 18:37:14.423 2 INFO nova.virt.libvirt.driver [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Deletion of /var/lib/nova/instances/741d9cb1-7a49-4d89-8b1a-78ae947f2c49_del complete
Sep 30 18:37:14 compute-1 ceph-mon[75484]: pgmap v1760: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 1.1 KiB/s rd, 0 B/s wr, 1 op/s
Sep 30 18:37:14 compute-1 nova_compute[238822]: 2025-09-30 18:37:14.940 2 INFO nova.compute.manager [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Took 1.88 seconds to destroy the instance on the hypervisor.
Sep 30 18:37:14 compute-1 nova_compute[238822]: 2025-09-30 18:37:14.941 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:37:14 compute-1 nova_compute[238822]: 2025-09-30 18:37:14.941 2 DEBUG nova.compute.manager [-] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:37:14 compute-1 nova_compute[238822]: 2025-09-30 18:37:14.941 2 DEBUG nova.network.neutron [-] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:37:14 compute-1 nova_compute[238822]: 2025-09-30 18:37:14.942 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:37:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:15.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:15 compute-1 nova_compute[238822]: 2025-09-30 18:37:15.465 2 DEBUG nova.compute.manager [req-4d024cdd-aec0-4aaf-a4d8-dd4e7a6fb368 req-54ce4d4b-47c0-4788-b7b9-c791651b6551 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Received event network-vif-unplugged-23538fed-fc3c-4080-bbea-55e12668af3b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:37:15 compute-1 nova_compute[238822]: 2025-09-30 18:37:15.465 2 DEBUG oslo_concurrency.lockutils [req-4d024cdd-aec0-4aaf-a4d8-dd4e7a6fb368 req-54ce4d4b-47c0-4788-b7b9-c791651b6551 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "741d9cb1-7a49-4d89-8b1a-78ae947f2c49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:15 compute-1 nova_compute[238822]: 2025-09-30 18:37:15.466 2 DEBUG oslo_concurrency.lockutils [req-4d024cdd-aec0-4aaf-a4d8-dd4e7a6fb368 req-54ce4d4b-47c0-4788-b7b9-c791651b6551 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "741d9cb1-7a49-4d89-8b1a-78ae947f2c49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:15 compute-1 nova_compute[238822]: 2025-09-30 18:37:15.466 2 DEBUG oslo_concurrency.lockutils [req-4d024cdd-aec0-4aaf-a4d8-dd4e7a6fb368 req-54ce4d4b-47c0-4788-b7b9-c791651b6551 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "741d9cb1-7a49-4d89-8b1a-78ae947f2c49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:15 compute-1 nova_compute[238822]: 2025-09-30 18:37:15.466 2 DEBUG nova.compute.manager [req-4d024cdd-aec0-4aaf-a4d8-dd4e7a6fb368 req-54ce4d4b-47c0-4788-b7b9-c791651b6551 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] No waiting events found dispatching network-vif-unplugged-23538fed-fc3c-4080-bbea-55e12668af3b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:37:15 compute-1 nova_compute[238822]: 2025-09-30 18:37:15.467 2 DEBUG nova.compute.manager [req-4d024cdd-aec0-4aaf-a4d8-dd4e7a6fb368 req-54ce4d4b-47c0-4788-b7b9-c791651b6551 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Received event network-vif-unplugged-23538fed-fc3c-4080-bbea-55e12668af3b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:37:15 compute-1 nova_compute[238822]: 2025-09-30 18:37:15.600 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:37:15 compute-1 sshd-session[292898]: Connection closed by invalid user ftpuser1 192.210.160.141 port 59946 [preauth]
Sep 30 18:37:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:16.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:17.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:17 compute-1 ceph-mon[75484]: pgmap v1761: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:37:17 compute-1 nova_compute[238822]: 2025-09-30 18:37:17.579 2 DEBUG nova.compute.manager [req-d7e743af-92f8-45ae-819d-1a17ac9db95a req-741a2999-c088-4294-b7ef-20be8bec0af4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Received event network-vif-deleted-23538fed-fc3c-4080-bbea-55e12668af3b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:37:17 compute-1 nova_compute[238822]: 2025-09-30 18:37:17.579 2 INFO nova.compute.manager [req-d7e743af-92f8-45ae-819d-1a17ac9db95a req-741a2999-c088-4294-b7ef-20be8bec0af4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Neutron deleted interface 23538fed-fc3c-4080-bbea-55e12668af3b; detaching it from the instance and deleting it from the info cache
Sep 30 18:37:17 compute-1 nova_compute[238822]: 2025-09-30 18:37:17.579 2 DEBUG nova.network.neutron [req-d7e743af-92f8-45ae-819d-1a17ac9db95a req-741a2999-c088-4294-b7ef-20be8bec0af4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:37:18 compute-1 nova_compute[238822]: 2025-09-30 18:37:18.009 2 DEBUG nova.network.neutron [-] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:37:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:18 compute-1 nova_compute[238822]: 2025-09-30 18:37:18.089 2 DEBUG nova.compute.manager [req-d7e743af-92f8-45ae-819d-1a17ac9db95a req-741a2999-c088-4294-b7ef-20be8bec0af4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Detach interface failed, port_id=23538fed-fc3c-4080-bbea-55e12668af3b, reason: Instance 741d9cb1-7a49-4d89-8b1a-78ae947f2c49 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:37:18 compute-1 sudo[292953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:37:18 compute-1 sudo[292953]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:37:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:18.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:18 compute-1 sudo[292953]: pam_unix(sudo:session): session closed for user root
Sep 30 18:37:18 compute-1 nova_compute[238822]: 2025-09-30 18:37:18.518 2 INFO nova.compute.manager [-] [instance: 741d9cb1-7a49-4d89-8b1a-78ae947f2c49] Took 3.58 seconds to deallocate network for instance.
Sep 30 18:37:18 compute-1 nova_compute[238822]: 2025-09-30 18:37:18.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:37:19 compute-1 nova_compute[238822]: 2025-09-30 18:37:19.046 2 DEBUG oslo_concurrency.lockutils [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:19 compute-1 nova_compute[238822]: 2025-09-30 18:37:19.047 2 DEBUG oslo_concurrency.lockutils [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:19.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:19 compute-1 nova_compute[238822]: 2025-09-30 18:37:19.125 2 DEBUG oslo_concurrency.processutils [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:37:19 compute-1 nova_compute[238822]: 2025-09-30 18:37:19.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:19 compute-1 ceph-mon[75484]: pgmap v1762: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:37:19 compute-1 openstack_network_exporter[251957]: ERROR   18:37:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:37:19 compute-1 openstack_network_exporter[251957]: ERROR   18:37:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:37:19 compute-1 openstack_network_exporter[251957]: ERROR   18:37:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:37:19 compute-1 openstack_network_exporter[251957]: ERROR   18:37:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:37:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:37:19 compute-1 openstack_network_exporter[251957]: ERROR   18:37:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:37:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:37:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:37:19 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1675240793' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:37:19 compute-1 nova_compute[238822]: 2025-09-30 18:37:19.646 2 DEBUG oslo_concurrency.processutils [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:37:19 compute-1 nova_compute[238822]: 2025-09-30 18:37:19.656 2 DEBUG nova.compute.provider_tree [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:37:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:20.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:20 compute-1 nova_compute[238822]: 2025-09-30 18:37:20.169 2 DEBUG nova.scheduler.client.report [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:37:20 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1675240793' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:37:20 compute-1 nova_compute[238822]: 2025-09-30 18:37:20.683 2 DEBUG oslo_concurrency.lockutils [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.636s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:20 compute-1 nova_compute[238822]: 2025-09-30 18:37:20.726 2 INFO nova.scheduler.client.report [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Deleted allocations for instance 741d9cb1-7a49-4d89-8b1a-78ae947f2c49
Sep 30 18:37:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:21.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:21 compute-1 ceph-mon[75484]: pgmap v1763: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:37:21 compute-1 nova_compute[238822]: 2025-09-30 18:37:21.761 2 DEBUG oslo_concurrency.lockutils [None req-46329b73-d7a3-48be-8702-692566625f77 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "741d9cb1-7a49-4d89-8b1a-78ae947f2c49" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.233s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:22.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:22 compute-1 nova_compute[238822]: 2025-09-30 18:37:22.381 2 DEBUG oslo_concurrency.lockutils [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "342a3981-de33-491a-974b-5566045fba97" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:22 compute-1 nova_compute[238822]: 2025-09-30 18:37:22.382 2 DEBUG oslo_concurrency.lockutils [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "342a3981-de33-491a-974b-5566045fba97" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:22 compute-1 nova_compute[238822]: 2025-09-30 18:37:22.382 2 DEBUG oslo_concurrency.lockutils [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "342a3981-de33-491a-974b-5566045fba97-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:22 compute-1 nova_compute[238822]: 2025-09-30 18:37:22.382 2 DEBUG oslo_concurrency.lockutils [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "342a3981-de33-491a-974b-5566045fba97-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:22 compute-1 nova_compute[238822]: 2025-09-30 18:37:22.383 2 DEBUG oslo_concurrency.lockutils [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "342a3981-de33-491a-974b-5566045fba97-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:22 compute-1 nova_compute[238822]: 2025-09-30 18:37:22.398 2 INFO nova.compute.manager [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Terminating instance
Sep 30 18:37:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:37:22 compute-1 sshd-session[293005]: Invalid user web from 14.225.167.110 port 43434
Sep 30 18:37:22 compute-1 sshd-session[293005]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:37:22 compute-1 sshd-session[293005]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110
Sep 30 18:37:22 compute-1 nova_compute[238822]: 2025-09-30 18:37:22.918 2 DEBUG nova.compute.manager [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:37:22 compute-1 kernel: tapf05039eb-b7 (unregistering): left promiscuous mode
Sep 30 18:37:22 compute-1 NetworkManager[45549]: <info>  [1759257442.9716] device (tapf05039eb-b7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:37:22 compute-1 ovn_controller[135204]: 2025-09-30T18:37:22Z|00235|binding|INFO|Releasing lport f05039eb-b7e1-4072-bc17-63c6787538a1 from this chassis (sb_readonly=0)
Sep 30 18:37:22 compute-1 ovn_controller[135204]: 2025-09-30T18:37:22Z|00236|binding|INFO|Setting lport f05039eb-b7e1-4072-bc17-63c6787538a1 down in Southbound
Sep 30 18:37:22 compute-1 nova_compute[238822]: 2025-09-30 18:37:22.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:22 compute-1 ovn_controller[135204]: 2025-09-30T18:37:22Z|00237|binding|INFO|Removing iface tapf05039eb-b7 ovn-installed in OVS
Sep 30 18:37:22 compute-1 nova_compute[238822]: 2025-09-30 18:37:22.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:22.993 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:d4:b5 10.100.0.7'], port_security=['fa:16:3e:84:d4:b5 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '342a3981-de33-491a-974b-5566045fba97', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6901f664-336b-42d2-bbf7-58951befc8d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c634e1c17ed54907969576a0eb8eff50', 'neutron:revision_number': '14', 'neutron:security_group_ids': '11cf84c1-9641-409c-b5f5-6c5fe3a8afe5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c24d9de-651b-4bf8-842a-1286ab88b11d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=f05039eb-b7e1-4072-bc17-63c6787538a1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:37:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:22.994 144543 INFO neutron.agent.ovn.metadata.agent [-] Port f05039eb-b7e1-4072-bc17-63c6787538a1 in datapath 6901f664-336b-42d2-bbf7-58951befc8d1 unbound from our chassis
Sep 30 18:37:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:22.997 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6901f664-336b-42d2-bbf7-58951befc8d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:37:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:22.998 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[99b2f24d-ae11-4d3c-b631-c853dabb58b0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:22.999 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 namespace which is not needed anymore
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:23 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Sep 30 18:37:23 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001a.scope: Consumed 3.449s CPU time.
Sep 30 18:37:23 compute-1 systemd-machined[195911]: Machine qemu-21-instance-0000001a terminated.
Sep 30 18:37:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:23.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.181 2 INFO nova.virt.libvirt.driver [-] [instance: 342a3981-de33-491a-974b-5566045fba97] Instance destroyed successfully.
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.182 2 DEBUG nova.objects.instance [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lazy-loading 'resources' on Instance uuid 342a3981-de33-491a-974b-5566045fba97 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:37:23 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[292457]: [NOTICE]   (292461) : haproxy version is 3.0.5-8e879a5
Sep 30 18:37:23 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[292457]: [NOTICE]   (292461) : path to executable is /usr/sbin/haproxy
Sep 30 18:37:23 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[292457]: [WARNING]  (292461) : Exiting Master process...
Sep 30 18:37:23 compute-1 podman[293036]: 2025-09-30 18:37:23.227862086 +0000 UTC m=+0.067291399 container kill 1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 18:37:23 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[292457]: [ALERT]    (292461) : Current worker (292463) exited with code 143 (Terminated)
Sep 30 18:37:23 compute-1 neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1[292457]: [WARNING]  (292461) : All workers exited. Exiting... (0)
Sep 30 18:37:23 compute-1 systemd[1]: libpod-1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757.scope: Deactivated successfully.
Sep 30 18:37:23 compute-1 podman[293059]: 2025-09-30 18:37:23.304401493 +0000 UTC m=+0.049958514 container died 1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.311 2 DEBUG nova.compute.manager [req-60cad5a3-0f53-4802-a88e-bfd6b4d70546 req-479ac921-613d-4442-b0ed-adcdb52b77c2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Received event network-vif-unplugged-f05039eb-b7e1-4072-bc17-63c6787538a1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.312 2 DEBUG oslo_concurrency.lockutils [req-60cad5a3-0f53-4802-a88e-bfd6b4d70546 req-479ac921-613d-4442-b0ed-adcdb52b77c2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "342a3981-de33-491a-974b-5566045fba97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.312 2 DEBUG oslo_concurrency.lockutils [req-60cad5a3-0f53-4802-a88e-bfd6b4d70546 req-479ac921-613d-4442-b0ed-adcdb52b77c2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "342a3981-de33-491a-974b-5566045fba97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.312 2 DEBUG oslo_concurrency.lockutils [req-60cad5a3-0f53-4802-a88e-bfd6b4d70546 req-479ac921-613d-4442-b0ed-adcdb52b77c2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "342a3981-de33-491a-974b-5566045fba97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.312 2 DEBUG nova.compute.manager [req-60cad5a3-0f53-4802-a88e-bfd6b4d70546 req-479ac921-613d-4442-b0ed-adcdb52b77c2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] No waiting events found dispatching network-vif-unplugged-f05039eb-b7e1-4072-bc17-63c6787538a1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.312 2 DEBUG nova.compute.manager [req-60cad5a3-0f53-4802-a88e-bfd6b4d70546 req-479ac921-613d-4442-b0ed-adcdb52b77c2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Received event network-vif-unplugged-f05039eb-b7e1-4072-bc17-63c6787538a1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:37:23 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757-userdata-shm.mount: Deactivated successfully.
Sep 30 18:37:23 compute-1 systemd[1]: var-lib-containers-storage-overlay-b0772e2b18e3975d0f1277cc80688bf4685bbe30af3143d69f4936ba656426ca-merged.mount: Deactivated successfully.
Sep 30 18:37:23 compute-1 podman[293059]: 2025-09-30 18:37:23.370541061 +0000 UTC m=+0.116098042 container cleanup 1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:37:23 compute-1 systemd[1]: libpod-conmon-1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757.scope: Deactivated successfully.
Sep 30 18:37:23 compute-1 podman[293066]: 2025-09-30 18:37:23.404440462 +0000 UTC m=+0.130089708 container remove 1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 18:37:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:23.417 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[aa7d6c98-75ec-48eb-bccd-2cba10884ebf]: (4, ("Tue Sep 30 06:37:23 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 (1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757)\n1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757\nTue Sep 30 06:37:23 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 (1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757)\n1b3a5440f60cd68d5984bca3fa0654f5590aa7fc10d5177fb8fd117ee327d757\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:23.420 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3769cd70-f5b7-4a94-92a0-36234f8fa479]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:23.420 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6901f664-336b-42d2-bbf7-58951befc8d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:37:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:23.422 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7c9ff326-2f96-4951-a52b-f24bdc422f71]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:23.423 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6901f664-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:23 compute-1 kernel: tap6901f664-30: left promiscuous mode
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:23 compute-1 ceph-mon[75484]: pgmap v1764: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:37:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:23.464 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6df7209c-5c42-4ad4-a4db-aa1264c0aa51]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:23.500 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[32dde4b4-8a9d-4578-b532-aea486992120]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:23.502 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6a4f1c55-4f74-4933-b143-c54fc61b34c1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:23 compute-1 sshd-session[293041]: Invalid user sol from 45.148.10.240 port 44206
Sep 30 18:37:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:23.535 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ed6cae04-4ac7-41f2-aaf2-33302028bb6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1522877, 'reachable_time': 32245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293098, 'error': None, 'target': 'ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:23 compute-1 systemd[1]: run-netns-ovnmeta\x2d6901f664\x2d336b\x2d42d2\x2dbbf7\x2d58951befc8d1.mount: Deactivated successfully.
Sep 30 18:37:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:23.549 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6901f664-336b-42d2-bbf7-58951befc8d1 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:37:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:23.550 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[9b73a00f-53bd-4cb0-ba37-c0e4049876a2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:23 compute-1 sudo[293093]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:37:23 compute-1 sshd-session[293041]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:37:23 compute-1 sshd-session[293041]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.148.10.240
Sep 30 18:37:23 compute-1 sudo[293093]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:37:23 compute-1 sudo[293093]: pam_unix(sudo:session): session closed for user root
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.690 2 DEBUG nova.virt.libvirt.vif [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:34:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2049754641',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2049754641',id=26,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:35:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c634e1c17ed54907969576a0eb8eff50',ramdisk_id='',reservation_id='r-ryu3h0vp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1883747907',owner_user_name='tempest-TestExecuteStrategies-1883747907-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:37:08Z,user_data=None,user_id='623ef4a55c9e4fc28bb65e49246b5008',uuid=342a3981-de33-491a-974b-5566045fba97,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f05039eb-b7e1-4072-bc17-63c6787538a1", "address": "fa:16:3e:84:d4:b5", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf05039eb-b7", "ovs_interfaceid": "f05039eb-b7e1-4072-bc17-63c6787538a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.691 2 DEBUG nova.network.os_vif_util [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converting VIF {"id": "f05039eb-b7e1-4072-bc17-63c6787538a1", "address": "fa:16:3e:84:d4:b5", "network": {"id": "6901f664-336b-42d2-bbf7-58951befc8d1", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1634944481-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08fc2cbd16474855b7ae474fa9859f76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf05039eb-b7", "ovs_interfaceid": "f05039eb-b7e1-4072-bc17-63c6787538a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.692 2 DEBUG nova.network.os_vif_util [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:84:d4:b5,bridge_name='br-int',has_traffic_filtering=True,id=f05039eb-b7e1-4072-bc17-63c6787538a1,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf05039eb-b7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.692 2 DEBUG os_vif [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:d4:b5,bridge_name='br-int',has_traffic_filtering=True,id=f05039eb-b7e1-4072-bc17-63c6787538a1,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf05039eb-b7') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.695 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf05039eb-b7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:37:23 compute-1 sudo[293119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:37:23 compute-1 sudo[293119]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.745 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=168b4cbe-8585-4f7f-948d-b951417a6be9) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:23 compute-1 nova_compute[238822]: 2025-09-30 18:37:23.750 2 INFO os_vif [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:d4:b5,bridge_name='br-int',has_traffic_filtering=True,id=f05039eb-b7e1-4072-bc17-63c6787538a1,network=Network(6901f664-336b-42d2-bbf7-58951befc8d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf05039eb-b7')
Sep 30 18:37:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:37:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:24.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:24 compute-1 nova_compute[238822]: 2025-09-30 18:37:24.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:24 compute-1 nova_compute[238822]: 2025-09-30 18:37:24.282 2 INFO nova.virt.libvirt.driver [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Deleting instance files /var/lib/nova/instances/342a3981-de33-491a-974b-5566045fba97_del
Sep 30 18:37:24 compute-1 nova_compute[238822]: 2025-09-30 18:37:24.283 2 INFO nova.virt.libvirt.driver [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Deletion of /var/lib/nova/instances/342a3981-de33-491a-974b-5566045fba97_del complete
Sep 30 18:37:24 compute-1 sudo[293119]: pam_unix(sudo:session): session closed for user root
Sep 30 18:37:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:37:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:37:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:37:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:37:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:37:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:37:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:37:24 compute-1 sshd-session[293005]: Failed password for invalid user web from 14.225.167.110 port 43434 ssh2
Sep 30 18:37:24 compute-1 nova_compute[238822]: 2025-09-30 18:37:24.798 2 INFO nova.compute.manager [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Took 1.88 seconds to destroy the instance on the hypervisor.
Sep 30 18:37:24 compute-1 nova_compute[238822]: 2025-09-30 18:37:24.799 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:37:24 compute-1 nova_compute[238822]: 2025-09-30 18:37:24.799 2 DEBUG nova.compute.manager [-] [instance: 342a3981-de33-491a-974b-5566045fba97] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:37:24 compute-1 nova_compute[238822]: 2025-09-30 18:37:24.799 2 DEBUG nova.network.neutron [-] [instance: 342a3981-de33-491a-974b-5566045fba97] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:37:24 compute-1 nova_compute[238822]: 2025-09-30 18:37:24.800 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:37:24 compute-1 nova_compute[238822]: 2025-09-30 18:37:24.920 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:37:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:25.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:25 compute-1 nova_compute[238822]: 2025-09-30 18:37:25.214 2 DEBUG nova.compute.manager [req-761716ee-e58a-4619-be1d-ac88c73c07f2 req-7cdd72d7-55bc-4e46-873e-29c78171ad77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Received event network-vif-deleted-f05039eb-b7e1-4072-bc17-63c6787538a1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:37:25 compute-1 nova_compute[238822]: 2025-09-30 18:37:25.214 2 INFO nova.compute.manager [req-761716ee-e58a-4619-be1d-ac88c73c07f2 req-7cdd72d7-55bc-4e46-873e-29c78171ad77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Neutron deleted interface f05039eb-b7e1-4072-bc17-63c6787538a1; detaching it from the instance and deleting it from the info cache
Sep 30 18:37:25 compute-1 nova_compute[238822]: 2025-09-30 18:37:25.215 2 DEBUG nova.network.neutron [req-761716ee-e58a-4619-be1d-ac88c73c07f2 req-7cdd72d7-55bc-4e46-873e-29c78171ad77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:37:25 compute-1 nova_compute[238822]: 2025-09-30 18:37:25.383 2 DEBUG nova.compute.manager [req-2521c595-f019-48ba-ac3e-1bdaaaea0534 req-8f98af2c-2d17-431a-8d92-4ae5b7adc191 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Received event network-vif-unplugged-f05039eb-b7e1-4072-bc17-63c6787538a1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:37:25 compute-1 nova_compute[238822]: 2025-09-30 18:37:25.383 2 DEBUG oslo_concurrency.lockutils [req-2521c595-f019-48ba-ac3e-1bdaaaea0534 req-8f98af2c-2d17-431a-8d92-4ae5b7adc191 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "342a3981-de33-491a-974b-5566045fba97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:25 compute-1 nova_compute[238822]: 2025-09-30 18:37:25.384 2 DEBUG oslo_concurrency.lockutils [req-2521c595-f019-48ba-ac3e-1bdaaaea0534 req-8f98af2c-2d17-431a-8d92-4ae5b7adc191 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "342a3981-de33-491a-974b-5566045fba97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:25 compute-1 nova_compute[238822]: 2025-09-30 18:37:25.384 2 DEBUG oslo_concurrency.lockutils [req-2521c595-f019-48ba-ac3e-1bdaaaea0534 req-8f98af2c-2d17-431a-8d92-4ae5b7adc191 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "342a3981-de33-491a-974b-5566045fba97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:25 compute-1 nova_compute[238822]: 2025-09-30 18:37:25.384 2 DEBUG nova.compute.manager [req-2521c595-f019-48ba-ac3e-1bdaaaea0534 req-8f98af2c-2d17-431a-8d92-4ae5b7adc191 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] No waiting events found dispatching network-vif-unplugged-f05039eb-b7e1-4072-bc17-63c6787538a1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:37:25 compute-1 nova_compute[238822]: 2025-09-30 18:37:25.384 2 DEBUG nova.compute.manager [req-2521c595-f019-48ba-ac3e-1bdaaaea0534 req-8f98af2c-2d17-431a-8d92-4ae5b7adc191 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Received event network-vif-unplugged-f05039eb-b7e1-4072-bc17-63c6787538a1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:37:25 compute-1 sshd-session[293041]: Failed password for invalid user sol from 45.148.10.240 port 44206 ssh2
Sep 30 18:37:25 compute-1 ceph-mon[75484]: pgmap v1765: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:37:25 compute-1 ceph-mon[75484]: pgmap v1766: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 1.4 KiB/s wr, 33 op/s
Sep 30 18:37:25 compute-1 nova_compute[238822]: 2025-09-30 18:37:25.665 2 DEBUG nova.network.neutron [-] [instance: 342a3981-de33-491a-974b-5566045fba97] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:37:25 compute-1 nova_compute[238822]: 2025-09-30 18:37:25.724 2 DEBUG nova.compute.manager [req-761716ee-e58a-4619-be1d-ac88c73c07f2 req-7cdd72d7-55bc-4e46-873e-29c78171ad77 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 342a3981-de33-491a-974b-5566045fba97] Detach interface failed, port_id=f05039eb-b7e1-4072-bc17-63c6787538a1, reason: Instance 342a3981-de33-491a-974b-5566045fba97 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:37:25 compute-1 sshd-session[293041]: Connection closed by invalid user sol 45.148.10.240 port 44206 [preauth]
Sep 30 18:37:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #97. Immutable memtables: 0.
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.094561) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 59] Flushing memtable with next log file: 97
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257446094610, "job": 59, "event": "flush_started", "num_memtables": 1, "num_entries": 783, "num_deletes": 259, "total_data_size": 1303471, "memory_usage": 1330336, "flush_reason": "Manual Compaction"}
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 59] Level-0 flush table #98: started
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257446103875, "cf_name": "default", "job": 59, "event": "table_file_creation", "file_number": 98, "file_size": 852435, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 48698, "largest_seqno": 49476, "table_properties": {"data_size": 848866, "index_size": 1349, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8476, "raw_average_key_size": 18, "raw_value_size": 841465, "raw_average_value_size": 1882, "num_data_blocks": 60, "num_entries": 447, "num_filter_entries": 447, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759257397, "oldest_key_time": 1759257397, "file_creation_time": 1759257446, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 98, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 59] Flush lasted 9370 microseconds, and 5810 cpu microseconds.
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.103931) [db/flush_job.cc:967] [default] [JOB 59] Level-0 flush table #98: 852435 bytes OK
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.103960) [db/memtable_list.cc:519] [default] Level-0 commit table #98 started
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.105899) [db/memtable_list.cc:722] [default] Level-0 commit table #98: memtable #1 done
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.105921) EVENT_LOG_v1 {"time_micros": 1759257446105914, "job": 59, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.105945) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 59] Try to delete WAL files size 1299303, prev total WAL file size 1299303, number of live WAL files 2.
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000094.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.106993) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031353032' seq:72057594037927935, type:22 .. '6C6F676D0031373537' seq:0, type:0; will stop at (end)
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 60] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 59 Base level 0, inputs: [98(832KB)], [96(10MB)]
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257446107065, "job": 60, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [98], "files_L6": [96], "score": -1, "input_data_size": 11944794, "oldest_snapshot_seqno": -1}
Sep 30 18:37:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:26.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:26 compute-1 nova_compute[238822]: 2025-09-30 18:37:26.172 2 INFO nova.compute.manager [-] [instance: 342a3981-de33-491a-974b-5566045fba97] Took 1.37 seconds to deallocate network for instance.
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 60] Generated table #99: 6867 keys, 11834215 bytes, temperature: kUnknown
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257446186406, "cf_name": "default", "job": 60, "event": "table_file_creation", "file_number": 99, "file_size": 11834215, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11792038, "index_size": 23889, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17221, "raw_key_size": 181244, "raw_average_key_size": 26, "raw_value_size": 11672343, "raw_average_value_size": 1699, "num_data_blocks": 935, "num_entries": 6867, "num_filter_entries": 6867, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759257446, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 99, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.186811) [db/compaction/compaction_job.cc:1663] [default] [JOB 60] Compacted 1@0 + 1@6 files to L6 => 11834215 bytes
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.188604) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 150.4 rd, 149.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 10.6 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(27.9) write-amplify(13.9) OK, records in: 7397, records dropped: 530 output_compression: NoCompression
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.188663) EVENT_LOG_v1 {"time_micros": 1759257446188648, "job": 60, "event": "compaction_finished", "compaction_time_micros": 79442, "compaction_time_cpu_micros": 38138, "output_level": 6, "num_output_files": 1, "total_output_size": 11834215, "num_input_records": 7397, "num_output_records": 6867, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000098.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257446189112, "job": 60, "event": "table_file_deletion", "file_number": 98}
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000096.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257446193133, "job": 60, "event": "table_file_deletion", "file_number": 96}
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.106852) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.193699) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.193705) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.193708) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.193711) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:37:26 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:26.193713) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:37:26 compute-1 sshd-session[293005]: Received disconnect from 14.225.167.110 port 43434:11: Bye Bye [preauth]
Sep 30 18:37:26 compute-1 sshd-session[293005]: Disconnected from invalid user web 14.225.167.110 port 43434 [preauth]
Sep 30 18:37:26 compute-1 podman[293198]: 2025-09-30 18:37:26.570263506 +0000 UTC m=+0.095573060 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:37:26 compute-1 podman[293197]: 2025-09-30 18:37:26.625344655 +0000 UTC m=+0.155551290 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:37:26 compute-1 nova_compute[238822]: 2025-09-30 18:37:26.699 2 DEBUG oslo_concurrency.lockutils [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:26 compute-1 nova_compute[238822]: 2025-09-30 18:37:26.700 2 DEBUG oslo_concurrency.lockutils [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:26 compute-1 nova_compute[238822]: 2025-09-30 18:37:26.706 2 DEBUG oslo_concurrency.lockutils [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:26 compute-1 nova_compute[238822]: 2025-09-30 18:37:26.760 2 INFO nova.scheduler.client.report [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Deleted allocations for instance 342a3981-de33-491a-974b-5566045fba97
Sep 30 18:37:26 compute-1 sshd-session[292951]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:37:26 compute-1 sshd-session[292951]: banner exchange: Connection from 110.42.70.108 port 49182: Connection timed out
Sep 30 18:37:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:27 compute-1 ceph-mon[75484]: pgmap v1767: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 1.4 KiB/s wr, 33 op/s
Sep 30 18:37:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:27.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:27 compute-1 nova_compute[238822]: 2025-09-30 18:37:27.795 2 DEBUG oslo_concurrency.lockutils [None req-a1676216-c2e0-4b74-bc85-defd5dbf6455 623ef4a55c9e4fc28bb65e49246b5008 c634e1c17ed54907969576a0eb8eff50 - - default default] Lock "342a3981-de33-491a-974b-5566045fba97" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.413s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:28.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #100. Immutable memtables: 0.
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.452721) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 61] Flushing memtable with next log file: 100
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257448453003, "job": 61, "event": "flush_started", "num_memtables": 1, "num_entries": 281, "num_deletes": 251, "total_data_size": 117649, "memory_usage": 123624, "flush_reason": "Manual Compaction"}
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 61] Level-0 flush table #101: started
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257448456034, "cf_name": "default", "job": 61, "event": "table_file_creation", "file_number": 101, "file_size": 76971, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49481, "largest_seqno": 49757, "table_properties": {"data_size": 75043, "index_size": 156, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4859, "raw_average_key_size": 18, "raw_value_size": 71360, "raw_average_value_size": 268, "num_data_blocks": 7, "num_entries": 266, "num_filter_entries": 266, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759257447, "oldest_key_time": 1759257447, "file_creation_time": 1759257448, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 101, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 61] Flush lasted 3511 microseconds, and 1294 cpu microseconds.
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.456243) [db/flush_job.cc:967] [default] [JOB 61] Level-0 flush table #101: 76971 bytes OK
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.456328) [db/memtable_list.cc:519] [default] Level-0 commit table #101 started
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.457939) [db/memtable_list.cc:722] [default] Level-0 commit table #101: memtable #1 done
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.457964) EVENT_LOG_v1 {"time_micros": 1759257448457956, "job": 61, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.457987) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 61] Try to delete WAL files size 115527, prev total WAL file size 115527, number of live WAL files 2.
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000097.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.458946) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034303136' seq:72057594037927935, type:22 .. '7061786F730034323638' seq:0, type:0; will stop at (end)
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 62] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 61 Base level 0, inputs: [101(75KB)], [99(11MB)]
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257448459021, "job": 62, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [101], "files_L6": [99], "score": -1, "input_data_size": 11911186, "oldest_snapshot_seqno": -1}
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 62] Generated table #102: 6624 keys, 9955948 bytes, temperature: kUnknown
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257448523478, "cf_name": "default", "job": 62, "event": "table_file_creation", "file_number": 102, "file_size": 9955948, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9916861, "index_size": 21439, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16581, "raw_key_size": 176884, "raw_average_key_size": 26, "raw_value_size": 9802783, "raw_average_value_size": 1479, "num_data_blocks": 825, "num_entries": 6624, "num_filter_entries": 6624, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759257448, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 102, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.523865) [db/compaction/compaction_job.cc:1663] [default] [JOB 62] Compacted 1@0 + 1@6 files to L6 => 9955948 bytes
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.525601) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 184.5 rd, 154.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 11.3 +0.0 blob) out(9.5 +0.0 blob), read-write-amplify(284.1) write-amplify(129.3) OK, records in: 7133, records dropped: 509 output_compression: NoCompression
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.525711) EVENT_LOG_v1 {"time_micros": 1759257448525691, "job": 62, "event": "compaction_finished", "compaction_time_micros": 64556, "compaction_time_cpu_micros": 30627, "output_level": 6, "num_output_files": 1, "total_output_size": 9955948, "num_input_records": 7133, "num_output_records": 6624, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000101.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257448525977, "job": 62, "event": "table_file_deletion", "file_number": 101}
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000099.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257448531218, "job": 62, "event": "table_file_deletion", "file_number": 99}
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.458846) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.531383) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.531394) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.531396) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.531398) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:37:28 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:37:28.531400) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:37:28 compute-1 nova_compute[238822]: 2025-09-30 18:37:28.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:29 compute-1 sudo[293251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:37:29 compute-1 sudo[293251]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:37:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:29 compute-1 sudo[293251]: pam_unix(sudo:session): session closed for user root
Sep 30 18:37:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:37:29 compute-1 nova_compute[238822]: 2025-09-30 18:37:29.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:29.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:29 compute-1 ceph-mon[75484]: pgmap v1768: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 1.4 KiB/s wr, 33 op/s
Sep 30 18:37:29 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:37:29 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:37:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:29.901 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:37:29 compute-1 nova_compute[238822]: 2025-09-30 18:37:29.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:29.903 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:37:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:30.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:31.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:31 compute-1 ceph-mon[75484]: pgmap v1769: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 1.4 KiB/s wr, 33 op/s
Sep 30 18:37:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:32.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:32 compute-1 nova_compute[238822]: 2025-09-30 18:37:32.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:33.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:33 compute-1 ceph-mon[75484]: pgmap v1770: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 1.4 KiB/s wr, 33 op/s
Sep 30 18:37:33 compute-1 podman[293281]: 2025-09-30 18:37:33.569009919 +0000 UTC m=+0.098975581 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 18:37:33 compute-1 nova_compute[238822]: 2025-09-30 18:37:33.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:33 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:33.905 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:37:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:37:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:34.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:34 compute-1 nova_compute[238822]: 2025-09-30 18:37:34.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:35.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:35 compute-1 ceph-mon[75484]: pgmap v1771: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 1.4 KiB/s wr, 33 op/s
Sep 30 18:37:35 compute-1 podman[249638]: time="2025-09-30T18:37:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:37:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:37:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:37:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:37:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8350 "" "Go-http-client/1.1"
Sep 30 18:37:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:36.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2228795638' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:37:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2228795638' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:37:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:37.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:37 compute-1 ceph-mon[75484]: pgmap v1772: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:37:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:37:37 compute-1 sshd-session[293303]: Invalid user steam from 80.94.95.115 port 30644
Sep 30 18:37:37 compute-1 sshd-session[293303]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:37:37 compute-1 sshd-session[293303]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.115
Sep 30 18:37:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:37:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:38.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:37:38 compute-1 sudo[293307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:37:38 compute-1 sudo[293307]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:37:38 compute-1 sudo[293307]: pam_unix(sudo:session): session closed for user root
Sep 30 18:37:38 compute-1 nova_compute[238822]: 2025-09-30 18:37:38.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:37:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:39.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:39 compute-1 nova_compute[238822]: 2025-09-30 18:37:39.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:39 compute-1 ceph-mon[75484]: pgmap v1773: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:37:39 compute-1 podman[293336]: 2025-09-30 18:37:39.554674217 +0000 UTC m=+0.093453943 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:37:39 compute-1 podman[293338]: 2025-09-30 18:37:39.568779196 +0000 UTC m=+0.092894588 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 18:37:39 compute-1 podman[293337]: 2025-09-30 18:37:39.575494926 +0000 UTC m=+0.106449882 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vendor=Red Hat, Inc., release=1755695350, architecture=x86_64, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container)
Sep 30 18:37:39 compute-1 sshd-session[293303]: Failed password for invalid user steam from 80.94.95.115 port 30644 ssh2
Sep 30 18:37:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:40.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:40 compute-1 sshd-session[293303]: Connection closed by invalid user steam 80.94.95.115 port 30644 [preauth]
Sep 30 18:37:40 compute-1 unix_chkpwd[293395]: password check failed for user (root)
Sep 30 18:37:40 compute-1 sshd-session[293333]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:37:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:41.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:41 compute-1 ceph-mon[75484]: pgmap v1774: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:37:41 compute-1 sshd-session[293333]: Failed password for root from 192.210.160.141 port 45716 ssh2
Sep 30 18:37:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:42.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:43.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:43 compute-1 ceph-mon[75484]: pgmap v1775: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:37:43 compute-1 sshd-session[293333]: Connection closed by authenticating user root 192.210.160.141 port 45716 [preauth]
Sep 30 18:37:43 compute-1 nova_compute[238822]: 2025-09-30 18:37:43.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:37:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:44.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:44 compute-1 nova_compute[238822]: 2025-09-30 18:37:44.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:44.965 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:e3:e0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9692f6197b3545b1bf37bd84c3928d41', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aef43a77-fc58-48dd-8195-5e83e09646ef, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=475706b8-809c-4da9-92ac-7152f6d17fbe) old=Port_Binding(mac=['fa:16:3e:69:e3:e0'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9692f6197b3545b1bf37bd84c3928d41', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:37:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:44.966 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 475706b8-809c-4da9-92ac-7152f6d17fbe in datapath e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4 updated
Sep 30 18:37:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:44.967 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:37:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:44.968 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2e7a6277-45ce-4f4b-bd49-cd7d806bc3bd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:45 compute-1 nova_compute[238822]: 2025-09-30 18:37:45.054 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:37:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:45.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:45 compute-1 ceph-mon[75484]: pgmap v1776: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:37:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:46 compute-1 nova_compute[238822]: 2025-09-30 18:37:46.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:37:46 compute-1 nova_compute[238822]: 2025-09-30 18:37:46.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:37:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:46.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:47 compute-1 nova_compute[238822]: 2025-09-30 18:37:47.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:37:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:47.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:47 compute-1 nova_compute[238822]: 2025-09-30 18:37:47.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:47 compute-1 nova_compute[238822]: 2025-09-30 18:37:47.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:47 compute-1 nova_compute[238822]: 2025-09-30 18:37:47.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:47 compute-1 nova_compute[238822]: 2025-09-30 18:37:47.575 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:37:47 compute-1 nova_compute[238822]: 2025-09-30 18:37:47.575 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:37:47 compute-1 ceph-mon[75484]: pgmap v1777: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:37:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:37:48 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3838396643' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:37:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:48 compute-1 nova_compute[238822]: 2025-09-30 18:37:48.041 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:37:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:37:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:48.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:37:48 compute-1 nova_compute[238822]: 2025-09-30 18:37:48.282 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:37:48 compute-1 nova_compute[238822]: 2025-09-30 18:37:48.284 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:37:48 compute-1 nova_compute[238822]: 2025-09-30 18:37:48.317 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:37:48 compute-1 nova_compute[238822]: 2025-09-30 18:37:48.318 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4752MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:37:48 compute-1 nova_compute[238822]: 2025-09-30 18:37:48.319 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:48 compute-1 nova_compute[238822]: 2025-09-30 18:37:48.319 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3838396643' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:37:48 compute-1 nova_compute[238822]: 2025-09-30 18:37:48.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:37:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:49.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:49 compute-1 nova_compute[238822]: 2025-09-30 18:37:49.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:49 compute-1 nova_compute[238822]: 2025-09-30 18:37:49.379 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:37:49 compute-1 nova_compute[238822]: 2025-09-30 18:37:49.380 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:37:48 up  4:15,  0 user,  load average: 0.38, 0.40, 0.55\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:37:49 compute-1 nova_compute[238822]: 2025-09-30 18:37:49.407 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:37:49 compute-1 openstack_network_exporter[251957]: ERROR   18:37:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:37:49 compute-1 openstack_network_exporter[251957]: ERROR   18:37:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:37:49 compute-1 openstack_network_exporter[251957]: ERROR   18:37:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:37:49 compute-1 openstack_network_exporter[251957]: ERROR   18:37:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:37:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:37:49 compute-1 openstack_network_exporter[251957]: ERROR   18:37:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:37:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:37:49 compute-1 ceph-mon[75484]: pgmap v1778: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:37:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:37:49 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/245067866' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:37:49 compute-1 nova_compute[238822]: 2025-09-30 18:37:49.898 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:37:49 compute-1 nova_compute[238822]: 2025-09-30 18:37:49.907 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:37:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:50.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:50 compute-1 nova_compute[238822]: 2025-09-30 18:37:50.417 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:37:50 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/245067866' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:37:50 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2736378507' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:37:50 compute-1 ceph-mon[75484]: pgmap v1779: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:37:50 compute-1 nova_compute[238822]: 2025-09-30 18:37:50.933 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:37:50 compute-1 nova_compute[238822]: 2025-09-30 18:37:50.934 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.615s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:51.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2276774194' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:37:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:52.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:52 compute-1 nova_compute[238822]: 2025-09-30 18:37:52.936 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:37:52 compute-1 nova_compute[238822]: 2025-09-30 18:37:52.937 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:37:52 compute-1 nova_compute[238822]: 2025-09-30 18:37:52.937 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:37:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:53 compute-1 nova_compute[238822]: 2025-09-30 18:37:53.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:37:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:37:53 compute-1 ceph-mon[75484]: pgmap v1780: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:37:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:53.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:53 compute-1 nova_compute[238822]: 2025-09-30 18:37:53.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:37:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:54.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:54 compute-1 nova_compute[238822]: 2025-09-30 18:37:54.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:54.233 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:a5:37 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b9e368f8-9637-474a-a0f3-2785ed8b6bea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9e368f8-9637-474a-a0f3-2785ed8b6bea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '003b1a96324d40b683381237c3cec243', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=340490f5-0a9d-48f0-992c-0f05c72a9a6c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3de11ef7-a7bb-4299-b982-f045dbd9b956) old=Port_Binding(mac=['fa:16:3e:db:a5:37'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b9e368f8-9637-474a-a0f3-2785ed8b6bea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9e368f8-9637-474a-a0f3-2785ed8b6bea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '003b1a96324d40b683381237c3cec243', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:37:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:54.234 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3de11ef7-a7bb-4299-b982-f045dbd9b956 in datapath b9e368f8-9637-474a-a0f3-2785ed8b6bea updated
Sep 30 18:37:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:54.235 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9e368f8-9637-474a-a0f3-2785ed8b6bea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:37:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:54.236 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e6349e-424c-4b58-924e-0e8342620176]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:37:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:54.405 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:37:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:54.406 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:37:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:37:54.406 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:37:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:55.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:55 compute-1 ceph-mon[75484]: pgmap v1781: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:37:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:56.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:57.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:57 compute-1 podman[293463]: 2025-09-30 18:37:57.54678335 +0000 UTC m=+0.078738567 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:37:57 compute-1 ceph-mon[75484]: pgmap v1782: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:37:57 compute-1 podman[293462]: 2025-09-30 18:37:57.588797139 +0000 UTC m=+0.131413533 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_controller)
Sep 30 18:37:57 compute-1 sshd-session[293460]: Invalid user admin from 8.243.64.201 port 34864
Sep 30 18:37:57 compute-1 sshd-session[293460]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:37:57 compute-1 sshd-session[293460]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:37:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:37:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:37:58.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:37:58 compute-1 sudo[293512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:37:58 compute-1 sudo[293512]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:37:58 compute-1 sudo[293512]: pam_unix(sudo:session): session closed for user root
Sep 30 18:37:58 compute-1 nova_compute[238822]: 2025-09-30 18:37:58.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:37:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:37:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:37:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:37:59 compute-1 nova_compute[238822]: 2025-09-30 18:37:59.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:37:59 compute-1 nova_compute[238822]: 2025-09-30 18:37:59.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:37:59 compute-1 nova_compute[238822]: 2025-09-30 18:37:59.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:37:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:37:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:37:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:37:59.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:37:59 compute-1 ceph-mon[75484]: pgmap v1783: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:37:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:37:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 3600.0 total, 600.0 interval
                                           Cumulative writes: 9784 writes, 50K keys, 9784 commit groups, 1.0 writes per commit group, ingest: 0.11 GB, 0.03 MB/s
                                           Cumulative WAL: 9784 writes, 9784 syncs, 1.00 writes per sync, written: 0.11 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1514 writes, 8083 keys, 1514 commit groups, 1.0 writes per commit group, ingest: 16.05 MB, 0.03 MB/s
                                           Interval WAL: 1514 writes, 1514 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0    130.9      0.50              0.26        31    0.016       0      0       0.0       0.0
                                             L6      1/0    9.49 MB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   5.2    174.7    150.0      2.25              1.20        30    0.075    181K    16K       0.0       0.0
                                            Sum      1/0    9.49 MB   0.0      0.4     0.1      0.3       0.4      0.1       0.0   6.2    143.0    146.5      2.74              1.46        61    0.045    181K    16K       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   9.8    142.9    137.9      0.74              0.42        16    0.046     58K   4111       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   0.0    174.7    150.0      2.25              1.20        30    0.075    181K    16K       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0    131.5      0.50              0.26        30    0.017       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 3600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.064, interval 0.010
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.39 GB write, 0.11 MB/s write, 0.38 GB read, 0.11 MB/s read, 2.7 seconds
                                           Interval compaction: 0.10 GB write, 0.17 MB/s write, 0.10 GB read, 0.18 MB/s read, 0.7 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55f2aa20b350#2 capacity: 304.00 MB usage: 37.40 MB table_size: 0 occupancy: 18446744073709551615 collections: 7 last_copies: 0 last_secs: 0.000502 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(2092,36.12 MB,11.8814%) FilterBlock(61,508.73 KB,0.163425%) IndexBlock(61,798.17 KB,0.256403%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Sep 30 18:37:59 compute-1 sshd-session[293460]: Failed password for invalid user admin from 8.243.64.201 port 34864 ssh2
Sep 30 18:38:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:00.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:00 compute-1 sshd-session[293460]: Received disconnect from 8.243.64.201 port 34864:11: Bye Bye [preauth]
Sep 30 18:38:00 compute-1 sshd-session[293460]: Disconnected from invalid user admin 8.243.64.201 port 34864 [preauth]
Sep 30 18:38:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:01.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:01 compute-1 ceph-mon[75484]: pgmap v1784: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:38:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:02.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:38:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:03.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:38:03 compute-1 ceph-mon[75484]: pgmap v1785: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:38:03 compute-1 nova_compute[238822]: 2025-09-30 18:38:03.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:38:04 compute-1 nova_compute[238822]: 2025-09-30 18:38:04.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:04.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:04 compute-1 podman[293544]: 2025-09-30 18:38:04.535129945 +0000 UTC m=+0.074973336 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Sep 30 18:38:04 compute-1 ceph-mon[75484]: pgmap v1786: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:38:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:05.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:05 compute-1 sshd-session[293564]: Invalid user test from 167.172.43.167 port 59604
Sep 30 18:38:05 compute-1 sshd-session[293564]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:38:05 compute-1 sshd-session[293564]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167
Sep 30 18:38:05 compute-1 podman[249638]: time="2025-09-30T18:38:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:38:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:38:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:38:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:38:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8359 "" "Go-http-client/1.1"
Sep 30 18:38:05 compute-1 unix_chkpwd[293567]: password check failed for user (root)
Sep 30 18:38:05 compute-1 sshd-session[293543]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:38:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3689806097' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:06.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:06 compute-1 ovn_controller[135204]: 2025-09-30T18:38:06Z|00238|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Sep 30 18:38:06 compute-1 ceph-mon[75484]: pgmap v1787: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:38:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:07.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Sep 30 18:38:07 compute-1 sshd-session[293564]: Failed password for invalid user test from 167.172.43.167 port 59604 ssh2
Sep 30 18:38:07 compute-1 sshd-session[293564]: Received disconnect from 167.172.43.167 port 59604:11: Bye Bye [preauth]
Sep 30 18:38:07 compute-1 sshd-session[293564]: Disconnected from invalid user test 167.172.43.167 port 59604 [preauth]
Sep 30 18:38:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:38:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:08 compute-1 sshd-session[293543]: Failed password for root from 192.210.160.141 port 42888 ssh2
Sep 30 18:38:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:08.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:08.627 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:38:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:08.628 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:38:08 compute-1 nova_compute[238822]: 2025-09-30 18:38:08.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:08 compute-1 nova_compute[238822]: 2025-09-30 18:38:08.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:08 compute-1 sshd-session[293543]: Connection closed by authenticating user root 192.210.160.141 port 42888 [preauth]
Sep 30 18:38:09 compute-1 ceph-mon[75484]: pgmap v1788: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:38:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:38:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:38:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:09.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:38:09 compute-1 nova_compute[238822]: 2025-09-30 18:38:09.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:10.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:10 compute-1 podman[293574]: 2025-09-30 18:38:10.560575021 +0000 UTC m=+0.102466055 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 18:38:10 compute-1 podman[293575]: 2025-09-30 18:38:10.594578474 +0000 UTC m=+0.129703486 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 18:38:10 compute-1 podman[293576]: 2025-09-30 18:38:10.626754119 +0000 UTC m=+0.156659541 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 18:38:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:11.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:11 compute-1 ceph-mon[75484]: pgmap v1789: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:38:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:12.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:12 compute-1 ceph-mon[75484]: pgmap v1790: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:38:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:13.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:13 compute-1 nova_compute[238822]: 2025-09-30 18:38:13.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:38:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:14.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:14 compute-1 nova_compute[238822]: 2025-09-30 18:38:14.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:14 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3203473178' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:38:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:15.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:15 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:15.632 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:38:15 compute-1 ceph-mon[75484]: pgmap v1791: 353 pgs: 353 active+clean; 88 MiB data, 350 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:38:15 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/288309622' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:38:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:16.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:16 compute-1 ceph-mon[75484]: pgmap v1792: 353 pgs: 353 active+clean; 88 MiB data, 350 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:38:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:17.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:18.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:18 compute-1 sudo[293640]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:38:18 compute-1 sudo[293640]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:38:18 compute-1 sudo[293640]: pam_unix(sudo:session): session closed for user root
Sep 30 18:38:18 compute-1 nova_compute[238822]: 2025-09-30 18:38:18.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:38:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:19.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:19 compute-1 nova_compute[238822]: 2025-09-30 18:38:19.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:19 compute-1 openstack_network_exporter[251957]: ERROR   18:38:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:38:19 compute-1 openstack_network_exporter[251957]: ERROR   18:38:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:38:19 compute-1 openstack_network_exporter[251957]: ERROR   18:38:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:38:19 compute-1 openstack_network_exporter[251957]: ERROR   18:38:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:38:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:38:19 compute-1 openstack_network_exporter[251957]: ERROR   18:38:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:38:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:38:19 compute-1 ceph-mon[75484]: pgmap v1793: 353 pgs: 353 active+clean; 88 MiB data, 350 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:38:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:20.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:20 compute-1 ceph-mon[75484]: pgmap v1794: 353 pgs: 353 active+clean; 88 MiB data, 350 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Sep 30 18:38:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:21.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:21 compute-1 sshd-session[293668]: Invalid user test from 103.153.190.105 port 33078
Sep 30 18:38:21 compute-1 sshd-session[293668]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:38:21 compute-1 sshd-session[293668]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:38:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:22.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:38:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 3600.1 total, 600.0 interval
                                           Cumulative writes: 18K writes, 73K keys, 18K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.02 MB/s
                                           Cumulative WAL: 18K writes, 5879 syncs, 3.22 writes per sync, written: 0.07 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2910 writes, 11K keys, 2910 commit groups, 1.0 writes per commit group, ingest: 12.65 MB, 0.02 MB/s
                                           Interval WAL: 2910 writes, 1118 syncs, 2.60 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Sep 30 18:38:22 compute-1 ceph-osd[78006]: bluestore.MempoolThread fragmentation_score=0.000897 took=0.000048s
Sep 30 18:38:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:23.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:23 compute-1 ceph-mon[75484]: pgmap v1795: 353 pgs: 353 active+clean; 88 MiB data, 350 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Sep 30 18:38:23 compute-1 sshd-session[293668]: Failed password for invalid user test from 103.153.190.105 port 33078 ssh2
Sep 30 18:38:23 compute-1 nova_compute[238822]: 2025-09-30 18:38:23.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:38:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:24.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:24 compute-1 nova_compute[238822]: 2025-09-30 18:38:24.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:24 compute-1 sshd-session[293668]: Received disconnect from 103.153.190.105 port 33078:11: Bye Bye [preauth]
Sep 30 18:38:24 compute-1 sshd-session[293668]: Disconnected from invalid user test 103.153.190.105 port 33078 [preauth]
Sep 30 18:38:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:25.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:25 compute-1 ceph-mon[75484]: pgmap v1796: 353 pgs: 353 active+clean; 88 MiB data, 351 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Sep 30 18:38:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000053s ======
Sep 30 18:38:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:26.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Sep 30 18:38:26 compute-1 nova_compute[238822]: 2025-09-30 18:38:26.240 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Acquiring lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:38:26 compute-1 nova_compute[238822]: 2025-09-30 18:38:26.241 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:38:26 compute-1 nova_compute[238822]: 2025-09-30 18:38:26.749 2 DEBUG nova.compute.manager [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:38:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:27.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:27 compute-1 nova_compute[238822]: 2025-09-30 18:38:27.302 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:38:27 compute-1 nova_compute[238822]: 2025-09-30 18:38:27.303 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:38:27 compute-1 nova_compute[238822]: 2025-09-30 18:38:27.312 2 DEBUG nova.virt.hardware [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:38:27 compute-1 nova_compute[238822]: 2025-09-30 18:38:27.313 2 INFO nova.compute.claims [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:38:27 compute-1 ceph-mon[75484]: pgmap v1797: 353 pgs: 353 active+clean; 88 MiB data, 351 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:38:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:28.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:28 compute-1 nova_compute[238822]: 2025-09-30 18:38:28.365 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:38:28 compute-1 podman[293680]: 2025-09-30 18:38:28.557423883 +0000 UTC m=+0.085309534 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:38:28 compute-1 podman[293679]: 2025-09-30 18:38:28.604847257 +0000 UTC m=+0.144354620 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 18:38:28 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:38:28 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1467164217' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:38:28 compute-1 nova_compute[238822]: 2025-09-30 18:38:28.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:28 compute-1 nova_compute[238822]: 2025-09-30 18:38:28.896 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.531s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:38:28 compute-1 nova_compute[238822]: 2025-09-30 18:38:28.905 2 DEBUG nova.compute.provider_tree [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:38:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:38:29 compute-1 sudo[293749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:38:29 compute-1 sudo[293749]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:38:29 compute-1 sudo[293749]: pam_unix(sudo:session): session closed for user root
Sep 30 18:38:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:38:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:29.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:38:29 compute-1 nova_compute[238822]: 2025-09-30 18:38:29.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:29 compute-1 sudo[293774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:38:29 compute-1 sudo[293774]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:38:29 compute-1 nova_compute[238822]: 2025-09-30 18:38:29.417 2 DEBUG nova.scheduler.client.report [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:38:29 compute-1 ceph-mon[75484]: pgmap v1798: 353 pgs: 353 active+clean; 88 MiB data, 351 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:38:29 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1467164217' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:38:29 compute-1 sudo[293774]: pam_unix(sudo:session): session closed for user root
Sep 30 18:38:29 compute-1 nova_compute[238822]: 2025-09-30 18:38:29.935 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.632s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:38:29 compute-1 nova_compute[238822]: 2025-09-30 18:38:29.936 2 DEBUG nova.compute.manager [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:38:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:38:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:30.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:38:30 compute-1 nova_compute[238822]: 2025-09-30 18:38:30.449 2 DEBUG nova.compute.manager [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:38:30 compute-1 nova_compute[238822]: 2025-09-30 18:38:30.450 2 DEBUG nova.network.neutron [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:38:30 compute-1 nova_compute[238822]: 2025-09-30 18:38:30.450 2 WARNING neutronclient.v2_0.client [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:38:30 compute-1 nova_compute[238822]: 2025-09-30 18:38:30.451 2 WARNING neutronclient.v2_0.client [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:38:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:38:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:38:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:38:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:38:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:38:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:38:30 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:38:30 compute-1 nova_compute[238822]: 2025-09-30 18:38:30.961 2 INFO nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:38:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:31.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:31 compute-1 nova_compute[238822]: 2025-09-30 18:38:31.473 2 DEBUG nova.compute.manager [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:38:31 compute-1 ceph-mon[75484]: pgmap v1799: 353 pgs: 353 active+clean; 88 MiB data, 351 MiB used, 40 GiB / 40 GiB avail; 2.0 MiB/s rd, 14 KiB/s wr, 77 op/s
Sep 30 18:38:31 compute-1 nova_compute[238822]: 2025-09-30 18:38:31.706 2 DEBUG nova.network.neutron [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Successfully created port: df28957b-3291-47e1-8119-50e815a35337 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:38:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:32 compute-1 unix_chkpwd[293837]: password check failed for user (root)
Sep 30 18:38:32 compute-1 sshd-session[293832]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:38:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:32.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:32 compute-1 unix_chkpwd[293839]: password check failed for user (root)
Sep 30 18:38:32 compute-1 sshd-session[293835]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.225.167.110  user=root
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.499 2 DEBUG nova.compute.manager [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.501 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.502 2 INFO nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Creating image(s)
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.547 2 DEBUG nova.storage.rbd_utils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] rbd image 1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.589 2 DEBUG nova.storage.rbd_utils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] rbd image 1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.635 2 DEBUG nova.storage.rbd_utils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] rbd image 1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.642 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.741 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.745 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.747 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.747 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.787 2 DEBUG nova.storage.rbd_utils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] rbd image 1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.793 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:38:32 compute-1 nova_compute[238822]: 2025-09-30 18:38:32.941 2 DEBUG nova.network.neutron [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Successfully updated port: df28957b-3291-47e1-8119-50e815a35337 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.018 2 DEBUG nova.compute.manager [req-3c8920f5-a934-4385-872b-ab56c29f7ee2 req-34bac09a-32c4-4572-82d3-afb20322f5a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Received event network-changed-df28957b-3291-47e1-8119-50e815a35337 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.020 2 DEBUG nova.compute.manager [req-3c8920f5-a934-4385-872b-ab56c29f7ee2 req-34bac09a-32c4-4572-82d3-afb20322f5a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Refreshing instance network info cache due to event network-changed-df28957b-3291-47e1-8119-50e815a35337. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.020 2 DEBUG oslo_concurrency.lockutils [req-3c8920f5-a934-4385-872b-ab56c29f7ee2 req-34bac09a-32c4-4572-82d3-afb20322f5a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-1a3876d4-5b93-40b5-96f1-ba2c2a27428b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.021 2 DEBUG oslo_concurrency.lockutils [req-3c8920f5-a934-4385-872b-ab56c29f7ee2 req-34bac09a-32c4-4572-82d3-afb20322f5a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-1a3876d4-5b93-40b5-96f1-ba2c2a27428b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.021 2 DEBUG nova.network.neutron [req-3c8920f5-a934-4385-872b-ab56c29f7ee2 req-34bac09a-32c4-4572-82d3-afb20322f5a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Refreshing network info cache for port df28957b-3291-47e1-8119-50e815a35337 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:38:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:33.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.297 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.407 2 DEBUG nova.storage.rbd_utils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] resizing rbd image 1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.450 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Acquiring lock "refresh_cache-1a3876d4-5b93-40b5-96f1-ba2c2a27428b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.528 2 WARNING neutronclient.v2_0.client [req-3c8920f5-a934-4385-872b-ab56c29f7ee2 req-34bac09a-32c4-4572-82d3-afb20322f5a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:38:33 compute-1 ceph-mon[75484]: pgmap v1800: 353 pgs: 353 active+clean; 88 MiB data, 351 MiB used, 40 GiB / 40 GiB avail; 2.0 MiB/s rd, 14 KiB/s wr, 72 op/s
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.620 2 DEBUG nova.network.neutron [req-3c8920f5-a934-4385-872b-ab56c29f7ee2 req-34bac09a-32c4-4572-82d3-afb20322f5a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.748 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.749 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Ensure instance console log exists: /var/lib/nova/instances/1a3876d4-5b93-40b5-96f1-ba2c2a27428b/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.750 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.751 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.752 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:38:33 compute-1 sshd-session[293832]: Failed password for root from 192.210.160.141 port 55310 ssh2
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.798 2 DEBUG nova.network.neutron [req-3c8920f5-a934-4385-872b-ab56c29f7ee2 req-34bac09a-32c4-4572-82d3-afb20322f5a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:38:33 compute-1 nova_compute[238822]: 2025-09-30 18:38:33.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:34 compute-1 sshd-session[293835]: Failed password for root from 14.225.167.110 port 53390 ssh2
Sep 30 18:38:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:38:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:38:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:34.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:38:34 compute-1 nova_compute[238822]: 2025-09-30 18:38:34.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:34 compute-1 nova_compute[238822]: 2025-09-30 18:38:34.318 2 DEBUG oslo_concurrency.lockutils [req-3c8920f5-a934-4385-872b-ab56c29f7ee2 req-34bac09a-32c4-4572-82d3-afb20322f5a9 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-1a3876d4-5b93-40b5-96f1-ba2c2a27428b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:38:34 compute-1 nova_compute[238822]: 2025-09-30 18:38:34.319 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Acquired lock "refresh_cache-1a3876d4-5b93-40b5-96f1-ba2c2a27428b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:38:34 compute-1 nova_compute[238822]: 2025-09-30 18:38:34.320 2 DEBUG nova.network.neutron [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:38:34 compute-1 sudo[294008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:38:34 compute-1 sudo[294008]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:38:34 compute-1 sudo[294008]: pam_unix(sudo:session): session closed for user root
Sep 30 18:38:34 compute-1 podman[294032]: 2025-09-30 18:38:34.71349754 +0000 UTC m=+0.078486630 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 18:38:34 compute-1 nova_compute[238822]: 2025-09-30 18:38:34.910 2 DEBUG nova.network.neutron [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:38:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.122 2 WARNING neutronclient.v2_0.client [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:38:35 compute-1 sshd-session[293832]: Connection closed by authenticating user root 192.210.160.141 port 55310 [preauth]
Sep 30 18:38:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:35.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.290 2 DEBUG nova.network.neutron [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Updating instance_info_cache with network_info: [{"id": "df28957b-3291-47e1-8119-50e815a35337", "address": "fa:16:3e:3e:da:3f", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf28957b-32", "ovs_interfaceid": "df28957b-3291-47e1-8119-50e815a35337", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:38:35 compute-1 ceph-mon[75484]: pgmap v1801: 353 pgs: 353 active+clean; 167 MiB data, 391 MiB used, 40 GiB / 40 GiB avail; 2.3 MiB/s rd, 4.1 MiB/s wr, 163 op/s
Sep 30 18:38:35 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:38:35 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:38:35 compute-1 sshd-session[293835]: Received disconnect from 14.225.167.110 port 53390:11: Bye Bye [preauth]
Sep 30 18:38:35 compute-1 sshd-session[293835]: Disconnected from authenticating user root 14.225.167.110 port 53390 [preauth]
Sep 30 18:38:35 compute-1 podman[249638]: time="2025-09-30T18:38:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:38:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:38:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:38:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:38:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8360 "" "Go-http-client/1.1"
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.802 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Releasing lock "refresh_cache-1a3876d4-5b93-40b5-96f1-ba2c2a27428b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.803 2 DEBUG nova.compute.manager [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Instance network_info: |[{"id": "df28957b-3291-47e1-8119-50e815a35337", "address": "fa:16:3e:3e:da:3f", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf28957b-32", "ovs_interfaceid": "df28957b-3291-47e1-8119-50e815a35337", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.807 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Start _get_guest_xml network_info=[{"id": "df28957b-3291-47e1-8119-50e815a35337", "address": "fa:16:3e:3e:da:3f", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf28957b-32", "ovs_interfaceid": "df28957b-3291-47e1-8119-50e815a35337", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.813 2 WARNING nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.816 2 DEBUG nova.virt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1699442290', uuid='1a3876d4-5b93-40b5-96f1-ba2c2a27428b'), owner=OwnerMeta(userid='eda3e60f66494c8682f36b8a8fa20793', username='tempest-TestExecuteVmWorkloadBalanceStrategy-765295423-project-admin', projectid='003b1a96324d40b683381237c3cec243', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-765295423'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "df28957b-3291-47e1-8119-50e815a35337", "address": "fa:16:3e:3e:da:3f", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf28957b-32", "ovs_interfaceid": 
"df28957b-3291-47e1-8119-50e815a35337", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759257515.8163824) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.822 2 DEBUG nova.virt.libvirt.host [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.823 2 DEBUG nova.virt.libvirt.host [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.827 2 DEBUG nova.virt.libvirt.host [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.827 2 DEBUG nova.virt.libvirt.host [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.828 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.828 2 DEBUG nova.virt.hardware [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.829 2 DEBUG nova.virt.hardware [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.829 2 DEBUG nova.virt.hardware [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.830 2 DEBUG nova.virt.hardware [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.830 2 DEBUG nova.virt.hardware [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.831 2 DEBUG nova.virt.hardware [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.831 2 DEBUG nova.virt.hardware [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.832 2 DEBUG nova.virt.hardware [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.832 2 DEBUG nova.virt.hardware [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.832 2 DEBUG nova.virt.hardware [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.833 2 DEBUG nova.virt.hardware [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:38:35 compute-1 nova_compute[238822]: 2025-09-30 18:38:35.839 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:38:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:36.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:38:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2887908528' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:38:36 compute-1 nova_compute[238822]: 2025-09-30 18:38:36.319 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:38:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2887908528' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:38:36 compute-1 nova_compute[238822]: 2025-09-30 18:38:36.372 2 DEBUG nova.storage.rbd_utils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] rbd image 1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:38:36 compute-1 nova_compute[238822]: 2025-09-30 18:38:36.377 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:38:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:38:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/405714817' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:38:36 compute-1 nova_compute[238822]: 2025-09-30 18:38:36.832 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:38:36 compute-1 nova_compute[238822]: 2025-09-30 18:38:36.835 2 DEBUG nova.virt.libvirt.vif [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:38:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1699442290',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1699442290',id=29,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='003b1a96324d40b683381237c3cec243',ramdisk_id='',reservation_id='r-j0scsr69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-765295423',owner_user_name='tempest-T
estExecuteVmWorkloadBalanceStrategy-765295423-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:38:31Z,user_data=None,user_id='eda3e60f66494c8682f36b8a8fa20793',uuid=1a3876d4-5b93-40b5-96f1-ba2c2a27428b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df28957b-3291-47e1-8119-50e815a35337", "address": "fa:16:3e:3e:da:3f", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf28957b-32", "ovs_interfaceid": "df28957b-3291-47e1-8119-50e815a35337", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:38:36 compute-1 nova_compute[238822]: 2025-09-30 18:38:36.836 2 DEBUG nova.network.os_vif_util [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Converting VIF {"id": "df28957b-3291-47e1-8119-50e815a35337", "address": "fa:16:3e:3e:da:3f", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf28957b-32", "ovs_interfaceid": "df28957b-3291-47e1-8119-50e815a35337", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:38:36 compute-1 nova_compute[238822]: 2025-09-30 18:38:36.837 2 DEBUG nova.network.os_vif_util [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:da:3f,bridge_name='br-int',has_traffic_filtering=True,id=df28957b-3291-47e1-8119-50e815a35337,network=Network(e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf28957b-32') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:38:36 compute-1 nova_compute[238822]: 2025-09-30 18:38:36.840 2 DEBUG nova.objects.instance [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1a3876d4-5b93-40b5-96f1-ba2c2a27428b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:38:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:37.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.352 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:38:37 compute-1 nova_compute[238822]:   <uuid>1a3876d4-5b93-40b5-96f1-ba2c2a27428b</uuid>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   <name>instance-0000001d</name>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1699442290</nova:name>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:38:35</nova:creationTime>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:38:37 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:38:37 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:user uuid="eda3e60f66494c8682f36b8a8fa20793">tempest-TestExecuteVmWorkloadBalanceStrategy-765295423-project-admin</nova:user>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:project uuid="003b1a96324d40b683381237c3cec243">tempest-TestExecuteVmWorkloadBalanceStrategy-765295423</nova:project>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <nova:port uuid="df28957b-3291-47e1-8119-50e815a35337">
Sep 30 18:38:37 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <system>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <entry name="serial">1a3876d4-5b93-40b5-96f1-ba2c2a27428b</entry>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <entry name="uuid">1a3876d4-5b93-40b5-96f1-ba2c2a27428b</entry>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     </system>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   <os>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   </os>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   <features>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   </features>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk">
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       </source>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk.config">
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       </source>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:38:37 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:3e:da:3f"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <target dev="tapdf28957b-32"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/1a3876d4-5b93-40b5-96f1-ba2c2a27428b/console.log" append="off"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <video>
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     </video>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:38:37 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:38:37 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:38:37 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:38:37 compute-1 nova_compute[238822]: </domain>
Sep 30 18:38:37 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.355 2 DEBUG nova.compute.manager [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Preparing to wait for external event network-vif-plugged-df28957b-3291-47e1-8119-50e815a35337 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.356 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Acquiring lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.357 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.357 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.359 2 DEBUG nova.virt.libvirt.vif [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:38:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1699442290',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1699442290',id=29,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='003b1a96324d40b683381237c3cec243',ramdisk_id='',reservation_id='r-j0scsr69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-765295423',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-765295423-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:38:31Z,user_data=None,user_id='eda3e60f66494c8682f36b8a8fa20793',uuid=1a3876d4-5b93-40b5-96f1-ba2c2a27428b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df28957b-3291-47e1-8119-50e815a35337", "address": "fa:16:3e:3e:da:3f", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf28957b-32", "ovs_interfaceid": "df28957b-3291-47e1-8119-50e815a35337", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.360 2 DEBUG nova.network.os_vif_util [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Converting VIF {"id": "df28957b-3291-47e1-8119-50e815a35337", "address": "fa:16:3e:3e:da:3f", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf28957b-32", "ovs_interfaceid": "df28957b-3291-47e1-8119-50e815a35337", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.361 2 DEBUG nova.network.os_vif_util [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:da:3f,bridge_name='br-int',has_traffic_filtering=True,id=df28957b-3291-47e1-8119-50e815a35337,network=Network(e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf28957b-32') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.362 2 DEBUG os_vif [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:da:3f,bridge_name='br-int',has_traffic_filtering=True,id=df28957b-3291-47e1-8119-50e815a35337,network=Network(e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf28957b-32') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:38:37 compute-1 ceph-mon[75484]: pgmap v1802: 353 pgs: 353 active+clean; 167 MiB data, 391 MiB used, 40 GiB / 40 GiB avail; 352 KiB/s rd, 4.1 MiB/s wr, 91 op/s
Sep 30 18:38:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3144201013' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:38:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3144201013' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:38:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/405714817' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:38:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.364 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.365 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.367 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'de766067-83ef-5c2a-9621-91c286e42c72', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.377 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf28957b-32, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.377 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapdf28957b-32, col_values=(('qos', UUID('38ec8e78-d2d7-4908-a3d6-b6e7c37ea2e1')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.378 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapdf28957b-32, col_values=(('external_ids', {'iface-id': 'df28957b-3291-47e1-8119-50e815a35337', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3e:da:3f', 'vm-uuid': '1a3876d4-5b93-40b5-96f1-ba2c2a27428b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:37 compute-1 NetworkManager[45549]: <info>  [1759257517.3824] manager: (tapdf28957b-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:37 compute-1 nova_compute[238822]: 2025-09-30 18:38:37.389 2 INFO os_vif [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:da:3f,bridge_name='br-int',has_traffic_filtering=True,id=df28957b-3291-47e1-8119-50e815a35337,network=Network(e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf28957b-32')
Sep 30 18:38:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:38.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:38 compute-1 sudo[294120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:38:38 compute-1 sudo[294120]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:38:38 compute-1 sudo[294120]: pam_unix(sudo:session): session closed for user root
Sep 30 18:38:38 compute-1 nova_compute[238822]: 2025-09-30 18:38:38.943 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:38:38 compute-1 nova_compute[238822]: 2025-09-30 18:38:38.943 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:38:38 compute-1 nova_compute[238822]: 2025-09-30 18:38:38.944 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] No VIF found with MAC fa:16:3e:3e:da:3f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:38:38 compute-1 nova_compute[238822]: 2025-09-30 18:38:38.945 2 INFO nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Using config drive
Sep 30 18:38:38 compute-1 nova_compute[238822]: 2025-09-30 18:38:38.985 2 DEBUG nova.storage.rbd_utils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] rbd image 1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:38:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:38:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:39.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:39 compute-1 nova_compute[238822]: 2025-09-30 18:38:39.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:39 compute-1 ceph-mon[75484]: pgmap v1803: 353 pgs: 353 active+clean; 167 MiB data, 391 MiB used, 40 GiB / 40 GiB avail; 352 KiB/s rd, 4.1 MiB/s wr, 91 op/s
Sep 30 18:38:39 compute-1 nova_compute[238822]: 2025-09-30 18:38:39.505 2 WARNING neutronclient.v2_0.client [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:38:39 compute-1 nova_compute[238822]: 2025-09-30 18:38:39.766 2 INFO nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Creating config drive at /var/lib/nova/instances/1a3876d4-5b93-40b5-96f1-ba2c2a27428b/disk.config
Sep 30 18:38:39 compute-1 nova_compute[238822]: 2025-09-30 18:38:39.779 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1a3876d4-5b93-40b5-96f1-ba2c2a27428b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmplmynag2r execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:38:39 compute-1 nova_compute[238822]: 2025-09-30 18:38:39.935 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1a3876d4-5b93-40b5-96f1-ba2c2a27428b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmplmynag2r" returned: 0 in 0.156s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:38:39 compute-1 nova_compute[238822]: 2025-09-30 18:38:39.989 2 DEBUG nova.storage.rbd_utils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] rbd image 1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:38:39 compute-1 nova_compute[238822]: 2025-09-30 18:38:39.996 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1a3876d4-5b93-40b5-96f1-ba2c2a27428b/disk.config 1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:38:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.225 2 DEBUG oslo_concurrency.processutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1a3876d4-5b93-40b5-96f1-ba2c2a27428b/disk.config 1a3876d4-5b93-40b5-96f1-ba2c2a27428b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.229s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.227 2 INFO nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Deleting local config drive /var/lib/nova/instances/1a3876d4-5b93-40b5-96f1-ba2c2a27428b/disk.config because it was imported into RBD.
Sep 30 18:38:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000053s ======
Sep 30 18:38:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:40.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Sep 30 18:38:40 compute-1 kernel: tapdf28957b-32: entered promiscuous mode
Sep 30 18:38:40 compute-1 NetworkManager[45549]: <info>  [1759257520.3158] manager: (tapdf28957b-32): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Sep 30 18:38:40 compute-1 ovn_controller[135204]: 2025-09-30T18:38:40Z|00239|binding|INFO|Claiming lport df28957b-3291-47e1-8119-50e815a35337 for this chassis.
Sep 30 18:38:40 compute-1 ovn_controller[135204]: 2025-09-30T18:38:40Z|00240|binding|INFO|df28957b-3291-47e1-8119-50e815a35337: Claiming fa:16:3e:3e:da:3f 10.100.0.5
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.343 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:da:3f 10.100.0.5'], port_security=['fa:16:3e:3e:da:3f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '1a3876d4-5b93-40b5-96f1-ba2c2a27428b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '003b1a96324d40b683381237c3cec243', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a12466c2-b0c7-418c-b73a-38db6de1f821', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aef43a77-fc58-48dd-8195-5e83e09646ef, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=df28957b-3291-47e1-8119-50e815a35337) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.344 144543 INFO neutron.agent.ovn.metadata.agent [-] Port df28957b-3291-47e1-8119-50e815a35337 in datapath e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4 bound to our chassis
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.347 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.374 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[356416cc-aeaf-4f5b-969d-1b20f82c33f5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.375 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape214ea0f-11 in ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:38:40 compute-1 systemd-udevd[294218]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.380 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape214ea0f-10 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.381 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[21cd6732-6124-4248-9a19-6f4e48b6f36b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.382 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[9326f02e-7e59-44bc-a63c-18efe36a8108]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 systemd-machined[195911]: New machine qemu-22-instance-0000001d.
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.404 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[40e8d4c6-833d-440f-a833-c0299c3454b6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 NetworkManager[45549]: <info>  [1759257520.4070] device (tapdf28957b-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:38:40 compute-1 NetworkManager[45549]: <info>  [1759257520.4096] device (tapdf28957b-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:38:40 compute-1 systemd[1]: Started Virtual Machine qemu-22-instance-0000001d.
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:40 compute-1 ovn_controller[135204]: 2025-09-30T18:38:40Z|00241|binding|INFO|Setting lport df28957b-3291-47e1-8119-50e815a35337 ovn-installed in OVS
Sep 30 18:38:40 compute-1 ovn_controller[135204]: 2025-09-30T18:38:40Z|00242|binding|INFO|Setting lport df28957b-3291-47e1-8119-50e815a35337 up in Southbound
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.426 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[798f19c9-e147-4a55-8fc7-bdd75a947348]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.485 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[4fbc38f1-8f27-4ffa-a7ff-1e6cedaefeee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.492 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[dfabf60f-f78d-4098-bc15-d3ddd20570f1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 systemd-udevd[294223]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:38:40 compute-1 NetworkManager[45549]: <info>  [1759257520.4951] manager: (tape214ea0f-10): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.550 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[23da4c1b-bf32-4886-ac6b-1206a07d440d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.553 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[75e7c925-de97-4a49-96e9-08b4e89056e8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.575 2 DEBUG nova.compute.manager [req-ff091b08-b642-43ed-b8b5-729ce0b6ba76 req-7fd01940-0a90-4a72-9a57-56d60d29af3b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Received event network-vif-plugged-df28957b-3291-47e1-8119-50e815a35337 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.576 2 DEBUG oslo_concurrency.lockutils [req-ff091b08-b642-43ed-b8b5-729ce0b6ba76 req-7fd01940-0a90-4a72-9a57-56d60d29af3b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.576 2 DEBUG oslo_concurrency.lockutils [req-ff091b08-b642-43ed-b8b5-729ce0b6ba76 req-7fd01940-0a90-4a72-9a57-56d60d29af3b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.577 2 DEBUG oslo_concurrency.lockutils [req-ff091b08-b642-43ed-b8b5-729ce0b6ba76 req-7fd01940-0a90-4a72-9a57-56d60d29af3b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.577 2 DEBUG nova.compute.manager [req-ff091b08-b642-43ed-b8b5-729ce0b6ba76 req-7fd01940-0a90-4a72-9a57-56d60d29af3b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Processing event network-vif-plugged-df28957b-3291-47e1-8119-50e815a35337 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:38:40 compute-1 NetworkManager[45549]: <info>  [1759257520.5989] device (tape214ea0f-10): carrier: link connected
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.612 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[dea5202c-13e9-4c54-ae2d-45dd01edd651]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.640 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e14a380e-66c4-4389-a484-725b93fcade4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape214ea0f-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:e3:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1536217, 'reachable_time': 32368, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294251, 'error': None, 'target': 'ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.677 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8a5463f6-3506-4eb2-a77a-29973897a6e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:e3e0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1536217, 'tstamp': 1536217}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294252, 'error': None, 'target': 'ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.709 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[442db82f-4177-409e-9585-1b0a95140613]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape214ea0f-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:e3:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1536217, 'reachable_time': 32368, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294253, 'error': None, 'target': 'ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.774 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[06d4ee0b-8402-487c-9c34-68a0ab8487b8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.883 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[399076ba-9458-4be0-9710-ff70934a16d1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.886 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape214ea0f-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.887 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.887 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape214ea0f-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:40 compute-1 kernel: tape214ea0f-10: entered promiscuous mode
Sep 30 18:38:40 compute-1 NetworkManager[45549]: <info>  [1759257520.8913] manager: (tape214ea0f-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.894 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape214ea0f-10, col_values=(('external_ids', {'iface-id': '475706b8-809c-4da9-92ac-7152f6d17fbe'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:38:40 compute-1 ovn_controller[135204]: 2025-09-30T18:38:40Z|00243|binding|INFO|Releasing lport 475706b8-809c-4da9-92ac-7152f6d17fbe from this chassis (sb_readonly=0)
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.899 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3c873517-1642-4af0-ba30-7ed26c153aae]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.900 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.901 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.901 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.901 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.905 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b3631dcb-2df8-43eb-a2fe-ee010dd2a7b3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.906 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.907 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[23c80a04-6ac1-4414-bfc7-732e56978b6b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.908 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.pid.haproxy
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:38:40 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:40.908 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'env', 'PROCESS_TAG=haproxy-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:38:40 compute-1 nova_compute[238822]: 2025-09-30 18:38:40.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:41.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:41 compute-1 ceph-mon[75484]: pgmap v1804: 353 pgs: 353 active+clean; 167 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 354 KiB/s rd, 4.1 MiB/s wr, 94 op/s
Sep 30 18:38:41 compute-1 nova_compute[238822]: 2025-09-30 18:38:41.465 2 DEBUG nova.compute.manager [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:38:41 compute-1 nova_compute[238822]: 2025-09-30 18:38:41.472 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:38:41 compute-1 podman[294328]: 2025-09-30 18:38:41.476703883 +0000 UTC m=+0.105535177 container create bf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Sep 30 18:38:41 compute-1 nova_compute[238822]: 2025-09-30 18:38:41.477 2 INFO nova.virt.libvirt.driver [-] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Instance spawned successfully.
Sep 30 18:38:41 compute-1 nova_compute[238822]: 2025-09-30 18:38:41.478 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:38:41 compute-1 podman[294328]: 2025-09-30 18:38:41.418238532 +0000 UTC m=+0.047069916 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:38:41 compute-1 systemd[1]: Started libpod-conmon-bf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9.scope.
Sep 30 18:38:41 compute-1 podman[294343]: 2025-09-30 18:38:41.581530161 +0000 UTC m=+0.105495537 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 18:38:41 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:38:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/566643a63bb6527a5838286a3760d66bf2c984f944566c5419edf508ee9de7c4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:38:41 compute-1 podman[294342]: 2025-09-30 18:38:41.610292064 +0000 UTC m=+0.115117875 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 18:38:41 compute-1 podman[294328]: 2025-09-30 18:38:41.621067133 +0000 UTC m=+0.249898457 container init bf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 18:38:41 compute-1 podman[294328]: 2025-09-30 18:38:41.628086752 +0000 UTC m=+0.256918046 container start bf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:38:41 compute-1 podman[294341]: 2025-09-30 18:38:41.632091229 +0000 UTC m=+0.134992239 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, container_name=iscsid)
Sep 30 18:38:41 compute-1 neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4[294372]: [NOTICE]   (294400) : New worker (294402) forked
Sep 30 18:38:41 compute-1 neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4[294372]: [NOTICE]   (294400) : Loading success.
Sep 30 18:38:41 compute-1 nova_compute[238822]: 2025-09-30 18:38:41.998 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:41.999 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:42.000 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:42.001 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:42.002 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:42.003 2 DEBUG nova.virt.libvirt.driver [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:38:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:42.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:42.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:42.520 2 INFO nova.compute.manager [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Took 10.02 seconds to spawn the instance on the hypervisor.
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:42.520 2 DEBUG nova.compute.manager [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:42.664 2 DEBUG nova.compute.manager [req-97994c2d-21bc-4aab-90f7-108aa9aebe42 req-1786808c-0b7a-4468-9eab-e4ab90ddd954 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Received event network-vif-plugged-df28957b-3291-47e1-8119-50e815a35337 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:42.665 2 DEBUG oslo_concurrency.lockutils [req-97994c2d-21bc-4aab-90f7-108aa9aebe42 req-1786808c-0b7a-4468-9eab-e4ab90ddd954 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:42.666 2 DEBUG oslo_concurrency.lockutils [req-97994c2d-21bc-4aab-90f7-108aa9aebe42 req-1786808c-0b7a-4468-9eab-e4ab90ddd954 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:42.666 2 DEBUG oslo_concurrency.lockutils [req-97994c2d-21bc-4aab-90f7-108aa9aebe42 req-1786808c-0b7a-4468-9eab-e4ab90ddd954 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:42.667 2 DEBUG nova.compute.manager [req-97994c2d-21bc-4aab-90f7-108aa9aebe42 req-1786808c-0b7a-4468-9eab-e4ab90ddd954 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] No waiting events found dispatching network-vif-plugged-df28957b-3291-47e1-8119-50e815a35337 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:38:42 compute-1 nova_compute[238822]: 2025-09-30 18:38:42.667 2 WARNING nova.compute.manager [req-97994c2d-21bc-4aab-90f7-108aa9aebe42 req-1786808c-0b7a-4468-9eab-e4ab90ddd954 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Received unexpected event network-vif-plugged-df28957b-3291-47e1-8119-50e815a35337 for instance with vm_state active and task_state None.
Sep 30 18:38:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:43 compute-1 nova_compute[238822]: 2025-09-30 18:38:43.065 2 INFO nova.compute.manager [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Took 15.80 seconds to build instance.
Sep 30 18:38:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:43.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:43 compute-1 ceph-mon[75484]: pgmap v1805: 353 pgs: 353 active+clean; 167 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 338 KiB/s rd, 3.9 MiB/s wr, 89 op/s
Sep 30 18:38:43 compute-1 nova_compute[238822]: 2025-09-30 18:38:43.580 2 DEBUG oslo_concurrency.lockutils [None req-9d4d148f-a33d-41be-9695-f961f15bd6e4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.339s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:38:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:38:44 compute-1 nova_compute[238822]: 2025-09-30 18:38:44.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:44.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:45.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:45 compute-1 ceph-mon[75484]: pgmap v1806: 353 pgs: 353 active+clean; 167 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 163 op/s
Sep 30 18:38:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:38:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:46.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:38:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:47.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:47 compute-1 nova_compute[238822]: 2025-09-30 18:38:47.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:47 compute-1 ceph-mon[75484]: pgmap v1807: 353 pgs: 353 active+clean; 167 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 25 KiB/s wr, 76 op/s
Sep 30 18:38:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:48 compute-1 nova_compute[238822]: 2025-09-30 18:38:48.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:38:48 compute-1 nova_compute[238822]: 2025-09-30 18:38:48.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:38:48 compute-1 nova_compute[238822]: 2025-09-30 18:38:48.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:38:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:48.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:48 compute-1 nova_compute[238822]: 2025-09-30 18:38:48.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:38:48 compute-1 nova_compute[238822]: 2025-09-30 18:38:48.575 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:38:48 compute-1 nova_compute[238822]: 2025-09-30 18:38:48.576 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:38:48 compute-1 nova_compute[238822]: 2025-09-30 18:38:48.576 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:38:48 compute-1 nova_compute[238822]: 2025-09-30 18:38:48.577 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:38:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:38:49 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1976696295' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:38:49 compute-1 nova_compute[238822]: 2025-09-30 18:38:49.071 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:38:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:38:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:49.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:49 compute-1 nova_compute[238822]: 2025-09-30 18:38:49.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:49 compute-1 openstack_network_exporter[251957]: ERROR   18:38:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:38:49 compute-1 openstack_network_exporter[251957]: ERROR   18:38:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:38:49 compute-1 openstack_network_exporter[251957]: ERROR   18:38:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:38:49 compute-1 openstack_network_exporter[251957]: ERROR   18:38:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:38:49 compute-1 openstack_network_exporter[251957]: ERROR   18:38:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:38:49 compute-1 ceph-mon[75484]: pgmap v1808: 353 pgs: 353 active+clean; 167 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 25 KiB/s wr, 76 op/s
Sep 30 18:38:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1976696295' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:38:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:50 compute-1 nova_compute[238822]: 2025-09-30 18:38:50.217 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:38:50 compute-1 nova_compute[238822]: 2025-09-30 18:38:50.218 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:38:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:50.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:50 compute-1 nova_compute[238822]: 2025-09-30 18:38:50.444 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:38:50 compute-1 nova_compute[238822]: 2025-09-30 18:38:50.446 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:38:50 compute-1 nova_compute[238822]: 2025-09-30 18:38:50.483 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:38:50 compute-1 nova_compute[238822]: 2025-09-30 18:38:50.484 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4555MB free_disk=39.925785064697266GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:38:50 compute-1 nova_compute[238822]: 2025-09-30 18:38:50.485 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:38:50 compute-1 nova_compute[238822]: 2025-09-30 18:38:50.486 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:38:50 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3195171026' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:38:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:38:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:51.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:38:51 compute-1 nova_compute[238822]: 2025-09-30 18:38:51.553 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 1a3876d4-5b93-40b5-96f1-ba2c2a27428b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:38:51 compute-1 nova_compute[238822]: 2025-09-30 18:38:51.554 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:38:51 compute-1 nova_compute[238822]: 2025-09-30 18:38:51.555 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:38:50 up  4:16,  0 user,  load average: 1.42, 0.62, 0.62\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_003b1a96324d40b683381237c3cec243': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:38:51 compute-1 nova_compute[238822]: 2025-09-30 18:38:51.588 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:38:51 compute-1 ceph-mon[75484]: pgmap v1809: 353 pgs: 353 active+clean; 167 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 28 KiB/s wr, 77 op/s
Sep 30 18:38:52 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:38:52 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2835959614' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:38:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:52 compute-1 nova_compute[238822]: 2025-09-30 18:38:52.065 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:38:52 compute-1 nova_compute[238822]: 2025-09-30 18:38:52.074 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:38:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:52.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:52 compute-1 nova_compute[238822]: 2025-09-30 18:38:52.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:52 compute-1 nova_compute[238822]: 2025-09-30 18:38:52.601 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:38:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2835959614' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:38:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:38:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:53 compute-1 nova_compute[238822]: 2025-09-30 18:38:53.115 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:38:53 compute-1 nova_compute[238822]: 2025-09-30 18:38:53.116 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.630s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:38:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:53.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:53 compute-1 ceph-mon[75484]: pgmap v1810: 353 pgs: 353 active+clean; 167 MiB data, 397 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 17 KiB/s wr, 74 op/s
Sep 30 18:38:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/366044204' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:38:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:38:54 compute-1 nova_compute[238822]: 2025-09-30 18:38:54.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:54.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:54.407 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:38:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:54.407 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:38:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:38:54.408 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:38:54 compute-1 ceph-mon[75484]: pgmap v1811: 353 pgs: 353 active+clean; 188 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 119 op/s
Sep 30 18:38:54 compute-1 ovn_controller[135204]: 2025-09-30T18:38:54Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3e:da:3f 10.100.0.5
Sep 30 18:38:54 compute-1 ovn_controller[135204]: 2025-09-30T18:38:54Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3e:da:3f 10.100.0.5
Sep 30 18:38:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:55.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:56 compute-1 nova_compute[238822]: 2025-09-30 18:38:56.116 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:38:56 compute-1 nova_compute[238822]: 2025-09-30 18:38:56.117 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:38:56 compute-1 nova_compute[238822]: 2025-09-30 18:38:56.117 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:38:56 compute-1 nova_compute[238822]: 2025-09-30 18:38:56.118 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:38:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:56.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:56 compute-1 ceph-mon[75484]: pgmap v1812: 353 pgs: 353 active+clean; 188 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 234 KiB/s rd, 2.0 MiB/s wr, 45 op/s
Sep 30 18:38:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:38:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:57.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:38:57 compute-1 nova_compute[238822]: 2025-09-30 18:38:57.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:38:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2757991385' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:38:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2757991385' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:38:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:58 compute-1 sshd-session[294473]: Invalid user test from 192.210.160.141 port 43932
Sep 30 18:38:58 compute-1 sshd-session[294473]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:38:58 compute-1 sshd-session[294473]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:38:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:38:58.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:58 compute-1 sudo[294477]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:38:58 compute-1 sudo[294477]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:38:58 compute-1 sudo[294477]: pam_unix(sudo:session): session closed for user root
Sep 30 18:38:58 compute-1 podman[294502]: 2025-09-30 18:38:58.815196573 +0000 UTC m=+0.086499055 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:38:58 compute-1 podman[294501]: 2025-09-30 18:38:58.863002848 +0000 UTC m=+0.133467468 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 18:38:58 compute-1 ceph-mon[75484]: pgmap v1813: 353 pgs: 353 active+clean; 188 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 234 KiB/s rd, 2.0 MiB/s wr, 45 op/s
Sep 30 18:38:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:38:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:38:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:38:59 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 18:38:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:38:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:38:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:38:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:38:59.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:38:59 compute-1 nova_compute[238822]: 2025-09-30 18:38:59.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:00 compute-1 nova_compute[238822]: 2025-09-30 18:39:00.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:39:00 compute-1 sshd-session[294473]: Failed password for invalid user test from 192.210.160.141 port 43932 ssh2
Sep 30 18:39:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:00.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:00 compute-1 sshd-session[294473]: Connection closed by invalid user test 192.210.160.141 port 43932 [preauth]
Sep 30 18:39:00 compute-1 ceph-mon[75484]: pgmap v1814: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:39:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:01 compute-1 nova_compute[238822]: 2025-09-30 18:39:01.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:39:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:01.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:02.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:02 compute-1 nova_compute[238822]: 2025-09-30 18:39:02.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:03 compute-1 ceph-mon[75484]: pgmap v1815: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:39:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:03.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:39:04 compute-1 nova_compute[238822]: 2025-09-30 18:39:04.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:04.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:04 compute-1 sshd-session[294560]: Invalid user ho from 8.243.64.201 port 52966
Sep 30 18:39:04 compute-1 sshd-session[294560]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:39:04 compute-1 sshd-session[294560]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:39:04 compute-1 podman[294562]: 2025-09-30 18:39:04.924554624 +0000 UTC m=+0.083980798 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 18:39:05 compute-1 ceph-mon[75484]: pgmap v1816: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Sep 30 18:39:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:05.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:05 compute-1 podman[249638]: time="2025-09-30T18:39:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:39:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:39:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:39:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:39:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8825 "" "Go-http-client/1.1"
Sep 30 18:39:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:06.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:07 compute-1 ceph-mon[75484]: pgmap v1817: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 91 KiB/s rd, 105 KiB/s wr, 20 op/s
Sep 30 18:39:07 compute-1 sshd-session[294560]: Failed password for invalid user ho from 8.243.64.201 port 52966 ssh2
Sep 30 18:39:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:07.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:07 compute-1 nova_compute[238822]: 2025-09-30 18:39:07.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:39:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:08.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:09 compute-1 ceph-mon[75484]: pgmap v1818: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 91 KiB/s rd, 105 KiB/s wr, 20 op/s
Sep 30 18:39:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:39:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:09.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:09 compute-1 nova_compute[238822]: 2025-09-30 18:39:09.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:09 compute-1 sshd-session[294560]: Received disconnect from 8.243.64.201 port 52966:11: Bye Bye [preauth]
Sep 30 18:39:09 compute-1 sshd-session[294560]: Disconnected from invalid user ho 8.243.64.201 port 52966 [preauth]
Sep 30 18:39:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:10.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:10 compute-1 ovn_controller[135204]: 2025-09-30T18:39:10Z|00244|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Sep 30 18:39:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:11 compute-1 ceph-mon[75484]: pgmap v1819: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 92 KiB/s rd, 105 KiB/s wr, 20 op/s
Sep 30 18:39:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:11.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:12.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:12 compute-1 nova_compute[238822]: 2025-09-30 18:39:12.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:12 compute-1 podman[294591]: 2025-09-30 18:39:12.580199694 +0000 UTC m=+0.106009030 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Sep 30 18:39:12 compute-1 podman[294592]: 2025-09-30 18:39:12.584913991 +0000 UTC m=+0.102604499 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 18:39:12 compute-1 podman[294590]: 2025-09-30 18:39:12.592607038 +0000 UTC m=+0.124790205 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Sep 30 18:39:12 compute-1 nova_compute[238822]: 2025-09-30 18:39:12.762 2 DEBUG nova.virt.libvirt.driver [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Creating tmpfile /var/lib/nova/instances/tmpmgf9x058 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:39:12 compute-1 nova_compute[238822]: 2025-09-30 18:39:12.763 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:12 compute-1 nova_compute[238822]: 2025-09-30 18:39:12.836 2 DEBUG nova.compute.manager [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmgf9x058',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:39:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:13 compute-1 nova_compute[238822]: 2025-09-30 18:39:13.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:39:13 compute-1 nova_compute[238822]: 2025-09-30 18:39:13.058 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:13 compute-1 nova_compute[238822]: 2025-09-30 18:39:13.059 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:13 compute-1 nova_compute[238822]: 2025-09-30 18:39:13.059 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:13 compute-1 nova_compute[238822]: 2025-09-30 18:39:13.060 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:13 compute-1 nova_compute[238822]: 2025-09-30 18:39:13.060 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:13 compute-1 nova_compute[238822]: 2025-09-30 18:39:13.060 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:13 compute-1 ceph-mon[75484]: pgmap v1820: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:39:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:13.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.078 2 DEBUG nova.virt.libvirt.imagecache [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:314
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.078 2 DEBUG nova.virt.libvirt.imagecache [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Image id 5b99cbca-b655-4be5-8343-cf504005c42e yields fingerprint cb2d580238c9b109feae7f1462613dc547671457 _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:319
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.079 2 INFO nova.virt.libvirt.imagecache [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] image 5b99cbca-b655-4be5-8343-cf504005c42e at (/var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457): checking
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.079 2 DEBUG nova.virt.libvirt.imagecache [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] image 5b99cbca-b655-4be5-8343-cf504005c42e at (/var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457): image is in use _mark_in_use /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:279
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.080 2 INFO oslo.privsep.daemon [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp9bdz83dj/privsep.sock']
Sep 30 18:39:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:39:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:14.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.872 2 INFO oslo.privsep.daemon [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Spawned new privsep daemon via rootwrap
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.721 6471 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.727 6471 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.730 6471 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.730 6471 INFO oslo.privsep.daemon [-] privsep daemon running as pid 6471
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.881 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.980 2 DEBUG nova.virt.libvirt.imagecache [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:319
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.981 2 DEBUG nova.virt.libvirt.imagecache [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] 1a3876d4-5b93-40b5-96f1-ba2c2a27428b is a valid instance name _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:126
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.981 2 INFO nova.virt.libvirt.imagecache [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Active base files: /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.981 2 DEBUG nova.virt.libvirt.imagecache [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:350
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.982 2 DEBUG nova.virt.libvirt.imagecache [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:299
Sep 30 18:39:14 compute-1 nova_compute[238822]: 2025-09-30 18:39:14.982 2 DEBUG nova.virt.libvirt.imagecache [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:284
Sep 30 18:39:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:15 compute-1 ceph-mon[75484]: pgmap v1821: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 16 KiB/s wr, 1 op/s
Sep 30 18:39:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:39:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:15.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:39:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:16.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:17.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:17 compute-1 ceph-mon[75484]: pgmap v1822: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:39:17 compute-1 nova_compute[238822]: 2025-09-30 18:39:17.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:18.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:18 compute-1 sudo[294665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:39:18 compute-1 sudo[294665]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:39:18 compute-1 sudo[294665]: pam_unix(sudo:session): session closed for user root
Sep 30 18:39:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:39:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:19.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:19 compute-1 ceph-mon[75484]: pgmap v1823: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:39:19 compute-1 nova_compute[238822]: 2025-09-30 18:39:19.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:19 compute-1 nova_compute[238822]: 2025-09-30 18:39:19.396 2 DEBUG nova.compute.manager [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmgf9x058',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4f91975d-d44b-46af-9879-dbf7a693fbd2',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:39:19 compute-1 openstack_network_exporter[251957]: ERROR   18:39:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:39:19 compute-1 openstack_network_exporter[251957]: ERROR   18:39:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:39:19 compute-1 openstack_network_exporter[251957]: ERROR   18:39:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:39:19 compute-1 openstack_network_exporter[251957]: ERROR   18:39:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:39:19 compute-1 openstack_network_exporter[251957]: ERROR   18:39:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:39:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:20.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:20 compute-1 nova_compute[238822]: 2025-09-30 18:39:20.421 2 DEBUG oslo_concurrency.lockutils [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-4f91975d-d44b-46af-9879-dbf7a693fbd2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:39:20 compute-1 nova_compute[238822]: 2025-09-30 18:39:20.422 2 DEBUG oslo_concurrency.lockutils [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-4f91975d-d44b-46af-9879-dbf7a693fbd2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:39:20 compute-1 nova_compute[238822]: 2025-09-30 18:39:20.423 2 DEBUG nova.network.neutron [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:39:20 compute-1 nova_compute[238822]: 2025-09-30 18:39:20.930 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:21.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:21 compute-1 ceph-mon[75484]: pgmap v1824: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 9.1 KiB/s wr, 2 op/s
Sep 30 18:39:21 compute-1 nova_compute[238822]: 2025-09-30 18:39:21.880 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:22.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:22 compute-1 nova_compute[238822]: 2025-09-30 18:39:22.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:22 compute-1 nova_compute[238822]: 2025-09-30 18:39:22.683 2 DEBUG nova.network.neutron [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Updating instance_info_cache with network_info: [{"id": "6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e", "address": "fa:16:3e:6c:be:81", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e0ab88d-97", "ovs_interfaceid": "6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:39:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.191 2 DEBUG oslo_concurrency.lockutils [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-4f91975d-d44b-46af-9879-dbf7a693fbd2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.253 2 DEBUG nova.virt.libvirt.driver [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmgf9x058',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4f91975d-d44b-46af-9879-dbf7a693fbd2',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.254 2 DEBUG nova.virt.libvirt.driver [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Creating instance directory: /var/lib/nova/instances/4f91975d-d44b-46af-9879-dbf7a693fbd2 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.254 2 DEBUG nova.virt.libvirt.driver [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Ensure instance console log exists: /var/lib/nova/instances/4f91975d-d44b-46af-9879-dbf7a693fbd2/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.255 2 DEBUG nova.virt.libvirt.driver [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.257 2 DEBUG nova.virt.libvirt.vif [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:38:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-965053835',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-965053835',id=28,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:38:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='003b1a96324d40b683381237c3cec243',ramdisk_id='',reservation_id='r-x8rgjbtk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-765295423',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-765295423-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:38:21Z,user_data=None,user_id='eda3e60f66494c8682f36b8a8fa20793',uuid=4f91975d-d44b-46af-9879-dbf7a693fbd2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e", "address": "fa:16:3e:6c:be:81", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6e0ab88d-97", "ovs_interfaceid": "6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.258 2 DEBUG nova.network.os_vif_util [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e", "address": "fa:16:3e:6c:be:81", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6e0ab88d-97", "ovs_interfaceid": "6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.259 2 DEBUG nova.network.os_vif_util [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:be:81,bridge_name='br-int',has_traffic_filtering=True,id=6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e,network=Network(e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e0ab88d-97') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.259 2 DEBUG os_vif [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:be:81,bridge_name='br-int',has_traffic_filtering=True,id=6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e,network=Network(e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e0ab88d-97') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.261 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.261 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.263 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '7cb96a31-e69f-5c2d-84ad-c4f4dfd38576', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.270 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e0ab88d-97, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.270 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap6e0ab88d-97, col_values=(('qos', UUID('d06d406d-5400-4ff7-9e01-f6d074570a86')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.271 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap6e0ab88d-97, col_values=(('external_ids', {'iface-id': '6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:be:81', 'vm-uuid': '4f91975d-d44b-46af-9879-dbf7a693fbd2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:39:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:23.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:39:23 compute-1 NetworkManager[45549]: <info>  [1759257563.2749] manager: (tap6e0ab88d-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.284 2 INFO os_vif [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:be:81,bridge_name='br-int',has_traffic_filtering=True,id=6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e,network=Network(e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e0ab88d-97')
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.285 2 DEBUG nova.virt.libvirt.driver [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.285 2 DEBUG nova.compute.manager [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmgf9x058',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4f91975d-d44b-46af-9879-dbf7a693fbd2',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.286 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:23 compute-1 ceph-mon[75484]: pgmap v1825: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 9.1 KiB/s wr, 1 op/s
Sep 30 18:39:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.655 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:23 compute-1 nova_compute[238822]: 2025-09-30 18:39:23.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:23.925 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:39:23 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:23.928 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:39:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:39:24 compute-1 nova_compute[238822]: 2025-09-30 18:39:24.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:24.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:24 compute-1 nova_compute[238822]: 2025-09-30 18:39:24.652 2 DEBUG nova.network.neutron [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Port 6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:39:24 compute-1 nova_compute[238822]: 2025-09-30 18:39:24.668 2 DEBUG nova.compute.manager [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmgf9x058',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4f91975d-d44b-46af-9879-dbf7a693fbd2',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:39:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:25 compute-1 unix_chkpwd[294703]: password check failed for user (root)
Sep 30 18:39:25 compute-1 sshd-session[294698]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:39:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:25.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:25 compute-1 ceph-mon[75484]: pgmap v1826: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 9.1 KiB/s wr, 2 op/s
Sep 30 18:39:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:26.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:27 compute-1 sshd-session[294698]: Failed password for root from 192.210.160.141 port 53190 ssh2
Sep 30 18:39:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:27 compute-1 systemd[1]: Starting libvirt proxy daemon...
Sep 30 18:39:27 compute-1 systemd[1]: Started libvirt proxy daemon.
Sep 30 18:39:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:27.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:27 compute-1 kernel: tap6e0ab88d-97: entered promiscuous mode
Sep 30 18:39:27 compute-1 NetworkManager[45549]: <info>  [1759257567.2824] manager: (tap6e0ab88d-97): new Tun device (/org/freedesktop/NetworkManager/Devices/94)
Sep 30 18:39:27 compute-1 ovn_controller[135204]: 2025-09-30T18:39:27Z|00245|binding|INFO|Claiming lport 6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e for this additional chassis.
Sep 30 18:39:27 compute-1 ovn_controller[135204]: 2025-09-30T18:39:27Z|00246|binding|INFO|6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e: Claiming fa:16:3e:6c:be:81 10.100.0.11
Sep 30 18:39:27 compute-1 nova_compute[238822]: 2025-09-30 18:39:27.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.294 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:be:81 10.100.0.11'], port_security=['fa:16:3e:6c:be:81 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4f91975d-d44b-46af-9879-dbf7a693fbd2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '003b1a96324d40b683381237c3cec243', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'a12466c2-b0c7-418c-b73a-38db6de1f821', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aef43a77-fc58-48dd-8195-5e83e09646ef, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.296 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e in datapath e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4 unbound from our chassis
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.297 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4
Sep 30 18:39:27 compute-1 ovn_controller[135204]: 2025-09-30T18:39:27Z|00247|binding|INFO|Setting lport 6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e ovn-installed in OVS
Sep 30 18:39:27 compute-1 nova_compute[238822]: 2025-09-30 18:39:27.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:27 compute-1 nova_compute[238822]: 2025-09-30 18:39:27.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.323 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e16a6c5f-f355-4075-955a-c11d98f693ca]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:27 compute-1 systemd-machined[195911]: New machine qemu-23-instance-0000001c.
Sep 30 18:39:27 compute-1 systemd-udevd[294738]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:39:27 compute-1 systemd[1]: Started Virtual Machine qemu-23-instance-0000001c.
Sep 30 18:39:27 compute-1 ceph-mon[75484]: pgmap v1827: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 8.1 KiB/s wr, 1 op/s
Sep 30 18:39:27 compute-1 NetworkManager[45549]: <info>  [1759257567.3768] device (tap6e0ab88d-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.377 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[707d25ff-5e37-467f-a8cb-d674bb672dfb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:27 compute-1 NetworkManager[45549]: <info>  [1759257567.3798] device (tap6e0ab88d-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.381 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[f0acd1cf-d41b-4cb2-8ba6-23903813824e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.427 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9a5d95-c78d-4747-b2a6-9d2c9019152b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.458 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7613bbce-0102-44e6-8eef-766bbee36309]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape214ea0f-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:e3:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1536217, 'reachable_time': 17922, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294749, 'error': None, 'target': 'ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.485 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d42e1a12-a090-43db-a8d9-17cb65fb2d20]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape214ea0f-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1536239, 'tstamp': 1536239}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294751, 'error': None, 'target': 'ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape214ea0f-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1536245, 'tstamp': 1536245}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294751, 'error': None, 'target': 'ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.487 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape214ea0f-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:27 compute-1 nova_compute[238822]: 2025-09-30 18:39:27.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:27 compute-1 nova_compute[238822]: 2025-09-30 18:39:27.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.491 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape214ea0f-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.492 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.492 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape214ea0f-10, col_values=(('external_ids', {'iface-id': '475706b8-809c-4da9-92ac-7152f6d17fbe'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.493 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:39:27 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:27.495 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8ced743f-96da-4951-91f6-1083c13b7dac]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:28 compute-1 sshd-session[294698]: Connection closed by authenticating user root 192.210.160.141 port 53190 [preauth]
Sep 30 18:39:28 compute-1 nova_compute[238822]: 2025-09-30 18:39:28.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:28.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:39:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:29.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:29 compute-1 nova_compute[238822]: 2025-09-30 18:39:29.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:29 compute-1 ceph-mon[75484]: pgmap v1828: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 8.1 KiB/s wr, 1 op/s
Sep 30 18:39:29 compute-1 podman[294797]: 2025-09-30 18:39:29.590345335 +0000 UTC m=+0.118016023 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:39:29 compute-1 podman[294796]: 2025-09-30 18:39:29.602868271 +0000 UTC m=+0.138272057 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930)
Sep 30 18:39:29 compute-1 ovn_controller[135204]: 2025-09-30T18:39:29Z|00248|binding|INFO|Claiming lport 6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e for this chassis.
Sep 30 18:39:29 compute-1 ovn_controller[135204]: 2025-09-30T18:39:29Z|00249|binding|INFO|6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e: Claiming fa:16:3e:6c:be:81 10.100.0.11
Sep 30 18:39:29 compute-1 ovn_controller[135204]: 2025-09-30T18:39:29Z|00250|binding|INFO|Setting lport 6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e up in Southbound
Sep 30 18:39:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:30.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:30 compute-1 nova_compute[238822]: 2025-09-30 18:39:30.844 2 INFO nova.compute.manager [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Post operation of migration started
Sep 30 18:39:30 compute-1 nova_compute[238822]: 2025-09-30 18:39:30.845 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:30 compute-1 nova_compute[238822]: 2025-09-30 18:39:30.920 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:30 compute-1 nova_compute[238822]: 2025-09-30 18:39:30.920 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:30 compute-1 nova_compute[238822]: 2025-09-30 18:39:30.996 2 DEBUG oslo_concurrency.lockutils [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-4f91975d-d44b-46af-9879-dbf7a693fbd2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:39:30 compute-1 nova_compute[238822]: 2025-09-30 18:39:30.997 2 DEBUG oslo_concurrency.lockutils [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-4f91975d-d44b-46af-9879-dbf7a693fbd2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:39:30 compute-1 nova_compute[238822]: 2025-09-30 18:39:30.997 2 DEBUG nova.network.neutron [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:39:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:31.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:31 compute-1 ceph-mon[75484]: pgmap v1829: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 5.0 KiB/s rd, 11 KiB/s wr, 7 op/s
Sep 30 18:39:31 compute-1 nova_compute[238822]: 2025-09-30 18:39:31.504 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:31 compute-1 nova_compute[238822]: 2025-09-30 18:39:31.841 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:31 compute-1 nova_compute[238822]: 2025-09-30 18:39:31.986 2 DEBUG nova.network.neutron [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Updating instance_info_cache with network_info: [{"id": "6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e", "address": "fa:16:3e:6c:be:81", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e0ab88d-97", "ovs_interfaceid": "6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:39:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:32.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:32 compute-1 nova_compute[238822]: 2025-09-30 18:39:32.494 2 DEBUG oslo_concurrency.lockutils [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-4f91975d-d44b-46af-9879-dbf7a693fbd2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:39:33 compute-1 nova_compute[238822]: 2025-09-30 18:39:33.014 2 DEBUG oslo_concurrency.lockutils [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:33 compute-1 nova_compute[238822]: 2025-09-30 18:39:33.014 2 DEBUG oslo_concurrency.lockutils [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:33 compute-1 nova_compute[238822]: 2025-09-30 18:39:33.015 2 DEBUG oslo_concurrency.lockutils [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:33 compute-1 nova_compute[238822]: 2025-09-30 18:39:33.022 2 INFO nova.virt.libvirt.driver [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:39:33 compute-1 virtqemud[239124]: Domain id=23 name='instance-0000001c' uuid=4f91975d-d44b-46af-9879-dbf7a693fbd2 is tainted: custom-monitor
Sep 30 18:39:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:33 compute-1 nova_compute[238822]: 2025-09-30 18:39:33.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:33.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:33 compute-1 ceph-mon[75484]: pgmap v1830: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 2.5 KiB/s wr, 6 op/s
Sep 30 18:39:33 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:33.930 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:34 compute-1 nova_compute[238822]: 2025-09-30 18:39:34.031 2 INFO nova.virt.libvirt.driver [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:39:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:39:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:34.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:34 compute-1 nova_compute[238822]: 2025-09-30 18:39:34.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:34 compute-1 sudo[294848]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:39:34 compute-1 sudo[294848]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:39:34 compute-1 sudo[294848]: pam_unix(sudo:session): session closed for user root
Sep 30 18:39:34 compute-1 sudo[294873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Sep 30 18:39:34 compute-1 sudo[294873]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:39:35 compute-1 nova_compute[238822]: 2025-09-30 18:39:35.038 2 INFO nova.virt.libvirt.driver [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:39:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:35 compute-1 nova_compute[238822]: 2025-09-30 18:39:35.044 2 DEBUG nova.compute.manager [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:39:35 compute-1 podman[294898]: 2025-09-30 18:39:35.081431131 +0000 UTC m=+0.091765327 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 18:39:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:35.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:35 compute-1 sudo[294873]: pam_unix(sudo:session): session closed for user root
Sep 30 18:39:35 compute-1 ceph-mon[75484]: pgmap v1831: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 2.5 KiB/s wr, 6 op/s
Sep 30 18:39:35 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:39:35 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:39:35 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:39:35 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:39:35 compute-1 sudo[294942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:39:35 compute-1 sudo[294942]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:39:35 compute-1 sudo[294942]: pam_unix(sudo:session): session closed for user root
Sep 30 18:39:35 compute-1 nova_compute[238822]: 2025-09-30 18:39:35.556 2 DEBUG nova.objects.instance [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:39:35 compute-1 sudo[294967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:39:35 compute-1 sudo[294967]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:39:35 compute-1 podman[249638]: time="2025-09-30T18:39:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:39:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:39:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:39:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:39:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8826 "" "Go-http-client/1.1"
Sep 30 18:39:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:36.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:36 compute-1 sudo[294967]: pam_unix(sudo:session): session closed for user root
Sep 30 18:39:36 compute-1 sudo[295022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:39:36 compute-1 sudo[295022]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:39:36 compute-1 sudo[295022]: pam_unix(sudo:session): session closed for user root
Sep 30 18:39:36 compute-1 nova_compute[238822]: 2025-09-30 18:39:36.580 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:36 compute-1 sudo[295047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 63d32c6a-fa18-54ed-8711-9a3915cc367b -- inventory --format=json-pretty --filter-for-batch
Sep 30 18:39:36 compute-1 sudo[295047]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:39:36 compute-1 nova_compute[238822]: 2025-09-30 18:39:36.674 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:36 compute-1 nova_compute[238822]: 2025-09-30 18:39:36.674 2 WARNING neutronclient.v2_0.client [None req-a1cc36d7-36d8-4ec7-a1db-83dad0f9e880 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:37 compute-1 podman[295113]: 2025-09-30 18:39:37.115919719 +0000 UTC m=+0.058165505 container create 1a88b60227dd9ea52c9a32655d946685ef1a835a32e2e27beb90ee4b6ff13599 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_northcutt, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Sep 30 18:39:37 compute-1 podman[295113]: 2025-09-30 18:39:37.089606751 +0000 UTC m=+0.031852517 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 18:39:37 compute-1 systemd[1]: Started libpod-conmon-1a88b60227dd9ea52c9a32655d946685ef1a835a32e2e27beb90ee4b6ff13599.scope.
Sep 30 18:39:37 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:39:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:37.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:37 compute-1 podman[295113]: 2025-09-30 18:39:37.331361969 +0000 UTC m=+0.273607755 container init 1a88b60227dd9ea52c9a32655d946685ef1a835a32e2e27beb90ee4b6ff13599 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_northcutt, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Sep 30 18:39:37 compute-1 podman[295113]: 2025-09-30 18:39:37.345737835 +0000 UTC m=+0.287983621 container start 1a88b60227dd9ea52c9a32655d946685ef1a835a32e2e27beb90ee4b6ff13599 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_northcutt, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Sep 30 18:39:37 compute-1 nifty_northcutt[295129]: 167 167
Sep 30 18:39:37 compute-1 systemd[1]: libpod-1a88b60227dd9ea52c9a32655d946685ef1a835a32e2e27beb90ee4b6ff13599.scope: Deactivated successfully.
Sep 30 18:39:37 compute-1 conmon[295129]: conmon 1a88b60227dd9ea52c9a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-1a88b60227dd9ea52c9a32655d946685ef1a835a32e2e27beb90ee4b6ff13599.scope/container/memory.events
Sep 30 18:39:37 compute-1 podman[295113]: 2025-09-30 18:39:37.455300159 +0000 UTC m=+0.397545995 container attach 1a88b60227dd9ea52c9a32655d946685ef1a835a32e2e27beb90ee4b6ff13599 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_northcutt, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Sep 30 18:39:37 compute-1 podman[295113]: 2025-09-30 18:39:37.457442487 +0000 UTC m=+0.399688273 container died 1a88b60227dd9ea52c9a32655d946685ef1a835a32e2e27beb90ee4b6ff13599 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_northcutt, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Sep 30 18:39:37 compute-1 ceph-mon[75484]: pgmap v1832: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 2.5 KiB/s wr, 6 op/s
Sep 30 18:39:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2307088691' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:39:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2307088691' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:39:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:39:37 compute-1 systemd[1]: var-lib-containers-storage-overlay-cd6f6fcf907660d9b89fc372ca41749b0110b894569ba1f41acea18096ec3a2c-merged.mount: Deactivated successfully.
Sep 30 18:39:37 compute-1 podman[295113]: 2025-09-30 18:39:37.793483168 +0000 UTC m=+0.735728954 container remove 1a88b60227dd9ea52c9a32655d946685ef1a835a32e2e27beb90ee4b6ff13599 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_northcutt, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Sep 30 18:39:37 compute-1 systemd[1]: libpod-conmon-1a88b60227dd9ea52c9a32655d946685ef1a835a32e2e27beb90ee4b6ff13599.scope: Deactivated successfully.
Sep 30 18:39:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:38 compute-1 podman[295153]: 2025-09-30 18:39:38.145866909 +0000 UTC m=+0.105615330 container create bb0f14b939b3b2f573c24bc3348255487924b405e6b3f4dd58065315f835975f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=admiring_wescoff, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Sep 30 18:39:38 compute-1 podman[295153]: 2025-09-30 18:39:38.093110201 +0000 UTC m=+0.052858662 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Sep 30 18:39:38 compute-1 systemd[1]: Started libpod-conmon-bb0f14b939b3b2f573c24bc3348255487924b405e6b3f4dd58065315f835975f.scope.
Sep 30 18:39:38 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:39:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11aa9458f171575274670645d3982a2af53d5f27ce670deb08a387ee57f1eb55/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Sep 30 18:39:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11aa9458f171575274670645d3982a2af53d5f27ce670deb08a387ee57f1eb55/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Sep 30 18:39:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11aa9458f171575274670645d3982a2af53d5f27ce670deb08a387ee57f1eb55/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Sep 30 18:39:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11aa9458f171575274670645d3982a2af53d5f27ce670deb08a387ee57f1eb55/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Sep 30 18:39:38 compute-1 podman[295153]: 2025-09-30 18:39:38.276703835 +0000 UTC m=+0.236452286 container init bb0f14b939b3b2f573c24bc3348255487924b405e6b3f4dd58065315f835975f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=admiring_wescoff, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True)
Sep 30 18:39:38 compute-1 nova_compute[238822]: 2025-09-30 18:39:38.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:38 compute-1 podman[295153]: 2025-09-30 18:39:38.282511601 +0000 UTC m=+0.242259972 container start bb0f14b939b3b2f573c24bc3348255487924b405e6b3f4dd58065315f835975f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=admiring_wescoff, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 18:39:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:38 compute-1 podman[295153]: 2025-09-30 18:39:38.342831802 +0000 UTC m=+0.302580263 container attach bb0f14b939b3b2f573c24bc3348255487924b405e6b3f4dd58065315f835975f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=admiring_wescoff, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Sep 30 18:39:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:38.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:39:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:39:38 compute-1 ceph-mon[75484]: pgmap v1833: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 2.5 KiB/s wr, 6 op/s
Sep 30 18:39:38 compute-1 sudo[295187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:39:38 compute-1 sudo[295187]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:39:38 compute-1 sudo[295187]: pam_unix(sudo:session): session closed for user root
Sep 30 18:39:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:39:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:39:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:39.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:39:39 compute-1 nova_compute[238822]: 2025-09-30 18:39:39.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]: [
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:     {
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:         "available": false,
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:         "being_replaced": false,
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:         "ceph_device_lvm": false,
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:         "device_id": "QEMU_DVD-ROM_QM00001",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:         "lsm_data": {},
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:         "lvs": [],
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:         "path": "/dev/sr0",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:         "rejected_reasons": [
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "Has a FileSystem",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "Insufficient space (<5GB)"
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:         ],
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:         "sys_api": {
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "actuators": null,
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "device_nodes": [
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:                 "sr0"
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             ],
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "devname": "sr0",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "human_readable_size": "482.00 KB",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "id_bus": "ata",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "model": "QEMU DVD-ROM",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "nr_requests": "2",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "parent": "/dev/sr0",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "partitions": {},
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "path": "/dev/sr0",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "removable": "1",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "rev": "2.5+",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "ro": "0",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "rotational": "0",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "sas_address": "",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "sas_device_handle": "",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "scheduler_mode": "mq-deadline",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "sectors": 0,
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "sectorsize": "2048",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "size": 493568.0,
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "support_discard": "2048",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "type": "disk",
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:             "vendor": "QEMU"
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:         }
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]:     }
Sep 30 18:39:39 compute-1 admiring_wescoff[295170]: ]
Sep 30 18:39:39 compute-1 systemd[1]: libpod-bb0f14b939b3b2f573c24bc3348255487924b405e6b3f4dd58065315f835975f.scope: Deactivated successfully.
Sep 30 18:39:39 compute-1 podman[295153]: 2025-09-30 18:39:39.458349252 +0000 UTC m=+1.418097633 container died bb0f14b939b3b2f573c24bc3348255487924b405e6b3f4dd58065315f835975f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=admiring_wescoff, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325)
Sep 30 18:39:39 compute-1 systemd[1]: libpod-bb0f14b939b3b2f573c24bc3348255487924b405e6b3f4dd58065315f835975f.scope: Consumed 1.058s CPU time.
Sep 30 18:39:39 compute-1 systemd[1]: var-lib-containers-storage-overlay-11aa9458f171575274670645d3982a2af53d5f27ce670deb08a387ee57f1eb55-merged.mount: Deactivated successfully.
Sep 30 18:39:39 compute-1 podman[295153]: 2025-09-30 18:39:39.741918873 +0000 UTC m=+1.701667284 container remove bb0f14b939b3b2f573c24bc3348255487924b405e6b3f4dd58065315f835975f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=admiring_wescoff, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:39:39 compute-1 systemd[1]: libpod-conmon-bb0f14b939b3b2f573c24bc3348255487924b405e6b3f4dd58065315f835975f.scope: Deactivated successfully.
Sep 30 18:39:39 compute-1 sudo[295047]: pam_unix(sudo:session): session closed for user root
Sep 30 18:39:39 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3346492281' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:39:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:40.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:40 compute-1 ceph-mon[75484]: pgmap v1834: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 2.5 KiB/s wr, 6 op/s
Sep 30 18:39:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:39:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:39:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:39:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:39:40 compute-1 ceph-mon[75484]: pgmap v1835: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 810 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:39:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:39:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:39:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:39:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:39:40 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:39:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:41.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:41 compute-1 nova_compute[238822]: 2025-09-30 18:39:41.414 2 DEBUG oslo_concurrency.lockutils [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Acquiring lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:41 compute-1 nova_compute[238822]: 2025-09-30 18:39:41.415 2 DEBUG oslo_concurrency.lockutils [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:41 compute-1 nova_compute[238822]: 2025-09-30 18:39:41.416 2 DEBUG oslo_concurrency.lockutils [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Acquiring lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:41 compute-1 nova_compute[238822]: 2025-09-30 18:39:41.416 2 DEBUG oslo_concurrency.lockutils [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:41 compute-1 nova_compute[238822]: 2025-09-30 18:39:41.417 2 DEBUG oslo_concurrency.lockutils [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:41 compute-1 nova_compute[238822]: 2025-09-30 18:39:41.436 2 INFO nova.compute.manager [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Terminating instance
Sep 30 18:39:41 compute-1 nova_compute[238822]: 2025-09-30 18:39:41.955 2 DEBUG nova.compute.manager [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:39:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3532344853' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:39:42 compute-1 kernel: tapdf28957b-32 (unregistering): left promiscuous mode
Sep 30 18:39:42 compute-1 NetworkManager[45549]: <info>  [1759257582.0309] device (tapdf28957b-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:39:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:42 compute-1 ovn_controller[135204]: 2025-09-30T18:39:42Z|00251|binding|INFO|Releasing lport df28957b-3291-47e1-8119-50e815a35337 from this chassis (sb_readonly=0)
Sep 30 18:39:42 compute-1 ovn_controller[135204]: 2025-09-30T18:39:42Z|00252|binding|INFO|Setting lport df28957b-3291-47e1-8119-50e815a35337 down in Southbound
Sep 30 18:39:42 compute-1 ovn_controller[135204]: 2025-09-30T18:39:42Z|00253|binding|INFO|Removing iface tapdf28957b-32 ovn-installed in OVS
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.055 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:da:3f 10.100.0.5'], port_security=['fa:16:3e:3e:da:3f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '1a3876d4-5b93-40b5-96f1-ba2c2a27428b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '003b1a96324d40b683381237c3cec243', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a12466c2-b0c7-418c-b73a-38db6de1f821', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aef43a77-fc58-48dd-8195-5e83e09646ef, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=df28957b-3291-47e1-8119-50e815a35337) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.057 144543 INFO neutron.agent.ovn.metadata.agent [-] Port df28957b-3291-47e1-8119-50e815a35337 in datapath e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4 unbound from our chassis
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.059 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.091 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[9dcd1e54-048a-4681-a67f-34a6edb9e8fa]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:42 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Sep 30 18:39:42 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Consumed 16.134s CPU time.
Sep 30 18:39:42 compute-1 systemd-machined[195911]: Machine qemu-22-instance-0000001d terminated.
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.142 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[2de1c1b6-0aba-4a01-8972-d23203cad2f9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.146 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3c6ec0-aece-4dfd-a42a-2f1c6261d025]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:42 compute-1 sshd-session[296468]: Invalid user aman from 167.172.43.167 port 57642
Sep 30 18:39:42 compute-1 sshd-session[296468]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:39:42 compute-1 sshd-session[296468]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.196 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[d99a166e-bb2a-4e42-a640-b45421cd28f3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.214 2 INFO nova.virt.libvirt.driver [-] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Instance destroyed successfully.
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.215 2 DEBUG nova.objects.instance [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lazy-loading 'resources' on Instance uuid 1a3876d4-5b93-40b5-96f1-ba2c2a27428b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.221 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[16860df7-cfc5-4ec0-96be-24743c0bd1f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape214ea0f-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:e3:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1536217, 'reachable_time': 17922, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 296490, 'error': None, 'target': 'ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.245 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e62384f4-95af-4cb5-a9b2-ade7c9f56c0e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape214ea0f-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1536239, 'tstamp': 1536239}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 296495, 'error': None, 'target': 'ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape214ea0f-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1536245, 'tstamp': 1536245}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 296495, 'error': None, 'target': 'ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.248 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape214ea0f-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.257 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape214ea0f-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.257 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.258 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape214ea0f-10, col_values=(('external_ids', {'iface-id': '475706b8-809c-4da9-92ac-7152f6d17fbe'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.258 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:39:42 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:42.260 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0426f32c-eddc-407a-b807-43424aaf41ec]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:42.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.724 2 DEBUG nova.virt.libvirt.vif [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:38:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1699442290',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1699442290',id=29,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:38:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='003b1a96324d40b683381237c3cec243',ramdisk_id='',reservation_id='r-j0scsr69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-765295423',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-765295423-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:38:42Z,user_data=None,user_id='eda3e60f66494c8682f36b8a8fa20793',uuid=1a3876d4-5b93-40b5-96f1-ba2c2a27428b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df28957b-3291-47e1-8119-50e815a35337", "address": "fa:16:3e:3e:da:3f", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf28957b-32", "ovs_interfaceid": "df28957b-3291-47e1-8119-50e815a35337", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.725 2 DEBUG nova.network.os_vif_util [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Converting VIF {"id": "df28957b-3291-47e1-8119-50e815a35337", "address": "fa:16:3e:3e:da:3f", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf28957b-32", "ovs_interfaceid": "df28957b-3291-47e1-8119-50e815a35337", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.725 2 DEBUG nova.network.os_vif_util [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:da:3f,bridge_name='br-int',has_traffic_filtering=True,id=df28957b-3291-47e1-8119-50e815a35337,network=Network(e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf28957b-32') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.726 2 DEBUG os_vif [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:da:3f,bridge_name='br-int',has_traffic_filtering=True,id=df28957b-3291-47e1-8119-50e815a35337,network=Network(e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf28957b-32') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.730 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf28957b-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.735 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=38ec8e78-d2d7-4908-a3d6-b6e7c37ea2e1) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.740 2 INFO os_vif [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:da:3f,bridge_name='br-int',has_traffic_filtering=True,id=df28957b-3291-47e1-8119-50e815a35337,network=Network(e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf28957b-32')
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.772 2 DEBUG nova.compute.manager [req-01bb6424-6601-4351-83dd-be8257ae243f req-09886a02-1ff4-43b6-9508-5ae2575eae80 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Received event network-vif-unplugged-df28957b-3291-47e1-8119-50e815a35337 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.773 2 DEBUG oslo_concurrency.lockutils [req-01bb6424-6601-4351-83dd-be8257ae243f req-09886a02-1ff4-43b6-9508-5ae2575eae80 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.773 2 DEBUG oslo_concurrency.lockutils [req-01bb6424-6601-4351-83dd-be8257ae243f req-09886a02-1ff4-43b6-9508-5ae2575eae80 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.773 2 DEBUG oslo_concurrency.lockutils [req-01bb6424-6601-4351-83dd-be8257ae243f req-09886a02-1ff4-43b6-9508-5ae2575eae80 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.774 2 DEBUG nova.compute.manager [req-01bb6424-6601-4351-83dd-be8257ae243f req-09886a02-1ff4-43b6-9508-5ae2575eae80 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] No waiting events found dispatching network-vif-unplugged-df28957b-3291-47e1-8119-50e815a35337 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:39:42 compute-1 nova_compute[238822]: 2025-09-30 18:39:42.774 2 DEBUG nova.compute.manager [req-01bb6424-6601-4351-83dd-be8257ae243f req-09886a02-1ff4-43b6-9508-5ae2575eae80 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Received event network-vif-unplugged-df28957b-3291-47e1-8119-50e815a35337 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:39:42 compute-1 ceph-mon[75484]: pgmap v1836: 353 pgs: 353 active+clean; 200 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 810 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:39:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:43 compute-1 nova_compute[238822]: 2025-09-30 18:39:43.187 2 INFO nova.virt.libvirt.driver [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Deleting instance files /var/lib/nova/instances/1a3876d4-5b93-40b5-96f1-ba2c2a27428b_del
Sep 30 18:39:43 compute-1 nova_compute[238822]: 2025-09-30 18:39:43.188 2 INFO nova.virt.libvirt.driver [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Deletion of /var/lib/nova/instances/1a3876d4-5b93-40b5-96f1-ba2c2a27428b_del complete
Sep 30 18:39:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:43.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:43 compute-1 podman[296517]: 2025-09-30 18:39:43.560835097 +0000 UTC m=+0.094297765 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:39:43 compute-1 podman[296519]: 2025-09-30 18:39:43.570055005 +0000 UTC m=+0.096977797 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:39:43 compute-1 podman[296518]: 2025-09-30 18:39:43.589916099 +0000 UTC m=+0.124585190 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:39:43 compute-1 nova_compute[238822]: 2025-09-30 18:39:43.707 2 INFO nova.compute.manager [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Took 1.75 seconds to destroy the instance on the hypervisor.
Sep 30 18:39:43 compute-1 nova_compute[238822]: 2025-09-30 18:39:43.708 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:39:43 compute-1 nova_compute[238822]: 2025-09-30 18:39:43.708 2 DEBUG nova.compute.manager [-] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:39:43 compute-1 nova_compute[238822]: 2025-09-30 18:39:43.708 2 DEBUG nova.network.neutron [-] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:39:43 compute-1 nova_compute[238822]: 2025-09-30 18:39:43.709 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:39:44 compute-1 sshd-session[296468]: Failed password for invalid user aman from 167.172.43.167 port 57642 ssh2
Sep 30 18:39:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:44.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:44 compute-1 nova_compute[238822]: 2025-09-30 18:39:44.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:44 compute-1 sshd-session[296468]: Received disconnect from 167.172.43.167 port 57642:11: Bye Bye [preauth]
Sep 30 18:39:44 compute-1 sshd-session[296468]: Disconnected from invalid user aman 167.172.43.167 port 57642 [preauth]
Sep 30 18:39:44 compute-1 nova_compute[238822]: 2025-09-30 18:39:44.656 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:44 compute-1 nova_compute[238822]: 2025-09-30 18:39:44.818 2 DEBUG nova.compute.manager [req-371a597b-94f6-4c46-9079-49bebe72c9a0 req-e89a0615-4572-4018-821d-647d585c4817 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Received event network-vif-unplugged-df28957b-3291-47e1-8119-50e815a35337 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:39:44 compute-1 nova_compute[238822]: 2025-09-30 18:39:44.818 2 DEBUG oslo_concurrency.lockutils [req-371a597b-94f6-4c46-9079-49bebe72c9a0 req-e89a0615-4572-4018-821d-647d585c4817 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:44 compute-1 nova_compute[238822]: 2025-09-30 18:39:44.818 2 DEBUG oslo_concurrency.lockutils [req-371a597b-94f6-4c46-9079-49bebe72c9a0 req-e89a0615-4572-4018-821d-647d585c4817 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:44 compute-1 nova_compute[238822]: 2025-09-30 18:39:44.818 2 DEBUG oslo_concurrency.lockutils [req-371a597b-94f6-4c46-9079-49bebe72c9a0 req-e89a0615-4572-4018-821d-647d585c4817 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:44 compute-1 nova_compute[238822]: 2025-09-30 18:39:44.818 2 DEBUG nova.compute.manager [req-371a597b-94f6-4c46-9079-49bebe72c9a0 req-e89a0615-4572-4018-821d-647d585c4817 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] No waiting events found dispatching network-vif-unplugged-df28957b-3291-47e1-8119-50e815a35337 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:39:44 compute-1 nova_compute[238822]: 2025-09-30 18:39:44.819 2 DEBUG nova.compute.manager [req-371a597b-94f6-4c46-9079-49bebe72c9a0 req-e89a0615-4572-4018-821d-647d585c4817 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Received event network-vif-unplugged-df28957b-3291-47e1-8119-50e815a35337 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:39:45 compute-1 sudo[296575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:39:45 compute-1 sudo[296575]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:39:45 compute-1 ceph-mon[75484]: pgmap v1837: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 1.4 KiB/s wr, 33 op/s
Sep 30 18:39:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:39:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:39:45 compute-1 sudo[296575]: pam_unix(sudo:session): session closed for user root
Sep 30 18:39:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:39:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:45.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:39:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:46.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:46 compute-1 nova_compute[238822]: 2025-09-30 18:39:46.732 2 DEBUG nova.compute.manager [req-3ff322b2-fb8f-45b7-8160-0077bb571d86 req-65b1b4d1-56e0-47c0-8829-801ebcad3e85 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Received event network-vif-deleted-df28957b-3291-47e1-8119-50e815a35337 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:39:46 compute-1 nova_compute[238822]: 2025-09-30 18:39:46.732 2 INFO nova.compute.manager [req-3ff322b2-fb8f-45b7-8160-0077bb571d86 req-65b1b4d1-56e0-47c0-8829-801ebcad3e85 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Neutron deleted interface df28957b-3291-47e1-8119-50e815a35337; detaching it from the instance and deleting it from the info cache
Sep 30 18:39:46 compute-1 nova_compute[238822]: 2025-09-30 18:39:46.733 2 DEBUG nova.network.neutron [req-3ff322b2-fb8f-45b7-8160-0077bb571d86 req-65b1b4d1-56e0-47c0-8829-801ebcad3e85 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:39:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:47 compute-1 ceph-mon[75484]: pgmap v1838: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 1.4 KiB/s wr, 33 op/s
Sep 30 18:39:47 compute-1 nova_compute[238822]: 2025-09-30 18:39:47.166 2 DEBUG nova.network.neutron [-] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:39:47 compute-1 nova_compute[238822]: 2025-09-30 18:39:47.240 2 DEBUG nova.compute.manager [req-3ff322b2-fb8f-45b7-8160-0077bb571d86 req-65b1b4d1-56e0-47c0-8829-801ebcad3e85 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Detach interface failed, port_id=df28957b-3291-47e1-8119-50e815a35337, reason: Instance 1a3876d4-5b93-40b5-96f1-ba2c2a27428b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:39:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:47.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:47 compute-1 nova_compute[238822]: 2025-09-30 18:39:47.675 2 INFO nova.compute.manager [-] [instance: 1a3876d4-5b93-40b5-96f1-ba2c2a27428b] Took 3.97 seconds to deallocate network for instance.
Sep 30 18:39:47 compute-1 nova_compute[238822]: 2025-09-30 18:39:47.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:48 compute-1 nova_compute[238822]: 2025-09-30 18:39:48.198 2 DEBUG oslo_concurrency.lockutils [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:48 compute-1 nova_compute[238822]: 2025-09-30 18:39:48.199 2 DEBUG oslo_concurrency.lockutils [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:48 compute-1 nova_compute[238822]: 2025-09-30 18:39:48.270 2 DEBUG oslo_concurrency.processutils [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:39:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:48.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:39:48 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/434357496' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:39:48 compute-1 nova_compute[238822]: 2025-09-30 18:39:48.706 2 DEBUG oslo_concurrency.processutils [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:39:48 compute-1 nova_compute[238822]: 2025-09-30 18:39:48.717 2 DEBUG nova.compute.provider_tree [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:39:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:49 compute-1 ceph-mon[75484]: pgmap v1839: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 1.4 KiB/s wr, 33 op/s
Sep 30 18:39:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/434357496' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:39:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:39:49 compute-1 nova_compute[238822]: 2025-09-30 18:39:49.229 2 DEBUG nova.scheduler.client.report [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:39:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:39:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:49.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:39:49 compute-1 nova_compute[238822]: 2025-09-30 18:39:49.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:49 compute-1 openstack_network_exporter[251957]: ERROR   18:39:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:39:49 compute-1 openstack_network_exporter[251957]: ERROR   18:39:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:39:49 compute-1 openstack_network_exporter[251957]: ERROR   18:39:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:39:49 compute-1 openstack_network_exporter[251957]: ERROR   18:39:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:39:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:39:49 compute-1 openstack_network_exporter[251957]: ERROR   18:39:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:39:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:39:49 compute-1 nova_compute[238822]: 2025-09-30 18:39:49.741 2 DEBUG oslo_concurrency.lockutils [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.542s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:49 compute-1 nova_compute[238822]: 2025-09-30 18:39:49.771 2 INFO nova.scheduler.client.report [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Deleted allocations for instance 1a3876d4-5b93-40b5-96f1-ba2c2a27428b
Sep 30 18:39:49 compute-1 nova_compute[238822]: 2025-09-30 18:39:49.981 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:39:49 compute-1 nova_compute[238822]: 2025-09-30 18:39:49.981 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:39:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:50 compute-1 nova_compute[238822]: 2025-09-30 18:39:50.052 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:39:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:50.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:50 compute-1 nova_compute[238822]: 2025-09-30 18:39:50.568 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:39:50 compute-1 nova_compute[238822]: 2025-09-30 18:39:50.838 2 DEBUG oslo_concurrency.lockutils [None req-c264d336-c8e1-4092-abe6-aa146fbec24b eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "1a3876d4-5b93-40b5-96f1-ba2c2a27428b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.423s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:51 compute-1 nova_compute[238822]: 2025-09-30 18:39:51.081 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:51 compute-1 nova_compute[238822]: 2025-09-30 18:39:51.081 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:51 compute-1 nova_compute[238822]: 2025-09-30 18:39:51.082 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:51 compute-1 nova_compute[238822]: 2025-09-30 18:39:51.082 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:39:51 compute-1 nova_compute[238822]: 2025-09-30 18:39:51.083 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:39:51 compute-1 ceph-mon[75484]: pgmap v1840: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 1.4 KiB/s wr, 33 op/s
Sep 30 18:39:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:39:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:51.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:39:51 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:39:51 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/217708246' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:39:51 compute-1 nova_compute[238822]: 2025-09-30 18:39:51.556 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:39:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/217708246' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:39:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/351520998' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:39:52 compute-1 unix_chkpwd[296654]: password check failed for user (root)
Sep 30 18:39:52 compute-1 sshd-session[296627]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:39:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:52.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.609 2 DEBUG oslo_concurrency.lockutils [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Acquiring lock "4f91975d-d44b-46af-9879-dbf7a693fbd2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.611 2 DEBUG oslo_concurrency.lockutils [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "4f91975d-d44b-46af-9879-dbf7a693fbd2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.612 2 DEBUG oslo_concurrency.lockutils [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Acquiring lock "4f91975d-d44b-46af-9879-dbf7a693fbd2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.612 2 DEBUG oslo_concurrency.lockutils [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "4f91975d-d44b-46af-9879-dbf7a693fbd2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.612 2 DEBUG oslo_concurrency.lockutils [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "4f91975d-d44b-46af-9879-dbf7a693fbd2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.621 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000001c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.621 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000001c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.631 2 INFO nova.compute.manager [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Terminating instance
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.879 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.881 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.917 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.918 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4469MB free_disk=39.94663619995117GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.919 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:52 compute-1 nova_compute[238822]: 2025-09-30 18:39:52.919 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:53 compute-1 ceph-mon[75484]: pgmap v1841: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:39:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.153 2 DEBUG nova.compute.manager [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:39:53 compute-1 kernel: tap6e0ab88d-97 (unregistering): left promiscuous mode
Sep 30 18:39:53 compute-1 NetworkManager[45549]: <info>  [1759257593.2328] device (tap6e0ab88d-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:39:53 compute-1 ovn_controller[135204]: 2025-09-30T18:39:53Z|00254|binding|INFO|Releasing lport 6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e from this chassis (sb_readonly=0)
Sep 30 18:39:53 compute-1 ovn_controller[135204]: 2025-09-30T18:39:53Z|00255|binding|INFO|Setting lport 6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e down in Southbound
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:53 compute-1 ovn_controller[135204]: 2025-09-30T18:39:53Z|00256|binding|INFO|Removing iface tap6e0ab88d-97 ovn-installed in OVS
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.258 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:be:81 10.100.0.11'], port_security=['fa:16:3e:6c:be:81 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4f91975d-d44b-46af-9879-dbf7a693fbd2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '003b1a96324d40b683381237c3cec243', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'a12466c2-b0c7-418c-b73a-38db6de1f821', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aef43a77-fc58-48dd-8195-5e83e09646ef, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.260 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e in datapath e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4 unbound from our chassis
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.262 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.267 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3371d0a9-3a73-4bdd-bb9b-ca8a8af78535]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.268 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4 namespace which is not needed anymore
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:53.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:53 compute-1 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Sep 30 18:39:53 compute-1 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001c.scope: Consumed 2.596s CPU time.
Sep 30 18:39:53 compute-1 systemd-machined[195911]: Machine qemu-23-instance-0000001c terminated.
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.395 2 INFO nova.virt.libvirt.driver [-] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Instance destroyed successfully.
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.396 2 DEBUG nova.objects.instance [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lazy-loading 'resources' on Instance uuid 4f91975d-d44b-46af-9879-dbf7a693fbd2 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:39:53 compute-1 neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4[294372]: [NOTICE]   (294400) : haproxy version is 3.0.5-8e879a5
Sep 30 18:39:53 compute-1 podman[296687]: 2025-09-30 18:39:53.447093753 +0000 UTC m=+0.042232326 container kill bf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:39:53 compute-1 neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4[294372]: [NOTICE]   (294400) : path to executable is /usr/sbin/haproxy
Sep 30 18:39:53 compute-1 neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4[294372]: [WARNING]  (294400) : Exiting Master process...
Sep 30 18:39:53 compute-1 neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4[294372]: [ALERT]    (294400) : Current worker (294402) exited with code 143 (Terminated)
Sep 30 18:39:53 compute-1 neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4[294372]: [WARNING]  (294400) : All workers exited. Exiting... (0)
Sep 30 18:39:53 compute-1 systemd[1]: libpod-bf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9.scope: Deactivated successfully.
Sep 30 18:39:53 compute-1 podman[296708]: 2025-09-30 18:39:53.511807832 +0000 UTC m=+0.039639256 container died bf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 18:39:53 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9-userdata-shm.mount: Deactivated successfully.
Sep 30 18:39:53 compute-1 systemd[1]: var-lib-containers-storage-overlay-566643a63bb6527a5838286a3760d66bf2c984f944566c5419edf508ee9de7c4-merged.mount: Deactivated successfully.
Sep 30 18:39:53 compute-1 podman[296708]: 2025-09-30 18:39:53.564571801 +0000 UTC m=+0.092403155 container cleanup bf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 18:39:53 compute-1 systemd[1]: libpod-conmon-bf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9.scope: Deactivated successfully.
Sep 30 18:39:53 compute-1 podman[296710]: 2025-09-30 18:39:53.591245027 +0000 UTC m=+0.107057428 container remove bf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.600 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[36d75089-5d31-47e0-bc5d-ed9db1dfcc66]: (4, ("Tue Sep 30 06:39:53 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4 (bf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9)\nbf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9\nTue Sep 30 06:39:53 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4 (bf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9)\nbf27ef620dce0421708541714022e3df1dedc9ee81f4d4875bfd6395b4eac3d9\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.601 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[174c731d-9a76-4d72-95ad-b452ccf20db9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.602 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.602 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[33cfe828-c849-4e1b-a1ed-da6283f0f583]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.603 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape214ea0f-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:53 compute-1 kernel: tape214ea0f-10: left promiscuous mode
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.625 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[895d918a-60ad-4aa7-93a0-bcf5dd8ab669]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.664 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[fd579fc2-9268-4ae7-af06-f36f6e37a737]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.665 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[fd65db9b-57c7-4b04-98e8-ac4d7d518cff]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.685 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0964a536-7f93-4497-b881-70be0eddd036]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1536205, 'reachable_time': 35885, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 296743, 'error': None, 'target': 'ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.691 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:39:53 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:53.692 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[ab957819-9daf-4f07-aefb-5db3601f0820]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:39:53 compute-1 systemd[1]: run-netns-ovnmeta\x2de214ea0f\x2d1cc7\x2d4ff8\x2dad9d\x2d642bd3724cf4.mount: Deactivated successfully.
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.826 2 DEBUG nova.compute.manager [req-1d67f92b-25dc-4c5a-a304-50cc86a18391 req-d7fe7dbf-05d2-4b0c-a294-38dbee56af6c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Received event network-vif-unplugged-6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.826 2 DEBUG oslo_concurrency.lockutils [req-1d67f92b-25dc-4c5a-a304-50cc86a18391 req-d7fe7dbf-05d2-4b0c-a294-38dbee56af6c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "4f91975d-d44b-46af-9879-dbf7a693fbd2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.827 2 DEBUG oslo_concurrency.lockutils [req-1d67f92b-25dc-4c5a-a304-50cc86a18391 req-d7fe7dbf-05d2-4b0c-a294-38dbee56af6c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "4f91975d-d44b-46af-9879-dbf7a693fbd2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.827 2 DEBUG oslo_concurrency.lockutils [req-1d67f92b-25dc-4c5a-a304-50cc86a18391 req-d7fe7dbf-05d2-4b0c-a294-38dbee56af6c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "4f91975d-d44b-46af-9879-dbf7a693fbd2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.828 2 DEBUG nova.compute.manager [req-1d67f92b-25dc-4c5a-a304-50cc86a18391 req-d7fe7dbf-05d2-4b0c-a294-38dbee56af6c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] No waiting events found dispatching network-vif-unplugged-6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.828 2 DEBUG nova.compute.manager [req-1d67f92b-25dc-4c5a-a304-50cc86a18391 req-d7fe7dbf-05d2-4b0c-a294-38dbee56af6c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Received event network-vif-unplugged-6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.904 2 DEBUG nova.virt.libvirt.vif [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:38:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-965053835',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-965053835',id=28,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:38:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='003b1a96324d40b683381237c3cec243',ramdisk_id='',reservation_id='r-x8rgjbtk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,manager,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_m
odel='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-765295423',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-765295423-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:39:36Z,user_data=None,user_id='eda3e60f66494c8682f36b8a8fa20793',uuid=4f91975d-d44b-46af-9879-dbf7a693fbd2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e", "address": "fa:16:3e:6c:be:81", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e0ab88d-97", "ovs_interfaceid": "6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.904 2 DEBUG nova.network.os_vif_util [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Converting VIF {"id": "6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e", "address": "fa:16:3e:6c:be:81", "network": {"id": "e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1931818952-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9692f6197b3545b1bf37bd84c3928d41", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e0ab88d-97", "ovs_interfaceid": "6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.905 2 DEBUG nova.network.os_vif_util [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6c:be:81,bridge_name='br-int',has_traffic_filtering=True,id=6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e,network=Network(e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e0ab88d-97') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.906 2 DEBUG os_vif [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:be:81,bridge_name='br-int',has_traffic_filtering=True,id=6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e,network=Network(e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e0ab88d-97') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.909 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e0ab88d-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.915 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d06d406d-5400-4ff7-9e01-f6d074570a86) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:53 compute-1 nova_compute[238822]: 2025-09-30 18:39:53.921 2 INFO os_vif [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:be:81,bridge_name='br-int',has_traffic_filtering=True,id=6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e,network=Network(e214ea0f-1cc7-4ff8-ad9d-642bd3724cf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e0ab88d-97')
Sep 30 18:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:39:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2408308363' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:39:54 compute-1 sshd-session[296627]: Failed password for root from 192.210.160.141 port 41230 ssh2
Sep 30 18:39:54 compute-1 nova_compute[238822]: 2025-09-30 18:39:54.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:54.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:54.411 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:54.411 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:39:54.412 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:54 compute-1 nova_compute[238822]: 2025-09-30 18:39:54.432 2 INFO nova.virt.libvirt.driver [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Deleting instance files /var/lib/nova/instances/4f91975d-d44b-46af-9879-dbf7a693fbd2_del
Sep 30 18:39:54 compute-1 nova_compute[238822]: 2025-09-30 18:39:54.433 2 INFO nova.virt.libvirt.driver [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Deletion of /var/lib/nova/instances/4f91975d-d44b-46af-9879-dbf7a693fbd2_del complete
Sep 30 18:39:54 compute-1 nova_compute[238822]: 2025-09-30 18:39:54.471 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 4f91975d-d44b-46af-9879-dbf7a693fbd2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:39:54 compute-1 nova_compute[238822]: 2025-09-30 18:39:54.471 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:39:54 compute-1 nova_compute[238822]: 2025-09-30 18:39:54.471 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:39:52 up  4:17,  0 user,  load average: 0.68, 0.56, 0.59\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_003b1a96324d40b683381237c3cec243': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:39:54 compute-1 nova_compute[238822]: 2025-09-30 18:39:54.523 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:39:54 compute-1 nova_compute[238822]: 2025-09-30 18:39:54.947 2 INFO nova.compute.manager [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Took 1.79 seconds to destroy the instance on the hypervisor.
Sep 30 18:39:54 compute-1 nova_compute[238822]: 2025-09-30 18:39:54.948 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:39:54 compute-1 nova_compute[238822]: 2025-09-30 18:39:54.949 2 DEBUG nova.compute.manager [-] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:39:54 compute-1 nova_compute[238822]: 2025-09-30 18:39:54.949 2 DEBUG nova.network.neutron [-] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:39:54 compute-1 nova_compute[238822]: 2025-09-30 18:39:54.950 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:39:54 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2147569816' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:39:55 compute-1 nova_compute[238822]: 2025-09-30 18:39:55.008 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:39:55 compute-1 nova_compute[238822]: 2025-09-30 18:39:55.015 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:39:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:55 compute-1 ceph-mon[75484]: pgmap v1842: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:39:55 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2147569816' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:39:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:55.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:55 compute-1 sshd-session[296627]: Connection closed by authenticating user root 192.210.160.141 port 41230 [preauth]
Sep 30 18:39:55 compute-1 nova_compute[238822]: 2025-09-30 18:39:55.525 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:39:55 compute-1 nova_compute[238822]: 2025-09-30 18:39:55.664 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:39:55 compute-1 nova_compute[238822]: 2025-09-30 18:39:55.900 2 DEBUG nova.compute.manager [req-3f02bc5b-740d-4bcc-9396-9cfc0f40a5b7 req-aad94e78-994f-4da8-a8f2-4116c6280d26 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Received event network-vif-unplugged-6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:39:55 compute-1 nova_compute[238822]: 2025-09-30 18:39:55.900 2 DEBUG oslo_concurrency.lockutils [req-3f02bc5b-740d-4bcc-9396-9cfc0f40a5b7 req-aad94e78-994f-4da8-a8f2-4116c6280d26 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "4f91975d-d44b-46af-9879-dbf7a693fbd2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:55 compute-1 nova_compute[238822]: 2025-09-30 18:39:55.900 2 DEBUG oslo_concurrency.lockutils [req-3f02bc5b-740d-4bcc-9396-9cfc0f40a5b7 req-aad94e78-994f-4da8-a8f2-4116c6280d26 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "4f91975d-d44b-46af-9879-dbf7a693fbd2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:55 compute-1 nova_compute[238822]: 2025-09-30 18:39:55.901 2 DEBUG oslo_concurrency.lockutils [req-3f02bc5b-740d-4bcc-9396-9cfc0f40a5b7 req-aad94e78-994f-4da8-a8f2-4116c6280d26 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "4f91975d-d44b-46af-9879-dbf7a693fbd2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:55 compute-1 nova_compute[238822]: 2025-09-30 18:39:55.901 2 DEBUG nova.compute.manager [req-3f02bc5b-740d-4bcc-9396-9cfc0f40a5b7 req-aad94e78-994f-4da8-a8f2-4116c6280d26 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] No waiting events found dispatching network-vif-unplugged-6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:39:55 compute-1 nova_compute[238822]: 2025-09-30 18:39:55.901 2 DEBUG nova.compute.manager [req-3f02bc5b-740d-4bcc-9396-9cfc0f40a5b7 req-aad94e78-994f-4da8-a8f2-4116c6280d26 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Received event network-vif-unplugged-6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:39:56 compute-1 nova_compute[238822]: 2025-09-30 18:39:56.039 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:39:56 compute-1 nova_compute[238822]: 2025-09-30 18:39:56.039 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:39:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:56.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:39:56 compute-1 nova_compute[238822]: 2025-09-30 18:39:56.491 2 DEBUG nova.network.neutron [-] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:39:56 compute-1 nova_compute[238822]: 2025-09-30 18:39:56.528 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:39:56 compute-1 nova_compute[238822]: 2025-09-30 18:39:56.528 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:39:56 compute-1 nova_compute[238822]: 2025-09-30 18:39:56.528 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:39:56 compute-1 nova_compute[238822]: 2025-09-30 18:39:56.998 2 INFO nova.compute.manager [-] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Took 2.05 seconds to deallocate network for instance.
Sep 30 18:39:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:57 compute-1 nova_compute[238822]: 2025-09-30 18:39:57.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:39:57 compute-1 ceph-mon[75484]: pgmap v1843: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:39:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:57.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:57 compute-1 nova_compute[238822]: 2025-09-30 18:39:57.530 2 DEBUG oslo_concurrency.lockutils [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:39:57 compute-1 nova_compute[238822]: 2025-09-30 18:39:57.531 2 DEBUG oslo_concurrency.lockutils [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:39:57 compute-1 nova_compute[238822]: 2025-09-30 18:39:57.579 2 DEBUG oslo_concurrency.processutils [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:39:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:39:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2633899959' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:39:57 compute-1 nova_compute[238822]: 2025-09-30 18:39:57.986 2 DEBUG nova.compute.manager [req-3365266e-2b13-4898-9a44-d86db0bdba3f req-0e96257f-b4f4-40ff-ba87-8c62ba7f8d72 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 4f91975d-d44b-46af-9879-dbf7a693fbd2] Received event network-vif-deleted-6e0ab88d-97b3-4ce8-9f18-d46c9c2fc74e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:39:58 compute-1 nova_compute[238822]: 2025-09-30 18:39:58.011 2 DEBUG oslo_concurrency.processutils [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:39:58 compute-1 nova_compute[238822]: 2025-09-30 18:39:58.019 2 DEBUG nova.compute.provider_tree [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:39:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/525515443' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:39:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/525515443' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:39:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2633899959' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:39:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:39:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:39:58.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:39:58 compute-1 nova_compute[238822]: 2025-09-30 18:39:58.532 2 DEBUG nova.scheduler.client.report [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:39:58 compute-1 nova_compute[238822]: 2025-09-30 18:39:58.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:39:59 compute-1 nova_compute[238822]: 2025-09-30 18:39:59.044 2 DEBUG oslo_concurrency.lockutils [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.513s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:39:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:39:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:39:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:39:59 compute-1 sudo[296814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:39:59 compute-1 sudo[296814]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:39:59 compute-1 nova_compute[238822]: 2025-09-30 18:39:59.069 2 INFO nova.scheduler.client.report [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Deleted allocations for instance 4f91975d-d44b-46af-9879-dbf7a693fbd2
Sep 30 18:39:59 compute-1 sudo[296814]: pam_unix(sudo:session): session closed for user root
Sep 30 18:39:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:39:59 compute-1 ceph-mon[75484]: pgmap v1844: 353 pgs: 353 active+clean; 121 MiB data, 376 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:39:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:39:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:39:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:39:59.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:39:59 compute-1 nova_compute[238822]: 2025-09-30 18:39:59.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:00 compute-1 nova_compute[238822]: 2025-09-30 18:40:00.138 2 DEBUG oslo_concurrency.lockutils [None req-32941ac7-42a1-4d31-8f62-228b7a4630b4 eda3e60f66494c8682f36b8a8fa20793 003b1a96324d40b683381237c3cec243 - - default default] Lock "4f91975d-d44b-46af-9879-dbf7a693fbd2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.527s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:40:00 compute-1 ceph-mon[75484]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 1 failed cephadm daemon(s)
Sep 30 18:40:00 compute-1 ceph-mon[75484]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Sep 30 18:40:00 compute-1 ceph-mon[75484]:      osd.1 observed slow operation indications in BlueStore
Sep 30 18:40:00 compute-1 ceph-mon[75484]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
Sep 30 18:40:00 compute-1 ceph-mon[75484]:     daemon nfs.cephfs.0.0.compute-1.bsnzkg on compute-1 is in error state
Sep 30 18:40:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:40:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:00.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:40:00 compute-1 podman[296841]: 2025-09-30 18:40:00.552199855 +0000 UTC m=+0.085053966 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:40:00 compute-1 podman[296840]: 2025-09-30 18:40:00.680144074 +0000 UTC m=+0.220593389 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:40:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:01 compute-1 nova_compute[238822]: 2025-09-30 18:40:01.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:40:01 compute-1 nova_compute[238822]: 2025-09-30 18:40:01.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:40:01 compute-1 ceph-mon[75484]: pgmap v1845: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:40:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:01.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:02.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:03 compute-1 ceph-mon[75484]: pgmap v1846: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:40:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:03.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:03 compute-1 nova_compute[238822]: 2025-09-30 18:40:03.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:04 compute-1 nova_compute[238822]: 2025-09-30 18:40:04.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:40:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:40:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:04.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:04 compute-1 nova_compute[238822]: 2025-09-30 18:40:04.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:04 compute-1 nova_compute[238822]: 2025-09-30 18:40:04.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:05 compute-1 ceph-mon[75484]: pgmap v1847: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:40:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:05.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:05 compute-1 podman[296895]: 2025-09-30 18:40:05.559364894 +0000 UTC m=+0.097652335 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:40:05 compute-1 podman[249638]: time="2025-09-30T18:40:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:40:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:40:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:40:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:40:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8360 "" "Go-http-client/1.1"
Sep 30 18:40:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:06.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:07.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:07 compute-1 ceph-mon[75484]: pgmap v1848: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:40:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:40:07 compute-1 nova_compute[238822]: 2025-09-30 18:40:07.569 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:40:07 compute-1 nova_compute[238822]: 2025-09-30 18:40:07.569 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 18:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:08.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:08 compute-1 nova_compute[238822]: 2025-09-30 18:40:08.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:40:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:40:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:09.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:40:09 compute-1 nova_compute[238822]: 2025-09-30 18:40:09.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:09 compute-1 ceph-mon[75484]: pgmap v1849: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:40:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:10.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:11.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:11 compute-1 radosgw[84864]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Sep 30 18:40:11 compute-1 ceph-mon[75484]: pgmap v1850: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:40:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:12.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:13.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:13 compute-1 ceph-mon[75484]: pgmap v1851: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:40:13 compute-1 nova_compute[238822]: 2025-09-30 18:40:13.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:40:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:14.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:14 compute-1 nova_compute[238822]: 2025-09-30 18:40:14.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:14 compute-1 sshd-session[296923]: Invalid user pzserver from 8.243.64.201 port 38734
Sep 30 18:40:14 compute-1 sshd-session[296923]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:40:14 compute-1 sshd-session[296923]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:40:14 compute-1 nova_compute[238822]: 2025-09-30 18:40:14.564 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:40:14 compute-1 nova_compute[238822]: 2025-09-30 18:40:14.565 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 18:40:14 compute-1 podman[296928]: 2025-09-30 18:40:14.589755578 +0000 UTC m=+0.088074748 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0)
Sep 30 18:40:14 compute-1 podman[296927]: 2025-09-30 18:40:14.589953944 +0000 UTC m=+0.092847887 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_id=edpm)
Sep 30 18:40:14 compute-1 podman[296926]: 2025-09-30 18:40:14.608735118 +0000 UTC m=+0.119964465 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:40:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:15 compute-1 nova_compute[238822]: 2025-09-30 18:40:15.071 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 18:40:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:15.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:15 compute-1 ceph-mon[75484]: pgmap v1852: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 45 KiB/s rd, 0 B/s wr, 75 op/s
Sep 30 18:40:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:16.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:16 compute-1 sshd-session[296923]: Failed password for invalid user pzserver from 8.243.64.201 port 38734 ssh2
Sep 30 18:40:16 compute-1 sshd-session[296923]: Received disconnect from 8.243.64.201 port 38734:11: Bye Bye [preauth]
Sep 30 18:40:16 compute-1 sshd-session[296923]: Disconnected from invalid user pzserver 8.243.64.201 port 38734 [preauth]
Sep 30 18:40:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:16.722 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:bc:cf 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9029a2856de43388bcee1a38d165449', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55e305e6-0f4d-40bc-a70b-ac91f882ec57, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d2e69f29-6b3a-46dc-9ed7-12031e1b7d2b) old=Port_Binding(mac=['fa:16:3e:48:bc:cf'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9029a2856de43388bcee1a38d165449', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:40:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:16.724 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d2e69f29-6b3a-46dc-9ed7-12031e1b7d2b in datapath c8484b9b-b34e-4c32-b987-029d8fcb2a28 updated
Sep 30 18:40:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:16.724 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c8484b9b-b34e-4c32-b987-029d8fcb2a28, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:40:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:16.725 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2489f2b6-77e2-4f04-948b-197062ed1506]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:40:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:17.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:17 compute-1 ceph-mon[75484]: pgmap v1853: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 45 KiB/s rd, 0 B/s wr, 74 op/s
Sep 30 18:40:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:18.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:18 compute-1 sshd-session[296989]: Invalid user ghost from 192.210.160.141 port 59140
Sep 30 18:40:18 compute-1 sshd-session[296989]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:40:18 compute-1 sshd-session[296989]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:40:18 compute-1 nova_compute[238822]: 2025-09-30 18:40:18.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:40:19 compute-1 sudo[296993]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:40:19 compute-1 sudo[296993]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:40:19 compute-1 sudo[296993]: pam_unix(sudo:session): session closed for user root
Sep 30 18:40:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:19.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:19 compute-1 openstack_network_exporter[251957]: ERROR   18:40:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:40:19 compute-1 openstack_network_exporter[251957]: ERROR   18:40:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:40:19 compute-1 openstack_network_exporter[251957]: ERROR   18:40:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:40:19 compute-1 openstack_network_exporter[251957]: ERROR   18:40:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:40:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:40:19 compute-1 openstack_network_exporter[251957]: ERROR   18:40:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:40:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:40:19 compute-1 nova_compute[238822]: 2025-09-30 18:40:19.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:19 compute-1 ceph-mon[75484]: pgmap v1854: 353 pgs: 353 active+clean; 41 MiB data, 329 MiB used, 40 GiB / 40 GiB avail; 45 KiB/s rd, 0 B/s wr, 74 op/s
Sep 30 18:40:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:20.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:21 compute-1 sshd-session[296989]: Failed password for invalid user ghost from 192.210.160.141 port 59140 ssh2
Sep 30 18:40:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:21.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:21 compute-1 ceph-mon[75484]: pgmap v1855: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 88 KiB/s rd, 0 B/s wr, 146 op/s
Sep 30 18:40:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:22.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:40:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:23 compute-1 sshd-session[296989]: Connection closed by invalid user ghost 192.210.160.141 port 59140 [preauth]
Sep 30 18:40:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:40:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:23.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:40:23 compute-1 ceph-mon[75484]: pgmap v1856: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 88 KiB/s rd, 0 B/s wr, 145 op/s
Sep 30 18:40:23 compute-1 nova_compute[238822]: 2025-09-30 18:40:23.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:40:24 compute-1 nova_compute[238822]: 2025-09-30 18:40:24.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:24.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:24 compute-1 nova_compute[238822]: 2025-09-30 18:40:24.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:24.919 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:40:24 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:24.921 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:40:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:25.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:25 compute-1 ceph-mon[75484]: pgmap v1857: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 88 KiB/s rd, 0 B/s wr, 146 op/s
Sep 30 18:40:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:26.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:26 compute-1 ceph-mon[75484]: pgmap v1858: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 43 KiB/s rd, 0 B/s wr, 71 op/s
Sep 30 18:40:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:26.976 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:1c:b1 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-eb4c3e7d-3f25-4a36-a408-04f4c58c1e3f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb4c3e7d-3f25-4a36-a408-04f4c58c1e3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63c45bef63ef4b9f895b3bab865e1a84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7a6956e-b7b4-4d5d-bbf4-3b65f9efc0ab, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6e704049-dbf5-46c8-b5c8-bf30b26c3ece) old=Port_Binding(mac=['fa:16:3e:c5:1c:b1'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-eb4c3e7d-3f25-4a36-a408-04f4c58c1e3f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb4c3e7d-3f25-4a36-a408-04f4c58c1e3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63c45bef63ef4b9f895b3bab865e1a84', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:40:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:26.977 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6e704049-dbf5-46c8-b5c8-bf30b26c3ece in datapath eb4c3e7d-3f25-4a36-a408-04f4c58c1e3f updated
Sep 30 18:40:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:26.978 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eb4c3e7d-3f25-4a36-a408-04f4c58c1e3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:40:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:26.980 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e0b1d452-047c-4844-9f22-71a2a60cac9a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:40:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:27.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:28.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:28 compute-1 nova_compute[238822]: 2025-09-30 18:40:28.851 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:40:28 compute-1 nova_compute[238822]: 2025-09-30 18:40:28.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:40:29 compute-1 ceph-mon[75484]: pgmap v1859: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 43 KiB/s rd, 0 B/s wr, 71 op/s
Sep 30 18:40:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:29.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:29 compute-1 nova_compute[238822]: 2025-09-30 18:40:29.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:40:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:30.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:40:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:31 compute-1 ceph-mon[75484]: pgmap v1860: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 43 KiB/s rd, 0 B/s wr, 71 op/s
Sep 30 18:40:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:31.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:31 compute-1 podman[297033]: 2025-09-30 18:40:31.58078504 +0000 UTC m=+0.115798083 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:40:31 compute-1 podman[297032]: 2025-09-30 18:40:31.611003513 +0000 UTC m=+0.146962631 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 18:40:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:32.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:32 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:32.922 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:40:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:33 compute-1 ceph-mon[75484]: pgmap v1861: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:40:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:33.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:33 compute-1 nova_compute[238822]: 2025-09-30 18:40:33.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:40:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:34.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:34 compute-1 nova_compute[238822]: 2025-09-30 18:40:34.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:35 compute-1 ceph-mon[75484]: pgmap v1862: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:40:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:40:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:35.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:40:35 compute-1 podman[249638]: time="2025-09-30T18:40:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:40:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:40:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:40:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:40:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8363 "" "Go-http-client/1.1"
Sep 30 18:40:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:36.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:36 compute-1 podman[297082]: 2025-09-30 18:40:36.554113721 +0000 UTC m=+0.099778552 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 18:40:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:37 compute-1 ceph-mon[75484]: pgmap v1863: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:40:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1987975264' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:40:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1987975264' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:40:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:37.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:37 compute-1 ovn_controller[135204]: 2025-09-30T18:40:37Z|00257|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Sep 30 18:40:38 compute-1 sshd-session[297101]: Invalid user user from 49.49.32.245 port 43978
Sep 30 18:40:38 compute-1 sshd-session[297101]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:40:38 compute-1 sshd-session[297101]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245
Sep 30 18:40:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:40:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:38.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:38 compute-1 nova_compute[238822]: 2025-09-30 18:40:38.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:40:39 compute-1 sudo[297108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:40:39 compute-1 sudo[297108]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:40:39 compute-1 sudo[297108]: pam_unix(sudo:session): session closed for user root
Sep 30 18:40:39 compute-1 ceph-mon[75484]: pgmap v1864: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:40:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:39.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:39 compute-1 nova_compute[238822]: 2025-09-30 18:40:39.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:39 compute-1 sshd-session[297101]: Failed password for invalid user user from 49.49.32.245 port 43978 ssh2
Sep 30 18:40:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:40.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:41 compute-1 ceph-mon[75484]: pgmap v1865: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:40:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2973203013' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:40:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:41.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:41 compute-1 sshd-session[297135]: Invalid user admin from 161.132.50.17 port 59554
Sep 30 18:40:41 compute-1 sshd-session[297135]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:40:41 compute-1 sshd-session[297135]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 18:40:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:42 compute-1 sshd-session[297101]: Received disconnect from 49.49.32.245 port 43978:11: Bye Bye [preauth]
Sep 30 18:40:42 compute-1 sshd-session[297101]: Disconnected from invalid user user 49.49.32.245 port 43978 [preauth]
Sep 30 18:40:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:42.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:43 compute-1 ceph-mon[75484]: pgmap v1866: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:40:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:43.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:43 compute-1 nova_compute[238822]: 2025-09-30 18:40:43.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:40:44 compute-1 sshd-session[297135]: Failed password for invalid user admin from 161.132.50.17 port 59554 ssh2
Sep 30 18:40:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:44.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:44 compute-1 nova_compute[238822]: 2025-09-30 18:40:44.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:44 compute-1 sshd-session[297135]: Received disconnect from 161.132.50.17 port 59554:11: Bye Bye [preauth]
Sep 30 18:40:44 compute-1 sshd-session[297135]: Disconnected from invalid user admin 161.132.50.17 port 59554 [preauth]
Sep 30 18:40:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:45 compute-1 sudo[297142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:40:45 compute-1 sudo[297142]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:40:45 compute-1 sudo[297142]: pam_unix(sudo:session): session closed for user root
Sep 30 18:40:45 compute-1 sudo[297181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:40:45 compute-1 sudo[297181]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:40:45 compute-1 podman[297167]: 2025-09-30 18:40:45.340676525 +0000 UTC m=+0.104259713 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Sep 30 18:40:45 compute-1 podman[297166]: 2025-09-30 18:40:45.356073049 +0000 UTC m=+0.128969027 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 18:40:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:45.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:45 compute-1 ceph-mon[75484]: pgmap v1867: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:40:45 compute-1 podman[297168]: 2025-09-30 18:40:45.379117878 +0000 UTC m=+0.136931411 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 18:40:46 compute-1 sudo[297181]: pam_unix(sudo:session): session closed for user root
Sep 30 18:40:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:46 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:40:46 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:40:46 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:40:46 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:40:46 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:40:46 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:40:46 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:40:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:46.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:47 compute-1 sshd-session[297267]: Invalid user flavia from 103.153.190.105 port 35781
Sep 30 18:40:47 compute-1 sshd-session[297267]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:40:47 compute-1 sshd-session[297267]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:40:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:47.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:47 compute-1 ceph-mon[75484]: pgmap v1868: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:40:47 compute-1 ceph-mon[75484]: pgmap v1869: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 611 B/s rd, 0 op/s
Sep 30 18:40:47 compute-1 unix_chkpwd[297287]: password check failed for user (root)
Sep 30 18:40:47 compute-1 sshd-session[297283]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:40:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:48.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:48 compute-1 nova_compute[238822]: 2025-09-30 18:40:48.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:40:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:49.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:49 compute-1 openstack_network_exporter[251957]: ERROR   18:40:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:40:49 compute-1 openstack_network_exporter[251957]: ERROR   18:40:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:40:49 compute-1 openstack_network_exporter[251957]: ERROR   18:40:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:40:49 compute-1 openstack_network_exporter[251957]: ERROR   18:40:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:40:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:40:49 compute-1 openstack_network_exporter[251957]: ERROR   18:40:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:40:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:40:49 compute-1 sshd-session[297267]: Failed password for invalid user flavia from 103.153.190.105 port 35781 ssh2
Sep 30 18:40:49 compute-1 ceph-mon[75484]: pgmap v1870: 353 pgs: 353 active+clean; 41 MiB data, 335 MiB used, 40 GiB / 40 GiB avail; 611 B/s rd, 0 op/s
Sep 30 18:40:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4244829744' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:40:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/76621830' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:40:49 compute-1 nova_compute[238822]: 2025-09-30 18:40:49.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:49 compute-1 sshd-session[297283]: Failed password for root from 192.210.160.141 port 39846 ssh2
Sep 30 18:40:50 compute-1 nova_compute[238822]: 2025-09-30 18:40:50.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:40:50 compute-1 nova_compute[238822]: 2025-09-30 18:40:50.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:40:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:50.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:50 compute-1 sshd-session[297283]: Connection closed by authenticating user root 192.210.160.141 port 39846 [preauth]
Sep 30 18:40:50 compute-1 sshd-session[297267]: Received disconnect from 103.153.190.105 port 35781:11: Bye Bye [preauth]
Sep 30 18:40:50 compute-1 sshd-session[297267]: Disconnected from invalid user flavia 103.153.190.105 port 35781 [preauth]
Sep 30 18:40:51 compute-1 nova_compute[238822]: 2025-09-30 18:40:51.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:40:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:51 compute-1 sudo[297292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:40:51 compute-1 sudo[297292]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:40:51 compute-1 sudo[297292]: pam_unix(sudo:session): session closed for user root
Sep 30 18:40:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:51.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:51 compute-1 ceph-mon[75484]: pgmap v1871: 353 pgs: 353 active+clean; 88 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 2.1 MiB/s wr, 32 op/s
Sep 30 18:40:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:40:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:40:51 compute-1 nova_compute[238822]: 2025-09-30 18:40:51.570 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:40:51 compute-1 nova_compute[238822]: 2025-09-30 18:40:51.571 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:40:51 compute-1 nova_compute[238822]: 2025-09-30 18:40:51.571 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:40:51 compute-1 nova_compute[238822]: 2025-09-30 18:40:51.571 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:40:51 compute-1 nova_compute[238822]: 2025-09-30 18:40:51.572 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:40:52 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:40:52 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2830130152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:40:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:52 compute-1 nova_compute[238822]: 2025-09-30 18:40:52.065 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:40:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:52 compute-1 nova_compute[238822]: 2025-09-30 18:40:52.288 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:40:52 compute-1 nova_compute[238822]: 2025-09-30 18:40:52.289 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:40:52 compute-1 nova_compute[238822]: 2025-09-30 18:40:52.328 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:40:52 compute-1 nova_compute[238822]: 2025-09-30 18:40:52.329 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4668MB free_disk=39.971431732177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:40:52 compute-1 nova_compute[238822]: 2025-09-30 18:40:52.330 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:40:52 compute-1 nova_compute[238822]: 2025-09-30 18:40:52.331 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:40:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2830130152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:40:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:40:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.003000080s ======
Sep 30 18:40:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:52.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000080s
Sep 30 18:40:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:53.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:53 compute-1 nova_compute[238822]: 2025-09-30 18:40:53.439 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:40:53 compute-1 nova_compute[238822]: 2025-09-30 18:40:53.440 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:40:52 up  4:18,  0 user,  load average: 0.48, 0.52, 0.58\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:40:53 compute-1 nova_compute[238822]: 2025-09-30 18:40:53.485 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing inventories for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 18:40:53 compute-1 nova_compute[238822]: 2025-09-30 18:40:53.526 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating ProviderTree inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 18:40:53 compute-1 nova_compute[238822]: 2025-09-30 18:40:53.527 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 18:40:53 compute-1 ceph-mon[75484]: pgmap v1872: 353 pgs: 353 active+clean; 88 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 2.1 MiB/s wr, 32 op/s
Sep 30 18:40:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3691236522' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:40:53 compute-1 nova_compute[238822]: 2025-09-30 18:40:53.542 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing aggregate associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 18:40:53 compute-1 nova_compute[238822]: 2025-09-30 18:40:53.560 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing trait associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE41,HW_CPU_X86_ABM,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_TIS,HW_ARCH_X86_64
,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 18:40:53 compute-1 nova_compute[238822]: 2025-09-30 18:40:53.576 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #103. Immutable memtables: 0.
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.597960) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 63] Flushing memtable with next log file: 103
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257653598014, "job": 63, "event": "flush_started", "num_memtables": 1, "num_entries": 2356, "num_deletes": 251, "total_data_size": 6083604, "memory_usage": 6163200, "flush_reason": "Manual Compaction"}
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 63] Level-0 flush table #104: started
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257653630480, "cf_name": "default", "job": 63, "event": "table_file_creation", "file_number": 104, "file_size": 3879275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49762, "largest_seqno": 52113, "table_properties": {"data_size": 3869881, "index_size": 5888, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19774, "raw_average_key_size": 20, "raw_value_size": 3850981, "raw_average_value_size": 3970, "num_data_blocks": 257, "num_entries": 970, "num_filter_entries": 970, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759257448, "oldest_key_time": 1759257448, "file_creation_time": 1759257653, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 104, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 63] Flush lasted 32805 microseconds, and 14694 cpu microseconds.
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.630761) [db/flush_job.cc:967] [default] [JOB 63] Level-0 flush table #104: 3879275 bytes OK
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.630891) [db/memtable_list.cc:519] [default] Level-0 commit table #104 started
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.632927) [db/memtable_list.cc:722] [default] Level-0 commit table #104: memtable #1 done
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.632954) EVENT_LOG_v1 {"time_micros": 1759257653632945, "job": 63, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.632981) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 63] Try to delete WAL files size 6073137, prev total WAL file size 6073137, number of live WAL files 2.
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000100.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.639138) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034323637' seq:72057594037927935, type:22 .. '7061786F730034353139' seq:0, type:0; will stop at (end)
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 64] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 63 Base level 0, inputs: [104(3788KB)], [102(9722KB)]
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257653639242, "job": 64, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [104], "files_L6": [102], "score": -1, "input_data_size": 13835223, "oldest_snapshot_seqno": -1}
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 64] Generated table #105: 7076 keys, 11859042 bytes, temperature: kUnknown
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257653741669, "cf_name": "default", "job": 64, "event": "table_file_creation", "file_number": 105, "file_size": 11859042, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11815499, "index_size": 24743, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17733, "raw_key_size": 187038, "raw_average_key_size": 26, "raw_value_size": 11692222, "raw_average_value_size": 1652, "num_data_blocks": 964, "num_entries": 7076, "num_filter_entries": 7076, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759257653, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 105, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.742770) [db/compaction/compaction_job.cc:1663] [default] [JOB 64] Compacted 1@0 + 1@6 files to L6 => 11859042 bytes
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.759399) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 135.0 rd, 115.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 9.5 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(6.6) write-amplify(3.1) OK, records in: 7594, records dropped: 518 output_compression: NoCompression
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.759449) EVENT_LOG_v1 {"time_micros": 1759257653759430, "job": 64, "event": "compaction_finished", "compaction_time_micros": 102497, "compaction_time_cpu_micros": 38082, "output_level": 6, "num_output_files": 1, "total_output_size": 11859042, "num_input_records": 7594, "num_output_records": 7076, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000104.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257653760828, "job": 64, "event": "table_file_deletion", "file_number": 104}
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000102.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257653764220, "job": 64, "event": "table_file_deletion", "file_number": 102}
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.638994) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.764402) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.764413) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.764415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.764418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:40:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:40:53.764421) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:40:54 compute-1 nova_compute[238822]: 2025-09-30 18:40:54.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:40:54 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/466932660' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:40:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:54 compute-1 nova_compute[238822]: 2025-09-30 18:40:54.076 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:40:54 compute-1 nova_compute[238822]: 2025-09-30 18:40:54.084 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:40:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:40:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:54.413 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:40:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:54.413 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:40:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:40:54.413 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:40:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:54.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:54 compute-1 nova_compute[238822]: 2025-09-30 18:40:54.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:54 compute-1 nova_compute[238822]: 2025-09-30 18:40:54.603 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:40:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/466932660' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:40:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:55 compute-1 nova_compute[238822]: 2025-09-30 18:40:55.117 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:40:55 compute-1 nova_compute[238822]: 2025-09-30 18:40:55.117 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.786s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:40:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:55.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:55 compute-1 sshd-session[297141]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:40:55 compute-1 sshd-session[297141]: banner exchange: Connection from 110.42.70.108 port 47914: Connection timed out
Sep 30 18:40:55 compute-1 ceph-mon[75484]: pgmap v1873: 353 pgs: 353 active+clean; 88 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 29 KiB/s rd, 2.1 MiB/s wr, 44 op/s
Sep 30 18:40:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:40:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:56.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:40:56 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4222444443' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:40:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:57 compute-1 nova_compute[238822]: 2025-09-30 18:40:57.118 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:40:57 compute-1 nova_compute[238822]: 2025-09-30 18:40:57.119 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:40:57 compute-1 nova_compute[238822]: 2025-09-30 18:40:57.119 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:40:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:57.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:57 compute-1 ceph-mon[75484]: pgmap v1874: 353 pgs: 353 active+clean; 88 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 29 KiB/s rd, 2.1 MiB/s wr, 44 op/s
Sep 30 18:40:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/590762389' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:40:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/590762389' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:40:58 compute-1 nova_compute[238822]: 2025-09-30 18:40:58.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:40:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:40:58.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:58 compute-1 ceph-mon[75484]: pgmap v1875: 353 pgs: 353 active+clean; 88 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 24 KiB/s rd, 1.8 MiB/s wr, 36 op/s
Sep 30 18:40:59 compute-1 nova_compute[238822]: 2025-09-30 18:40:59.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:40:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:40:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:40:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:40:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:40:59 compute-1 sudo[297372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:40:59 compute-1 sudo[297372]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:40:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:40:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:40:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:40:59.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:40:59 compute-1 sudo[297372]: pam_unix(sudo:session): session closed for user root
Sep 30 18:40:59 compute-1 nova_compute[238822]: 2025-09-30 18:40:59.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:00.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:01 compute-1 nova_compute[238822]: 2025-09-30 18:41:01.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:41:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:01 compute-1 ceph-mon[75484]: pgmap v1876: 353 pgs: 353 active+clean; 88 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Sep 30 18:41:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:01.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:02.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:02 compute-1 podman[297402]: 2025-09-30 18:41:02.576432275 +0000 UTC m=+0.098395845 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:41:02 compute-1 podman[297401]: 2025-09-30 18:41:02.606932345 +0000 UTC m=+0.134881856 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 18:41:03 compute-1 nova_compute[238822]: 2025-09-30 18:41:03.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:41:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:03 compute-1 ceph-mon[75484]: pgmap v1877: 353 pgs: 353 active+clean; 88 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:41:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:03.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:04 compute-1 nova_compute[238822]: 2025-09-30 18:41:04.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:41:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1166076486' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:41:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:04.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:04 compute-1 nova_compute[238822]: 2025-09-30 18:41:04.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:05 compute-1 ceph-mon[75484]: pgmap v1878: 353 pgs: 353 active+clean; 88 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:41:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:41:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:05.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:41:05 compute-1 podman[249638]: time="2025-09-30T18:41:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:41:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:41:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:41:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:41:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8363 "" "Go-http-client/1.1"
Sep 30 18:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:06.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:07 compute-1 ceph-mon[75484]: pgmap v1879: 353 pgs: 353 active+clean; 88 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 64 op/s
Sep 30 18:41:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:07.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:07 compute-1 podman[297457]: 2025-09-30 18:41:07.542250695 +0000 UTC m=+0.081749158 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 18:41:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:41:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:08.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:09 compute-1 nova_compute[238822]: 2025-09-30 18:41:09.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:41:09 compute-1 ceph-mon[75484]: pgmap v1880: 353 pgs: 353 active+clean; 88 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 64 op/s
Sep 30 18:41:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:09.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:09 compute-1 nova_compute[238822]: 2025-09-30 18:41:09.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:10.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:11 compute-1 ceph-mon[75484]: pgmap v1881: 353 pgs: 353 active+clean; 167 MiB data, 381 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 145 op/s
Sep 30 18:41:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:11.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:41:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:12.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:41:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:13 compute-1 ceph-mon[75484]: pgmap v1882: 353 pgs: 353 active+clean; 167 MiB data, 381 MiB used, 40 GiB / 40 GiB avail; 335 KiB/s rd, 3.9 MiB/s wr, 80 op/s
Sep 30 18:41:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:13.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:14 compute-1 nova_compute[238822]: 2025-09-30 18:41:14.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:41:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:14.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:14 compute-1 nova_compute[238822]: 2025-09-30 18:41:14.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:15 compute-1 unix_chkpwd[297487]: password check failed for user (root)
Sep 30 18:41:15 compute-1 sshd-session[297483]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:41:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:15 compute-1 ceph-mon[75484]: pgmap v1883: 353 pgs: 353 active+clean; 167 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 344 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:41:15 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1827812612' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:41:15 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/772842181' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:41:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:15.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:15 compute-1 podman[297490]: 2025-09-30 18:41:15.533556293 +0000 UTC m=+0.071599965 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd)
Sep 30 18:41:15 compute-1 podman[297489]: 2025-09-30 18:41:15.561813613 +0000 UTC m=+0.095335674 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm)
Sep 30 18:41:15 compute-1 podman[297488]: 2025-09-30 18:41:15.567489326 +0000 UTC m=+0.101719686 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Sep 30 18:41:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:16 compute-1 unix_chkpwd[297551]: password check failed for user (root)
Sep 30 18:41:16 compute-1 sshd-session[297548]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.172.43.167  user=root
Sep 30 18:41:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:16.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:17 compute-1 ceph-mon[75484]: pgmap v1884: 353 pgs: 353 active+clean; 167 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 344 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:41:17 compute-1 sshd-session[297483]: Failed password for root from 192.210.160.141 port 40120 ssh2
Sep 30 18:41:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:17.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:18 compute-1 sshd-session[297483]: Connection closed by authenticating user root 192.210.160.141 port 40120 [preauth]
Sep 30 18:41:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:18.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:18 compute-1 sshd-session[297548]: Failed password for root from 167.172.43.167 port 34702 ssh2
Sep 30 18:41:19 compute-1 nova_compute[238822]: 2025-09-30 18:41:19.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:41:19 compute-1 sshd-session[297548]: Received disconnect from 167.172.43.167 port 34702:11: Bye Bye [preauth]
Sep 30 18:41:19 compute-1 sshd-session[297548]: Disconnected from authenticating user root 167.172.43.167 port 34702 [preauth]
Sep 30 18:41:19 compute-1 ceph-mon[75484]: pgmap v1885: 353 pgs: 353 active+clean; 167 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 344 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:41:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:41:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:19.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:41:19 compute-1 openstack_network_exporter[251957]: ERROR   18:41:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:41:19 compute-1 openstack_network_exporter[251957]: ERROR   18:41:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:41:19 compute-1 openstack_network_exporter[251957]: ERROR   18:41:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:41:19 compute-1 openstack_network_exporter[251957]: ERROR   18:41:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:41:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:41:19 compute-1 openstack_network_exporter[251957]: ERROR   18:41:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:41:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:41:19 compute-1 sudo[297555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:41:19 compute-1 sudo[297555]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:41:19 compute-1 sudo[297555]: pam_unix(sudo:session): session closed for user root
Sep 30 18:41:19 compute-1 nova_compute[238822]: 2025-09-30 18:41:19.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:20.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:21 compute-1 ceph-mon[75484]: pgmap v1886: 353 pgs: 353 active+clean; 167 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 344 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:41:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:41:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:21.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:41:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:41:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:22.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:23 compute-1 ceph-mon[75484]: pgmap v1887: 353 pgs: 353 active+clean; 167 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 9.2 KiB/s rd, 12 KiB/s wr, 11 op/s
Sep 30 18:41:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:23.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:24 compute-1 nova_compute[238822]: 2025-09-30 18:41:24.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:41:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:24.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:24 compute-1 nova_compute[238822]: 2025-09-30 18:41:24.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:25 compute-1 ceph-mon[75484]: pgmap v1888: 353 pgs: 353 active+clean; 167 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 26 KiB/s wr, 85 op/s
Sep 30 18:41:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:25.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:25 compute-1 unix_chkpwd[297588]: password check failed for user (root)
Sep 30 18:41:25 compute-1 sshd-session[297586]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201  user=root
Sep 30 18:41:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:26.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:27 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Sep 30 18:41:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:27.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:27 compute-1 ceph-mon[75484]: pgmap v1889: 353 pgs: 353 active+clean; 167 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:41:27 compute-1 sshd-session[297586]: Failed password for root from 8.243.64.201 port 51596 ssh2
Sep 30 18:41:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:28 compute-1 sshd-session[297586]: Received disconnect from 8.243.64.201 port 51596:11: Bye Bye [preauth]
Sep 30 18:41:28 compute-1 sshd-session[297586]: Disconnected from authenticating user root 8.243.64.201 port 51596 [preauth]
Sep 30 18:41:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:28.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:29 compute-1 nova_compute[238822]: 2025-09-30 18:41:29.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:41:29 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 18:41:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:29.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:29 compute-1 ceph-mon[75484]: pgmap v1890: 353 pgs: 353 active+clean; 167 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:41:29 compute-1 nova_compute[238822]: 2025-09-30 18:41:29.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:30.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:31.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:31 compute-1 ceph-mon[75484]: pgmap v1891: 353 pgs: 353 active+clean; 167 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:41:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:32.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:33.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:33 compute-1 ceph-mon[75484]: pgmap v1892: 353 pgs: 353 active+clean; 167 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:41:33 compute-1 podman[297600]: 2025-09-30 18:41:33.572702993 +0000 UTC m=+0.092250270 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:41:33 compute-1 podman[297599]: 2025-09-30 18:41:33.622772488 +0000 UTC m=+0.148897892 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:41:34 compute-1 nova_compute[238822]: 2025-09-30 18:41:34.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:41:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:34.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:34 compute-1 nova_compute[238822]: 2025-09-30 18:41:34.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:35.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:35 compute-1 ceph-mon[75484]: pgmap v1893: 353 pgs: 353 active+clean; 200 MiB data, 445 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 138 op/s
Sep 30 18:41:35 compute-1 podman[249638]: time="2025-09-30T18:41:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:41:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:41:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:41:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:41:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8365 "" "Go-http-client/1.1"
Sep 30 18:41:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:36 compute-1 nova_compute[238822]: 2025-09-30 18:41:36.183 2 DEBUG nova.virt.libvirt.driver [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Creating tmpfile /var/lib/nova/instances/tmphvjyhh8m to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:41:36 compute-1 nova_compute[238822]: 2025-09-30 18:41:36.184 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:41:36 compute-1 nova_compute[238822]: 2025-09-30 18:41:36.191 2 DEBUG nova.compute.manager [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvjyhh8m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:41:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Sep 30 18:41:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/546199454' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:41:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Sep 30 18:41:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/546199454' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:41:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/546199454' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:41:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/546199454' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:41:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:36.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:37.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:37 compute-1 ceph-mon[75484]: pgmap v1894: 353 pgs: 353 active+clean; 200 MiB data, 445 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:41:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:41:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:38 compute-1 nova_compute[238822]: 2025-09-30 18:41:38.234 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:41:38 compute-1 podman[297653]: 2025-09-30 18:41:38.549185188 +0000 UTC m=+0.077485384 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 18:41:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:38.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:39 compute-1 nova_compute[238822]: 2025-09-30 18:41:39.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:41:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:39.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:39 compute-1 sudo[297675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:41:39 compute-1 sudo[297675]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:41:39 compute-1 sudo[297675]: pam_unix(sudo:session): session closed for user root
Sep 30 18:41:39 compute-1 ceph-mon[75484]: pgmap v1895: 353 pgs: 353 active+clean; 200 MiB data, 445 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:41:39 compute-1 nova_compute[238822]: 2025-09-30 18:41:39.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:40.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:41 compute-1 unix_chkpwd[297703]: password check failed for user (root)
Sep 30 18:41:41 compute-1 sshd-session[297674]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:41:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:41.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:41 compute-1 ceph-mon[75484]: pgmap v1896: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Sep 30 18:41:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:42 compute-1 nova_compute[238822]: 2025-09-30 18:41:42.211 2 DEBUG nova.compute.manager [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvjyhh8m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='86b9b1e5-516e-43c2-b180-7ef40f7c1c67',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:41:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:42.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:43 compute-1 nova_compute[238822]: 2025-09-30 18:41:43.230 2 DEBUG oslo_concurrency.lockutils [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-86b9b1e5-516e-43c2-b180-7ef40f7c1c67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:41:43 compute-1 nova_compute[238822]: 2025-09-30 18:41:43.231 2 DEBUG oslo_concurrency.lockutils [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-86b9b1e5-516e-43c2-b180-7ef40f7c1c67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:41:43 compute-1 nova_compute[238822]: 2025-09-30 18:41:43.231 2 DEBUG nova.network.neutron [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:41:43 compute-1 sshd-session[297674]: Failed password for root from 192.210.160.141 port 44700 ssh2
Sep 30 18:41:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:43.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:43 compute-1 ceph-mon[75484]: pgmap v1897: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Sep 30 18:41:43 compute-1 nova_compute[238822]: 2025-09-30 18:41:43.739 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:41:44 compute-1 nova_compute[238822]: 2025-09-30 18:41:44.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:41:44 compute-1 sshd-session[297674]: Connection closed by authenticating user root 192.210.160.141 port 44700 [preauth]
Sep 30 18:41:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:44.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:44 compute-1 nova_compute[238822]: 2025-09-30 18:41:44.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:45.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:45 compute-1 ceph-mon[75484]: pgmap v1898: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Sep 30 18:41:45 compute-1 nova_compute[238822]: 2025-09-30 18:41:45.687 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:41:45 compute-1 nova_compute[238822]: 2025-09-30 18:41:45.931 2 DEBUG nova.network.neutron [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Updating instance_info_cache with network_info: [{"id": "c8821620-973c-4db8-9c4b-766e7751348e", "address": "fa:16:3e:49:7f:fe", "network": {"id": "c8484b9b-b34e-4c32-b987-029d8fcb2a28", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-862913257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c9029a2856de43388bcee1a38d165449", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8821620-97", "ovs_interfaceid": "c8821620-973c-4db8-9c4b-766e7751348e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:41:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.440 2 DEBUG oslo_concurrency.lockutils [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-86b9b1e5-516e-43c2-b180-7ef40f7c1c67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.455 2 DEBUG nova.virt.libvirt.driver [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvjyhh8m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='86b9b1e5-516e-43c2-b180-7ef40f7c1c67',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.456 2 DEBUG nova.virt.libvirt.driver [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Creating instance directory: /var/lib/nova/instances/86b9b1e5-516e-43c2-b180-7ef40f7c1c67 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.457 2 DEBUG nova.virt.libvirt.driver [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Ensure instance console log exists: /var/lib/nova/instances/86b9b1e5-516e-43c2-b180-7ef40f7c1c67/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.457 2 DEBUG nova.virt.libvirt.driver [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.459 2 DEBUG nova.virt.libvirt.vif [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:40:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-771828615',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-771828615',id=30,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:40:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='63c45bef63ef4b9f895b3bab865e1a84',ramdisk_id='',reservation_id='r-0wzw0y3q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-134702932',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-134702932-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:40:55Z,user_data=None,user_id='5717e8cb8548429b948a23763350ab4a',uuid=86b9b1e5-516e-43c2-b180-7ef40f7c1c67,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c8821620-973c-4db8-9c4b-766e7751348e", "address": "fa:16:3e:49:7f:fe", "network": {"id": "c8484b9b-b34e-4c32-b987-029d8fcb2a28", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-862913257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c9029a2856de43388bcee1a38d165449", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc8821620-97", "ovs_interfaceid": "c8821620-973c-4db8-9c4b-766e7751348e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.459 2 DEBUG nova.network.os_vif_util [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "c8821620-973c-4db8-9c4b-766e7751348e", "address": "fa:16:3e:49:7f:fe", "network": {"id": "c8484b9b-b34e-4c32-b987-029d8fcb2a28", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-862913257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c9029a2856de43388bcee1a38d165449", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc8821620-97", "ovs_interfaceid": "c8821620-973c-4db8-9c4b-766e7751348e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.460 2 DEBUG nova.network.os_vif_util [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:7f:fe,bridge_name='br-int',has_traffic_filtering=True,id=c8821620-973c-4db8-9c4b-766e7751348e,network=Network(c8484b9b-b34e-4c32-b987-029d8fcb2a28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8821620-97') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.461 2 DEBUG os_vif [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:7f:fe,bridge_name='br-int',has_traffic_filtering=True,id=c8821620-973c-4db8-9c4b-766e7751348e,network=Network(c8484b9b-b34e-4c32-b987-029d8fcb2a28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8821620-97') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.463 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.463 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.464 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '924e3df8-7c4c-514f-9ad6-9490097638ec', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.473 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8821620-97, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.474 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc8821620-97, col_values=(('qos', UUID('a1db3761-b95b-4e24-8d21-62664fdff708')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.474 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc8821620-97, col_values=(('external_ids', {'iface-id': 'c8821620-973c-4db8-9c4b-766e7751348e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:49:7f:fe', 'vm-uuid': '86b9b1e5-516e-43c2-b180-7ef40f7c1c67'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:46 compute-1 NetworkManager[45549]: <info>  [1759257706.4778] manager: (tapc8821620-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.490 2 INFO os_vif [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:7f:fe,bridge_name='br-int',has_traffic_filtering=True,id=c8821620-973c-4db8-9c4b-766e7751348e,network=Network(c8484b9b-b34e-4c32-b987-029d8fcb2a28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8821620-97')
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.491 2 DEBUG nova.virt.libvirt.driver [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.491 2 DEBUG nova.compute.manager [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvjyhh8m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='86b9b1e5-516e-43c2-b180-7ef40f7c1c67',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.492 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:41:46 compute-1 podman[297711]: 2025-09-30 18:41:46.573793044 +0000 UTC m=+0.099136095 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:41:46 compute-1 podman[297710]: 2025-09-30 18:41:46.580514745 +0000 UTC m=+0.108449126 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Sep 30 18:41:46 compute-1 podman[297709]: 2025-09-30 18:41:46.594772028 +0000 UTC m=+0.129388289 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, container_name=iscsid)
Sep 30 18:41:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:46.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:46 compute-1 ceph-mon[75484]: pgmap v1899: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:41:46 compute-1 nova_compute[238822]: 2025-09-30 18:41:46.757 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:41:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:47.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:47 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:47.484 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:41:47 compute-1 nova_compute[238822]: 2025-09-30 18:41:47.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:47 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:47.486 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:41:47 compute-1 nova_compute[238822]: 2025-09-30 18:41:47.914 2 DEBUG nova.network.neutron [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Port c8821620-973c-4db8-9c4b-766e7751348e updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:41:47 compute-1 nova_compute[238822]: 2025-09-30 18:41:47.930 2 DEBUG nova.compute.manager [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvjyhh8m',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='86b9b1e5-516e-43c2-b180-7ef40f7c1c67',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:41:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:48.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:41:49 compute-1 ceph-mon[75484]: pgmap v1900: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:41:49 compute-1 openstack_network_exporter[251957]: ERROR   18:41:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:41:49 compute-1 openstack_network_exporter[251957]: ERROR   18:41:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:41:49 compute-1 openstack_network_exporter[251957]: ERROR   18:41:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:41:49 compute-1 openstack_network_exporter[251957]: ERROR   18:41:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:41:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:41:49 compute-1 openstack_network_exporter[251957]: ERROR   18:41:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:41:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:41:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:41:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:49.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:41:49 compute-1 nova_compute[238822]: 2025-09-30 18:41:49.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:50 compute-1 systemd[1]: Starting libvirt proxy daemon...
Sep 30 18:41:50 compute-1 systemd[1]: Started libvirt proxy daemon.
Sep 30 18:41:50 compute-1 systemd[1]: Starting libvirt secret daemon...
Sep 30 18:41:50 compute-1 systemd[1]: Started libvirt secret daemon.
Sep 30 18:41:50 compute-1 kernel: tapc8821620-97: entered promiscuous mode
Sep 30 18:41:50 compute-1 nova_compute[238822]: 2025-09-30 18:41:50.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:50 compute-1 NetworkManager[45549]: <info>  [1759257710.5225] manager: (tapc8821620-97): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Sep 30 18:41:50 compute-1 ovn_controller[135204]: 2025-09-30T18:41:50Z|00258|binding|INFO|Claiming lport c8821620-973c-4db8-9c4b-766e7751348e for this additional chassis.
Sep 30 18:41:50 compute-1 ovn_controller[135204]: 2025-09-30T18:41:50Z|00259|binding|INFO|c8821620-973c-4db8-9c4b-766e7751348e: Claiming fa:16:3e:49:7f:fe 10.100.0.8
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.546 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:7f:fe 10.100.0.8'], port_security=['fa:16:3e:49:7f:fe 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '86b9b1e5-516e-43c2-b180-7ef40f7c1c67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63c45bef63ef4b9f895b3bab865e1a84', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'a9025550-4c18-4f21-a560-5b6f52684803', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55e305e6-0f4d-40bc-a70b-ac91f882ec57, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=c8821620-973c-4db8-9c4b-766e7751348e) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.547 144543 INFO neutron.agent.ovn.metadata.agent [-] Port c8821620-973c-4db8-9c4b-766e7751348e in datapath c8484b9b-b34e-4c32-b987-029d8fcb2a28 unbound from our chassis
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.550 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c8484b9b-b34e-4c32-b987-029d8fcb2a28
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.573 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[47cb56ed-01c0-472d-8f86-8eb672fc980d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.574 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc8484b9b-b1 in ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.576 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc8484b9b-b0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.576 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c990c1d3-8843-42a8-8762-60f84a58d88a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.577 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0055c234-f3cb-4c62-b5a7-3dee96542769]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:50 compute-1 systemd-machined[195911]: New machine qemu-24-instance-0000001e.
Sep 30 18:41:50 compute-1 systemd-udevd[297825]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.605 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[2ad31fcb-03a3-4f57-a827-38538febe5f6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:50 compute-1 NetworkManager[45549]: <info>  [1759257710.6108] device (tapc8821620-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:41:50 compute-1 NetworkManager[45549]: <info>  [1759257710.6129] device (tapc8821620-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:41:50 compute-1 nova_compute[238822]: 2025-09-30 18:41:50.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:50 compute-1 ovn_controller[135204]: 2025-09-30T18:41:50Z|00260|binding|INFO|Setting lport c8821620-973c-4db8-9c4b-766e7751348e ovn-installed in OVS
Sep 30 18:41:50 compute-1 nova_compute[238822]: 2025-09-30 18:41:50.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:50 compute-1 systemd[1]: Started Virtual Machine qemu-24-instance-0000001e.
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.629 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a1be46d2-e056-4d6e-addd-432530077005]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:50.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.665 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca5da27-57c5-4e52-8bb6-4e54c23859a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.674 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[dd9fe5d3-3b66-4f6e-a45b-8dff5b8e13a5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:50 compute-1 NetworkManager[45549]: <info>  [1759257710.6767] manager: (tapc8484b9b-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/97)
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.723 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[a260ec0a-0a49-47fc-91c2-609ba0c4e1da]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.726 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d7373b-3bfe-4d4e-8692-1693534671d0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:50 compute-1 NetworkManager[45549]: <info>  [1759257710.7616] device (tapc8484b9b-b0): carrier: link connected
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.770 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[fd656697-77a7-437d-b949-a294418fb8f0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.797 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[cf1b9f3b-98d4-4d54-8dad-0cb2e10dc45c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8484b9b-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:bc:cf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1555234, 'reachable_time': 29183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297857, 'error': None, 'target': 'ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.823 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[bfc5feb0-0899-4d5e-9413-5cecf158afb4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe48:bccf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1555234, 'tstamp': 1555234}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297858, 'error': None, 'target': 'ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.851 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[92e8a1c0-bd4b-413a-b499-4b20284359d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8484b9b-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:bc:cf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1555234, 'reachable_time': 29183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 297859, 'error': None, 'target': 'ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:50.896 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2a635f33-fb32-45ad-990f-c6a404f04892]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.000 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[50348bcb-ecdd-4438-bcd9-cdfabe5f3df9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.002 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8484b9b-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.003 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.004 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8484b9b-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:41:51 compute-1 nova_compute[238822]: 2025-09-30 18:41:51.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:51 compute-1 kernel: tapc8484b9b-b0: entered promiscuous mode
Sep 30 18:41:51 compute-1 NetworkManager[45549]: <info>  [1759257711.0087] manager: (tapc8484b9b-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.010 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc8484b9b-b0, col_values=(('external_ids', {'iface-id': 'd2e69f29-6b3a-46dc-9ed7-12031e1b7d2b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:41:51 compute-1 nova_compute[238822]: 2025-09-30 18:41:51.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:51 compute-1 ovn_controller[135204]: 2025-09-30T18:41:51Z|00261|binding|INFO|Releasing lport d2e69f29-6b3a-46dc-9ed7-12031e1b7d2b from this chassis (sb_readonly=0)
Sep 30 18:41:51 compute-1 nova_compute[238822]: 2025-09-30 18:41:51.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.015 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7c3931fa-2484-46c1-b2c0-6a7a95d82820]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.016 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.017 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.017 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for c8484b9b-b34e-4c32-b987-029d8fcb2a28 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.017 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.018 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[06e9f414-adcc-4d80-bbb6-20d08ff8b640]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.018 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.019 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[12c151ce-de2e-4e02-8bfe-da5a4702c59e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.020 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-c8484b9b-b34e-4c32-b987-029d8fcb2a28
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID c8484b9b-b34e-4c32-b987-029d8fcb2a28
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:41:51 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:51.021 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'env', 'PROCESS_TAG=haproxy-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c8484b9b-b34e-4c32-b987-029d8fcb2a28.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:41:51 compute-1 nova_compute[238822]: 2025-09-30 18:41:51.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:51 compute-1 nova_compute[238822]: 2025-09-30 18:41:51.052 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:41:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:51 compute-1 ceph-mon[75484]: pgmap v1901: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:41:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:51.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:51 compute-1 sudo[297887]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:41:51 compute-1 sudo[297887]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:41:51 compute-1 sudo[297887]: pam_unix(sudo:session): session closed for user root
Sep 30 18:41:51 compute-1 nova_compute[238822]: 2025-09-30 18:41:51.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:51 compute-1 podman[297896]: 2025-09-30 18:41:51.488298883 +0000 UTC m=+0.096267817 container create 8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 18:41:51 compute-1 podman[297896]: 2025-09-30 18:41:51.441676341 +0000 UTC m=+0.049645335 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:41:51 compute-1 systemd[1]: Started libpod-conmon-8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1.scope.
Sep 30 18:41:51 compute-1 sudo[297928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 18:41:51 compute-1 sudo[297928]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:41:51 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:41:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08274a2c7f50c3a0047d6e7a3cacf75c407b709a0a699f83ba356d8cb0ebe3a9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:41:51 compute-1 podman[297896]: 2025-09-30 18:41:51.62173963 +0000 UTC m=+0.229708614 container init 8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 18:41:51 compute-1 podman[297896]: 2025-09-30 18:41:51.631693617 +0000 UTC m=+0.239662551 container start 8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 18:41:51 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[297957]: [NOTICE]   (297962) : New worker (297964) forked
Sep 30 18:41:51 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[297957]: [NOTICE]   (297962) : Loading success.
Sep 30 18:41:52 compute-1 nova_compute[238822]: 2025-09-30 18:41:52.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:41:52 compute-1 nova_compute[238822]: 2025-09-30 18:41:52.056 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:41:52 compute-1 nova_compute[238822]: 2025-09-30 18:41:52.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:41:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:52 compute-1 podman[298053]: 2025-09-30 18:41:52.320966582 +0000 UTC m=+0.101905420 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Sep 30 18:41:52 compute-1 podman[298053]: 2025-09-30 18:41:52.448961882 +0000 UTC m=+0.229900630 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Sep 30 18:41:52 compute-1 nova_compute[238822]: 2025-09-30 18:41:52.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:41:52 compute-1 nova_compute[238822]: 2025-09-30 18:41:52.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:41:52 compute-1 nova_compute[238822]: 2025-09-30 18:41:52.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:41:52 compute-1 nova_compute[238822]: 2025-09-30 18:41:52.575 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:41:52 compute-1 nova_compute[238822]: 2025-09-30 18:41:52.575 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:41:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:52.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:41:53 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2392918740' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:41:53 compute-1 nova_compute[238822]: 2025-09-30 18:41:53.057 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:41:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:53 compute-1 podman[298227]: 2025-09-30 18:41:53.17332415 +0000 UTC m=+0.072579852 container exec 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 18:41:53 compute-1 podman[298227]: 2025-09-30 18:41:53.185890247 +0000 UTC m=+0.085145919 container exec_died 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 18:41:53 compute-1 ceph-mon[75484]: pgmap v1902: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:41:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:41:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:41:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:41:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Sep 30 18:41:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2392918740' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:41:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:53.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:53 compute-1 podman[298362]: 2025-09-30 18:41:53.985755694 +0000 UTC m=+0.093343399 container exec 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 18:41:54 compute-1 podman[298362]: 2025-09-30 18:41:54.004068056 +0000 UTC m=+0.111655701 container exec_died 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 18:41:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:54 compute-1 nova_compute[238822]: 2025-09-30 18:41:54.109 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000001e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:41:54 compute-1 nova_compute[238822]: 2025-09-30 18:41:54.109 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-0000001e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:41:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:41:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2657525076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:41:54 compute-1 podman[298428]: 2025-09-30 18:41:54.344997139 +0000 UTC m=+0.098997111 container exec 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, version=2.2.4, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, description=keepalived for Ceph, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived)
Sep 30 18:41:54 compute-1 nova_compute[238822]: 2025-09-30 18:41:54.351 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:41:54 compute-1 nova_compute[238822]: 2025-09-30 18:41:54.353 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:41:54 compute-1 podman[298428]: 2025-09-30 18:41:54.364039431 +0000 UTC m=+0.118039343 container exec_died 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, version=2.2.4)
Sep 30 18:41:54 compute-1 nova_compute[238822]: 2025-09-30 18:41:54.392 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:41:54 compute-1 nova_compute[238822]: 2025-09-30 18:41:54.393 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4499MB free_disk=39.90116500854492GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:41:54 compute-1 nova_compute[238822]: 2025-09-30 18:41:54.393 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:41:54 compute-1 nova_compute[238822]: 2025-09-30 18:41:54.394 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:41:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:54.414 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:41:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:54.415 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:41:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:54.415 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:41:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:54.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:54 compute-1 sudo[297928]: pam_unix(sudo:session): session closed for user root
Sep 30 18:41:54 compute-1 nova_compute[238822]: 2025-09-30 18:41:54.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:54 compute-1 sudo[298500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:41:54 compute-1 sudo[298500]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:41:54 compute-1 sudo[298500]: pam_unix(sudo:session): session closed for user root
Sep 30 18:41:54 compute-1 sudo[298525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:41:54 compute-1 sudo[298525]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:41:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:55 compute-1 ceph-mon[75484]: pgmap v1903: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 5.0 KiB/s rd, 9.2 KiB/s wr, 7 op/s
Sep 30 18:41:55 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:41:55 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:41:55 compute-1 nova_compute[238822]: 2025-09-30 18:41:55.418 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Migration for instance 86b9b1e5-516e-43c2-b180-7ef40f7c1c67 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 18:41:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:41:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:55.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:41:55 compute-1 sudo[298525]: pam_unix(sudo:session): session closed for user root
Sep 30 18:41:55 compute-1 sudo[298584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:41:55 compute-1 sudo[298584]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:41:55 compute-1 nova_compute[238822]: 2025-09-30 18:41:55.935 2 INFO nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Updating resource usage from migration f282835a-0309-42be-a72c-7d2d96cd2df8
Sep 30 18:41:55 compute-1 nova_compute[238822]: 2025-09-30 18:41:55.935 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Starting to track incoming migration f282835a-0309-42be-a72c-7d2d96cd2df8 with flavor c83dc7f1-0795-47db-adcb-fb90be11684a _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 18:41:55 compute-1 sudo[298584]: pam_unix(sudo:session): session closed for user root
Sep 30 18:41:55 compute-1 ovn_controller[135204]: 2025-09-30T18:41:55Z|00262|binding|INFO|Claiming lport c8821620-973c-4db8-9c4b-766e7751348e for this chassis.
Sep 30 18:41:55 compute-1 ovn_controller[135204]: 2025-09-30T18:41:55Z|00263|binding|INFO|c8821620-973c-4db8-9c4b-766e7751348e: Claiming fa:16:3e:49:7f:fe 10.100.0.8
Sep 30 18:41:55 compute-1 ovn_controller[135204]: 2025-09-30T18:41:55Z|00264|binding|INFO|Setting lport c8821620-973c-4db8-9c4b-766e7751348e up in Southbound
Sep 30 18:41:56 compute-1 sudo[298609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Sep 30 18:41:56 compute-1 sudo[298609]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:41:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:56 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2329631877' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:41:56 compute-1 sudo[298609]: pam_unix(sudo:session): session closed for user root
Sep 30 18:41:56 compute-1 nova_compute[238822]: 2025-09-30 18:41:56.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:41:56 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:41:56.488 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:41:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:56.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:56 compute-1 nova_compute[238822]: 2025-09-30 18:41:56.998 2 WARNING nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 86b9b1e5-516e-43c2-b180-7ef40f7c1c67 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 18:41:56 compute-1 nova_compute[238822]: 2025-09-30 18:41:56.998 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:41:56 compute-1 nova_compute[238822]: 2025-09-30 18:41:56.999 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:41:54 up  4:19,  0 user,  load average: 0.72, 0.54, 0.57\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:41:57 compute-1 nova_compute[238822]: 2025-09-30 18:41:57.044 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:41:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:57 compute-1 ceph-mon[75484]: pgmap v1904: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 9.2 KiB/s wr, 6 op/s
Sep 30 18:41:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:41:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:41:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Sep 30 18:41:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:41:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:41:57 compute-1 ceph-mon[75484]: pgmap v1905: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 5.5 KiB/s rd, 11 KiB/s wr, 8 op/s
Sep 30 18:41:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:41:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:41:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:41:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:41:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:41:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:57.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:41:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/134415393' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:41:57 compute-1 nova_compute[238822]: 2025-09-30 18:41:57.535 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:41:57 compute-1 nova_compute[238822]: 2025-09-30 18:41:57.547 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:41:58 compute-1 nova_compute[238822]: 2025-09-30 18:41:58.063 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:41:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:58 compute-1 nova_compute[238822]: 2025-09-30 18:41:58.277 2 INFO nova.compute.manager [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Post operation of migration started
Sep 30 18:41:58 compute-1 nova_compute[238822]: 2025-09-30 18:41:58.278 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:41:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/134415393' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:41:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2727541640' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:41:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2727541640' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:41:58 compute-1 nova_compute[238822]: 2025-09-30 18:41:58.576 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:41:58 compute-1 nova_compute[238822]: 2025-09-30 18:41:58.576 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.182s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:41:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:41:58.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:59 compute-1 nova_compute[238822]: 2025-09-30 18:41:59.012 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:41:59 compute-1 nova_compute[238822]: 2025-09-30 18:41:59.013 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:41:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:41:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:41:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:41:59 compute-1 nova_compute[238822]: 2025-09-30 18:41:59.105 2 DEBUG oslo_concurrency.lockutils [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-86b9b1e5-516e-43c2-b180-7ef40f7c1c67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:41:59 compute-1 nova_compute[238822]: 2025-09-30 18:41:59.105 2 DEBUG oslo_concurrency.lockutils [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-86b9b1e5-516e-43c2-b180-7ef40f7c1c67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:41:59 compute-1 nova_compute[238822]: 2025-09-30 18:41:59.106 2 DEBUG nova.network.neutron [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:41:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:41:59 compute-1 ceph-mon[75484]: pgmap v1906: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 6.0 KiB/s rd, 11 KiB/s wr, 8 op/s
Sep 30 18:41:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:41:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:41:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:41:59.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:41:59 compute-1 nova_compute[238822]: 2025-09-30 18:41:59.612 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:41:59 compute-1 sudo[298680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:41:59 compute-1 sudo[298680]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:41:59 compute-1 sudo[298680]: pam_unix(sudo:session): session closed for user root
Sep 30 18:41:59 compute-1 nova_compute[238822]: 2025-09-30 18:41:59.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:00 compute-1 nova_compute[238822]: 2025-09-30 18:42:00.577 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:42:00 compute-1 nova_compute[238822]: 2025-09-30 18:42:00.578 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:42:00 compute-1 nova_compute[238822]: 2025-09-30 18:42:00.578 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:42:00 compute-1 nova_compute[238822]: 2025-09-30 18:42:00.579 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:42:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:00.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:01 compute-1 sudo[298708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:42:01 compute-1 sudo[298708]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:42:01 compute-1 sudo[298708]: pam_unix(sudo:session): session closed for user root
Sep 30 18:42:01 compute-1 ceph-mon[75484]: pgmap v1907: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 5.7 KiB/s rd, 11 KiB/s wr, 8 op/s
Sep 30 18:42:01 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:42:01 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:42:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:01.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:01 compute-1 nova_compute[238822]: 2025-09-30 18:42:01.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:01 compute-1 nova_compute[238822]: 2025-09-30 18:42:01.825 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:42:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:02.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:02 compute-1 nova_compute[238822]: 2025-09-30 18:42:02.801 2 DEBUG nova.network.neutron [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Updating instance_info_cache with network_info: [{"id": "c8821620-973c-4db8-9c4b-766e7751348e", "address": "fa:16:3e:49:7f:fe", "network": {"id": "c8484b9b-b34e-4c32-b987-029d8fcb2a28", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-862913257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c9029a2856de43388bcee1a38d165449", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8821620-97", "ovs_interfaceid": "c8821620-973c-4db8-9c4b-766e7751348e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:42:03 compute-1 nova_compute[238822]: 2025-09-30 18:42:03.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:42:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:03 compute-1 nova_compute[238822]: 2025-09-30 18:42:03.311 2 DEBUG oslo_concurrency.lockutils [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-86b9b1e5-516e-43c2-b180-7ef40f7c1c67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:42:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:03.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:03 compute-1 ceph-mon[75484]: pgmap v1908: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 5.7 KiB/s rd, 11 KiB/s wr, 8 op/s
Sep 30 18:42:03 compute-1 nova_compute[238822]: 2025-09-30 18:42:03.834 2 DEBUG oslo_concurrency.lockutils [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:42:03 compute-1 nova_compute[238822]: 2025-09-30 18:42:03.834 2 DEBUG oslo_concurrency.lockutils [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:42:03 compute-1 nova_compute[238822]: 2025-09-30 18:42:03.835 2 DEBUG oslo_concurrency.lockutils [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:42:03 compute-1 nova_compute[238822]: 2025-09-30 18:42:03.841 2 INFO nova.virt.libvirt.driver [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:42:03 compute-1 virtqemud[239124]: Domain id=24 name='instance-0000001e' uuid=86b9b1e5-516e-43c2-b180-7ef40f7c1c67 is tainted: custom-monitor
Sep 30 18:42:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:42:04 compute-1 podman[298737]: 2025-09-30 18:42:04.577116381 +0000 UTC m=+0.106678918 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:42:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:04.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:04 compute-1 podman[298736]: 2025-09-30 18:42:04.667941942 +0000 UTC m=+0.197347655 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller)
Sep 30 18:42:04 compute-1 nova_compute[238822]: 2025-09-30 18:42:04.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:04 compute-1 nova_compute[238822]: 2025-09-30 18:42:04.852 2 INFO nova.virt.libvirt.driver [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:42:05 compute-1 nova_compute[238822]: 2025-09-30 18:42:05.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:42:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:05.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:05 compute-1 ceph-mon[75484]: pgmap v1909: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 796 B/s rd, 2.7 KiB/s wr, 1 op/s
Sep 30 18:42:05 compute-1 podman[249638]: time="2025-09-30T18:42:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:42:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:42:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:42:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:42:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8823 "" "Go-http-client/1.1"
Sep 30 18:42:05 compute-1 nova_compute[238822]: 2025-09-30 18:42:05.860 2 INFO nova.virt.libvirt.driver [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:42:05 compute-1 nova_compute[238822]: 2025-09-30 18:42:05.867 2 DEBUG nova.compute.manager [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:42:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:06 compute-1 nova_compute[238822]: 2025-09-30 18:42:06.384 2 DEBUG nova.objects.instance [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:42:06 compute-1 nova_compute[238822]: 2025-09-30 18:42:06.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:06.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:07 compute-1 sshd-session[298789]: Invalid user ubuntu from 192.210.160.141 port 60804
Sep 30 18:42:07 compute-1 sshd-session[298789]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:42:07 compute-1 sshd-session[298789]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:42:07 compute-1 nova_compute[238822]: 2025-09-30 18:42:07.409 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:42:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:07.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:07 compute-1 ceph-mon[75484]: pgmap v1910: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 796 B/s rd, 2.7 KiB/s wr, 1 op/s
Sep 30 18:42:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:42:07 compute-1 nova_compute[238822]: 2025-09-30 18:42:07.543 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:42:07 compute-1 nova_compute[238822]: 2025-09-30 18:42:07.544 2 WARNING neutronclient.v2_0.client [None req-17a6f3dc-42be-4ce6-a356-87d256030da9 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:42:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:08.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:42:09 compute-1 sshd-session[298789]: Failed password for invalid user ubuntu from 192.210.160.141 port 60804 ssh2
Sep 30 18:42:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:09.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:09 compute-1 ceph-mon[75484]: pgmap v1911: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 2.3 KiB/s wr, 1 op/s
Sep 30 18:42:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2338394468' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:42:09 compute-1 podman[298795]: 2025-09-30 18:42:09.559157794 +0000 UTC m=+0.088275843 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 18:42:09 compute-1 nova_compute[238822]: 2025-09-30 18:42:09.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:10.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:11 compute-1 sshd-session[298789]: Connection closed by invalid user ubuntu 192.210.160.141 port 60804 [preauth]
Sep 30 18:42:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:11.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:11 compute-1 nova_compute[238822]: 2025-09-30 18:42:11.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:11 compute-1 ceph-mon[75484]: pgmap v1912: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:42:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:42:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:12.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:42:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:13.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:13 compute-1 ceph-mon[75484]: pgmap v1913: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:42:13 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3676218880' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:42:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:42:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:14.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:14 compute-1 nova_compute[238822]: 2025-09-30 18:42:14.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:42:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:15.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:42:15 compute-1 ceph-mon[75484]: pgmap v1914: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 2.3 KiB/s wr, 1 op/s
Sep 30 18:42:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:16 compute-1 nova_compute[238822]: 2025-09-30 18:42:16.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:16.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:17.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:17 compute-1 podman[298823]: 2025-09-30 18:42:17.550971098 +0000 UTC m=+0.087474102 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Sep 30 18:42:17 compute-1 podman[298825]: 2025-09-30 18:42:17.576435262 +0000 UTC m=+0.094301906 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Sep 30 18:42:17 compute-1 ceph-mon[75484]: pgmap v1915: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:42:17 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1109886704' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:42:17 compute-1 podman[298824]: 2025-09-30 18:42:17.587743966 +0000 UTC m=+0.115369712 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 18:42:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:18.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:42:19 compute-1 openstack_network_exporter[251957]: ERROR   18:42:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:42:19 compute-1 openstack_network_exporter[251957]: ERROR   18:42:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:42:19 compute-1 openstack_network_exporter[251957]: ERROR   18:42:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:42:19 compute-1 openstack_network_exporter[251957]: ERROR   18:42:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:42:19 compute-1 openstack_network_exporter[251957]: ERROR   18:42:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:42:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:19.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:19 compute-1 ceph-mon[75484]: pgmap v1916: 353 pgs: 353 active+clean; 121 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:42:19 compute-1 sudo[298885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:42:19 compute-1 sudo[298885]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:42:19 compute-1 sudo[298885]: pam_unix(sudo:session): session closed for user root
Sep 30 18:42:19 compute-1 nova_compute[238822]: 2025-09-30 18:42:19.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:20 compute-1 nova_compute[238822]: 2025-09-30 18:42:20.180 2 DEBUG oslo_concurrency.lockutils [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Acquiring lock "86b9b1e5-516e-43c2-b180-7ef40f7c1c67" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:42:20 compute-1 nova_compute[238822]: 2025-09-30 18:42:20.181 2 DEBUG oslo_concurrency.lockutils [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lock "86b9b1e5-516e-43c2-b180-7ef40f7c1c67" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:42:20 compute-1 nova_compute[238822]: 2025-09-30 18:42:20.182 2 DEBUG oslo_concurrency.lockutils [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Acquiring lock "86b9b1e5-516e-43c2-b180-7ef40f7c1c67-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:42:20 compute-1 nova_compute[238822]: 2025-09-30 18:42:20.182 2 DEBUG oslo_concurrency.lockutils [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lock "86b9b1e5-516e-43c2-b180-7ef40f7c1c67-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:42:20 compute-1 nova_compute[238822]: 2025-09-30 18:42:20.182 2 DEBUG oslo_concurrency.lockutils [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lock "86b9b1e5-516e-43c2-b180-7ef40f7c1c67-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:42:20 compute-1 nova_compute[238822]: 2025-09-30 18:42:20.267 2 INFO nova.compute.manager [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Terminating instance
Sep 30 18:42:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:20.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:20 compute-1 nova_compute[238822]: 2025-09-30 18:42:20.791 2 DEBUG nova.compute.manager [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:42:20 compute-1 kernel: tapc8821620-97 (unregistering): left promiscuous mode
Sep 30 18:42:20 compute-1 NetworkManager[45549]: <info>  [1759257740.8518] device (tapc8821620-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:42:20 compute-1 ovn_controller[135204]: 2025-09-30T18:42:20Z|00265|binding|INFO|Releasing lport c8821620-973c-4db8-9c4b-766e7751348e from this chassis (sb_readonly=0)
Sep 30 18:42:20 compute-1 ovn_controller[135204]: 2025-09-30T18:42:20Z|00266|binding|INFO|Setting lport c8821620-973c-4db8-9c4b-766e7751348e down in Southbound
Sep 30 18:42:20 compute-1 nova_compute[238822]: 2025-09-30 18:42:20.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:20 compute-1 ovn_controller[135204]: 2025-09-30T18:42:20Z|00267|binding|INFO|Removing iface tapc8821620-97 ovn-installed in OVS
Sep 30 18:42:20 compute-1 nova_compute[238822]: 2025-09-30 18:42:20.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:20.883 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:7f:fe 10.100.0.8'], port_security=['fa:16:3e:49:7f:fe 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '86b9b1e5-516e-43c2-b180-7ef40f7c1c67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63c45bef63ef4b9f895b3bab865e1a84', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'a9025550-4c18-4f21-a560-5b6f52684803', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55e305e6-0f4d-40bc-a70b-ac91f882ec57, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=c8821620-973c-4db8-9c4b-766e7751348e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:42:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:20.884 144543 INFO neutron.agent.ovn.metadata.agent [-] Port c8821620-973c-4db8-9c4b-766e7751348e in datapath c8484b9b-b34e-4c32-b987-029d8fcb2a28 unbound from our chassis
Sep 30 18:42:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:20.886 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c8484b9b-b34e-4c32-b987-029d8fcb2a28, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:42:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:20.888 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[98f71a1c-608c-48cb-83f1-80d2d09a4270]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:42:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:20.888 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28 namespace which is not needed anymore
Sep 30 18:42:20 compute-1 nova_compute[238822]: 2025-09-30 18:42:20.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:20 compute-1 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Sep 30 18:42:20 compute-1 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001e.scope: Consumed 4.179s CPU time.
Sep 30 18:42:20 compute-1 systemd-machined[195911]: Machine qemu-24-instance-0000001e terminated.
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.034 2 INFO nova.virt.libvirt.driver [-] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Instance destroyed successfully.
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.037 2 DEBUG nova.objects.instance [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lazy-loading 'resources' on Instance uuid 86b9b1e5-516e-43c2-b180-7ef40f7c1c67 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.039 2 DEBUG nova.compute.manager [req-60816b92-8309-47e4-984a-423478eecdf4 req-20731d42-981b-438a-b059-95697a03c9ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Received event network-vif-unplugged-c8821620-973c-4db8-9c4b-766e7751348e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.039 2 DEBUG oslo_concurrency.lockutils [req-60816b92-8309-47e4-984a-423478eecdf4 req-20731d42-981b-438a-b059-95697a03c9ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "86b9b1e5-516e-43c2-b180-7ef40f7c1c67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.040 2 DEBUG oslo_concurrency.lockutils [req-60816b92-8309-47e4-984a-423478eecdf4 req-20731d42-981b-438a-b059-95697a03c9ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "86b9b1e5-516e-43c2-b180-7ef40f7c1c67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.040 2 DEBUG oslo_concurrency.lockutils [req-60816b92-8309-47e4-984a-423478eecdf4 req-20731d42-981b-438a-b059-95697a03c9ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "86b9b1e5-516e-43c2-b180-7ef40f7c1c67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.040 2 DEBUG nova.compute.manager [req-60816b92-8309-47e4-984a-423478eecdf4 req-20731d42-981b-438a-b059-95697a03c9ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] No waiting events found dispatching network-vif-unplugged-c8821620-973c-4db8-9c4b-766e7751348e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.040 2 DEBUG nova.compute.manager [req-60816b92-8309-47e4-984a-423478eecdf4 req-20731d42-981b-438a-b059-95697a03c9ba 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Received event network-vif-unplugged-c8821620-973c-4db8-9c4b-766e7751348e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:42:21 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[297957]: [NOTICE]   (297962) : haproxy version is 3.0.5-8e879a5
Sep 30 18:42:21 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[297957]: [NOTICE]   (297962) : path to executable is /usr/sbin/haproxy
Sep 30 18:42:21 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[297957]: [WARNING]  (297962) : Exiting Master process...
Sep 30 18:42:21 compute-1 podman[298939]: 2025-09-30 18:42:21.063831837 +0000 UTC m=+0.039468792 container kill 8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 18:42:21 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[297957]: [ALERT]    (297962) : Current worker (297964) exited with code 143 (Terminated)
Sep 30 18:42:21 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[297957]: [WARNING]  (297962) : All workers exited. Exiting... (0)
Sep 30 18:42:21 compute-1 systemd[1]: libpod-8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1.scope: Deactivated successfully.
Sep 30 18:42:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:21 compute-1 podman[298966]: 2025-09-30 18:42:21.129085031 +0000 UTC m=+0.030993624 container died 8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Sep 30 18:42:21 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1-userdata-shm.mount: Deactivated successfully.
Sep 30 18:42:21 compute-1 systemd[1]: var-lib-containers-storage-overlay-08274a2c7f50c3a0047d6e7a3cacf75c407b709a0a699f83ba356d8cb0ebe3a9-merged.mount: Deactivated successfully.
Sep 30 18:42:21 compute-1 podman[298966]: 2025-09-30 18:42:21.207995082 +0000 UTC m=+0.109903685 container remove 8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 18:42:21 compute-1 systemd[1]: libpod-conmon-8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1.scope: Deactivated successfully.
Sep 30 18:42:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:21.220 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[96fbf3b5-2180-4c0a-9286-b913e9ef1eb0]: (4, ("Tue Sep 30 06:42:21 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28 (8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1)\n8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1\nTue Sep 30 06:42:21 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28 (8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1)\n8ff8a2b426f512a610cc8fcd80117e9da423835296350fa2140a4e490d87c2d1\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:42:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:21.224 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b75bda54-ecd7-4311-ac0d-0b528fbbd71b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:42:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:21.224 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:42:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:21.225 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a73581d4-8b4e-410c-a464-99a483e0e048]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:42:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:21.226 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8484b9b-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:21 compute-1 kernel: tapc8484b9b-b0: left promiscuous mode
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:21.264 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[1926f409-35d0-406e-b22a-b16d07950a20]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:42:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:21.296 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ef59a5f1-118d-4766-b197-850d0b679021]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:42:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:21.297 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[72cdb271-5f35-4f1e-927c-49fe8ef9311f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:42:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:21.317 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[76a66ad0-36cd-4541-9270-690fcb1cfe58]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1555223, 'reachable_time': 35736, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298996, 'error': None, 'target': 'ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:42:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:21.320 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:42:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:21.320 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[4300c483-7902-4238-9285-c0dc1b5ded07]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:42:21 compute-1 systemd[1]: run-netns-ovnmeta\x2dc8484b9b\x2db34e\x2d4c32\x2db987\x2d029d8fcb2a28.mount: Deactivated successfully.
Sep 30 18:42:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:21.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.547 2 DEBUG nova.virt.libvirt.vif [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:40:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-771828615',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-771828615',id=30,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:40:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='63c45bef63ef4b9f895b3bab865e1a84',ramdisk_id='',reservation_id='r-0wzw0y3q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-134702932',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-134702932-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:42:06Z,user_data=None,user_id='5717e8cb8548429b948a23763350ab4a',uuid=86b9b1e5-516e-43c2-b180-7ef40f7c1c67,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c8821620-973c-4db8-9c4b-766e7751348e", "address": "fa:16:3e:49:7f:fe", "network": {"id": "c8484b9b-b34e-4c32-b987-029d8fcb2a28", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-862913257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c9029a2856de43388bcee1a38d165449", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8821620-97", "ovs_interfaceid": "c8821620-973c-4db8-9c4b-766e7751348e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.547 2 DEBUG nova.network.os_vif_util [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Converting VIF {"id": "c8821620-973c-4db8-9c4b-766e7751348e", "address": "fa:16:3e:49:7f:fe", "network": {"id": "c8484b9b-b34e-4c32-b987-029d8fcb2a28", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-862913257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c9029a2856de43388bcee1a38d165449", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8821620-97", "ovs_interfaceid": "c8821620-973c-4db8-9c4b-766e7751348e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.548 2 DEBUG nova.network.os_vif_util [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:49:7f:fe,bridge_name='br-int',has_traffic_filtering=True,id=c8821620-973c-4db8-9c4b-766e7751348e,network=Network(c8484b9b-b34e-4c32-b987-029d8fcb2a28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8821620-97') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.549 2 DEBUG os_vif [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:49:7f:fe,bridge_name='br-int',has_traffic_filtering=True,id=c8821620-973c-4db8-9c4b-766e7751348e,network=Network(c8484b9b-b34e-4c32-b987-029d8fcb2a28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8821620-97') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.552 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8821620-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.558 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a1db3761-b95b-4e24-8d21-62664fdff708) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:21 compute-1 nova_compute[238822]: 2025-09-30 18:42:21.563 2 INFO os_vif [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:49:7f:fe,bridge_name='br-int',has_traffic_filtering=True,id=c8821620-973c-4db8-9c4b-766e7751348e,network=Network(c8484b9b-b34e-4c32-b987-029d8fcb2a28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8821620-97')
Sep 30 18:42:21 compute-1 ceph-mon[75484]: pgmap v1917: 353 pgs: 353 active+clean; 121 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:42:22 compute-1 nova_compute[238822]: 2025-09-30 18:42:22.040 2 INFO nova.virt.libvirt.driver [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Deleting instance files /var/lib/nova/instances/86b9b1e5-516e-43c2-b180-7ef40f7c1c67_del
Sep 30 18:42:22 compute-1 nova_compute[238822]: 2025-09-30 18:42:22.041 2 INFO nova.virt.libvirt.driver [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Deletion of /var/lib/nova/instances/86b9b1e5-516e-43c2-b180-7ef40f7c1c67_del complete
Sep 30 18:42:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:22 compute-1 nova_compute[238822]: 2025-09-30 18:42:22.557 2 INFO nova.compute.manager [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Took 1.76 seconds to destroy the instance on the hypervisor.
Sep 30 18:42:22 compute-1 nova_compute[238822]: 2025-09-30 18:42:22.557 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:42:22 compute-1 nova_compute[238822]: 2025-09-30 18:42:22.558 2 DEBUG nova.compute.manager [-] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:42:22 compute-1 nova_compute[238822]: 2025-09-30 18:42:22.558 2 DEBUG nova.network.neutron [-] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:42:22 compute-1 nova_compute[238822]: 2025-09-30 18:42:22.558 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:42:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:42:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:22.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:22 compute-1 nova_compute[238822]: 2025-09-30 18:42:22.712 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:42:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:23 compute-1 nova_compute[238822]: 2025-09-30 18:42:23.086 2 DEBUG nova.compute.manager [req-7543ff6c-7107-47e1-bfff-ee87788e8498 req-d6956dd7-8149-4b79-b082-5ac98ea510fb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Received event network-vif-deleted-c8821620-973c-4db8-9c4b-766e7751348e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:42:23 compute-1 nova_compute[238822]: 2025-09-30 18:42:23.086 2 INFO nova.compute.manager [req-7543ff6c-7107-47e1-bfff-ee87788e8498 req-d6956dd7-8149-4b79-b082-5ac98ea510fb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Neutron deleted interface c8821620-973c-4db8-9c4b-766e7751348e; detaching it from the instance and deleting it from the info cache
Sep 30 18:42:23 compute-1 nova_compute[238822]: 2025-09-30 18:42:23.087 2 DEBUG nova.network.neutron [req-7543ff6c-7107-47e1-bfff-ee87788e8498 req-d6956dd7-8149-4b79-b082-5ac98ea510fb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:42:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:23 compute-1 nova_compute[238822]: 2025-09-30 18:42:23.109 2 DEBUG nova.compute.manager [req-b85e69e2-ad7b-4327-a34c-cf95f00672ca req-41e51898-8556-46e4-94aa-61ca918bb4bf 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Received event network-vif-unplugged-c8821620-973c-4db8-9c4b-766e7751348e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:42:23 compute-1 nova_compute[238822]: 2025-09-30 18:42:23.110 2 DEBUG oslo_concurrency.lockutils [req-b85e69e2-ad7b-4327-a34c-cf95f00672ca req-41e51898-8556-46e4-94aa-61ca918bb4bf 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "86b9b1e5-516e-43c2-b180-7ef40f7c1c67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:42:23 compute-1 nova_compute[238822]: 2025-09-30 18:42:23.110 2 DEBUG oslo_concurrency.lockutils [req-b85e69e2-ad7b-4327-a34c-cf95f00672ca req-41e51898-8556-46e4-94aa-61ca918bb4bf 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "86b9b1e5-516e-43c2-b180-7ef40f7c1c67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:42:23 compute-1 nova_compute[238822]: 2025-09-30 18:42:23.111 2 DEBUG oslo_concurrency.lockutils [req-b85e69e2-ad7b-4327-a34c-cf95f00672ca req-41e51898-8556-46e4-94aa-61ca918bb4bf 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "86b9b1e5-516e-43c2-b180-7ef40f7c1c67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:42:23 compute-1 nova_compute[238822]: 2025-09-30 18:42:23.111 2 DEBUG nova.compute.manager [req-b85e69e2-ad7b-4327-a34c-cf95f00672ca req-41e51898-8556-46e4-94aa-61ca918bb4bf 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] No waiting events found dispatching network-vif-unplugged-c8821620-973c-4db8-9c4b-766e7751348e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:42:23 compute-1 nova_compute[238822]: 2025-09-30 18:42:23.111 2 DEBUG nova.compute.manager [req-b85e69e2-ad7b-4327-a34c-cf95f00672ca req-41e51898-8556-46e4-94aa-61ca918bb4bf 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Received event network-vif-unplugged-c8821620-973c-4db8-9c4b-766e7751348e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:42:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:23.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:23 compute-1 nova_compute[238822]: 2025-09-30 18:42:23.523 2 DEBUG nova.network.neutron [-] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:42:23 compute-1 nova_compute[238822]: 2025-09-30 18:42:23.596 2 DEBUG nova.compute.manager [req-7543ff6c-7107-47e1-bfff-ee87788e8498 req-d6956dd7-8149-4b79-b082-5ac98ea510fb 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Detach interface failed, port_id=c8821620-973c-4db8-9c4b-766e7751348e, reason: Instance 86b9b1e5-516e-43c2-b180-7ef40f7c1c67 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:42:23 compute-1 ceph-mon[75484]: pgmap v1918: 353 pgs: 353 active+clean; 121 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:42:24 compute-1 nova_compute[238822]: 2025-09-30 18:42:24.032 2 INFO nova.compute.manager [-] [instance: 86b9b1e5-516e-43c2-b180-7ef40f7c1c67] Took 1.47 seconds to deallocate network for instance.
Sep 30 18:42:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:42:24 compute-1 nova_compute[238822]: 2025-09-30 18:42:24.558 2 DEBUG oslo_concurrency.lockutils [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:42:24 compute-1 nova_compute[238822]: 2025-09-30 18:42:24.559 2 DEBUG oslo_concurrency.lockutils [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:42:24 compute-1 nova_compute[238822]: 2025-09-30 18:42:24.565 2 DEBUG oslo_concurrency.lockutils [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:42:24 compute-1 nova_compute[238822]: 2025-09-30 18:42:24.598 2 INFO nova.scheduler.client.report [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Deleted allocations for instance 86b9b1e5-516e-43c2-b180-7ef40f7c1c67
Sep 30 18:42:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:24.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:24 compute-1 ceph-mon[75484]: pgmap v1919: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:42:24 compute-1 nova_compute[238822]: 2025-09-30 18:42:24.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:25.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:25 compute-1 nova_compute[238822]: 2025-09-30 18:42:25.633 2 DEBUG oslo_concurrency.lockutils [None req-05b68802-97c4-4483-8de8-14f7bc361c22 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lock "86b9b1e5-516e-43c2-b180-7ef40f7c1c67" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.452s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:42:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:26 compute-1 nova_compute[238822]: 2025-09-30 18:42:26.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:26.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:27 compute-1 ceph-mon[75484]: pgmap v1920: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:42:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:27.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:28.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:42:29 compute-1 ceph-mon[75484]: pgmap v1921: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:42:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:29.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:29 compute-1 nova_compute[238822]: 2025-09-30 18:42:29.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:30.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:31 compute-1 ceph-mon[75484]: pgmap v1922: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:42:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:31.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:31 compute-1 nova_compute[238822]: 2025-09-30 18:42:31.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:32 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/474572149' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:42:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:42:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:32.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:42:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:33.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:33 compute-1 ceph-mon[75484]: pgmap v1923: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:42:33 compute-1 sshd-session[299029]: Invalid user superadmin from 8.243.64.201 port 46494
Sep 30 18:42:33 compute-1 sshd-session[299029]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:42:33 compute-1 sshd-session[299029]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:42:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:42:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:34.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:34 compute-1 unix_chkpwd[299033]: password check failed for user (root)
Sep 30 18:42:34 compute-1 sshd-session[299027]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:42:34 compute-1 nova_compute[238822]: 2025-09-30 18:42:34.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:35 compute-1 sshd-session[299029]: Failed password for invalid user superadmin from 8.243.64.201 port 46494 ssh2
Sep 30 18:42:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:35.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:35 compute-1 ceph-mon[75484]: pgmap v1924: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:42:35 compute-1 podman[299036]: 2025-09-30 18:42:35.563756217 +0000 UTC m=+0.090172164 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:42:35 compute-1 sshd-session[299029]: Received disconnect from 8.243.64.201 port 46494:11: Bye Bye [preauth]
Sep 30 18:42:35 compute-1 sshd-session[299029]: Disconnected from invalid user superadmin 8.243.64.201 port 46494 [preauth]
Sep 30 18:42:35 compute-1 podman[299035]: 2025-09-30 18:42:35.605077588 +0000 UTC m=+0.135753650 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 18:42:35 compute-1 podman[249638]: time="2025-09-30T18:42:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:42:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:42:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:42:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:42:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8360 "" "Go-http-client/1.1"
Sep 30 18:42:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:36 compute-1 sshd-session[299027]: Failed password for root from 192.210.160.141 port 36472 ssh2
Sep 30 18:42:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Sep 30 18:42:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3785611316' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:42:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Sep 30 18:42:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3785611316' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:42:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3785611316' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:42:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3785611316' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:42:36 compute-1 nova_compute[238822]: 2025-09-30 18:42:36.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:36.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:37.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:37 compute-1 ceph-mon[75484]: pgmap v1925: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:42:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:42:37 compute-1 sshd-session[299027]: Connection closed by authenticating user root 192.210.160.141 port 36472 [preauth]
Sep 30 18:42:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:38.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:42:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:39.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:39 compute-1 ceph-mon[75484]: pgmap v1926: 353 pgs: 353 active+clean; 88 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:42:39 compute-1 nova_compute[238822]: 2025-09-30 18:42:39.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:40 compute-1 sudo[299091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:42:40 compute-1 sudo[299091]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:42:40 compute-1 sudo[299091]: pam_unix(sudo:session): session closed for user root
Sep 30 18:42:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:40 compute-1 podman[299115]: 2025-09-30 18:42:40.096734582 +0000 UTC m=+0.077022741 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 18:42:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:40.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:41.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:41 compute-1 nova_compute[238822]: 2025-09-30 18:42:41.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:41 compute-1 ceph-mon[75484]: pgmap v1927: 353 pgs: 353 active+clean; 88 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:42:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:42.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:43.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:43 compute-1 ceph-mon[75484]: pgmap v1928: 353 pgs: 353 active+clean; 88 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:42:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/78543690' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:42:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2795778409' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:42:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:42:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:44.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:44 compute-1 nova_compute[238822]: 2025-09-30 18:42:44.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:45.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:45 compute-1 ceph-mon[75484]: pgmap v1929: 353 pgs: 353 active+clean; 88 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:42:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:46 compute-1 nova_compute[238822]: 2025-09-30 18:42:46.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:46.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:46 compute-1 ceph-mon[75484]: pgmap v1930: 353 pgs: 353 active+clean; 88 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:42:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:47.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:48 compute-1 podman[299145]: 2025-09-30 18:42:48.549584345 +0000 UTC m=+0.093486913 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, org.label-schema.build-date=20250930)
Sep 30 18:42:48 compute-1 podman[299147]: 2025-09-30 18:42:48.573887308 +0000 UTC m=+0.110583173 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:42:48 compute-1 podman[299146]: 2025-09-30 18:42:48.58436555 +0000 UTC m=+0.118880736 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Sep 30 18:42:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:48.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:42:49 compute-1 openstack_network_exporter[251957]: ERROR   18:42:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:42:49 compute-1 openstack_network_exporter[251957]: ERROR   18:42:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:42:49 compute-1 openstack_network_exporter[251957]: ERROR   18:42:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:42:49 compute-1 openstack_network_exporter[251957]: ERROR   18:42:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:42:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:42:49 compute-1 openstack_network_exporter[251957]: ERROR   18:42:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:42:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:42:49 compute-1 ceph-mon[75484]: pgmap v1931: 353 pgs: 353 active+clean; 88 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Sep 30 18:42:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:49.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:50 compute-1 nova_compute[238822]: 2025-09-30 18:42:50.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:50.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:51 compute-1 ceph-mon[75484]: pgmap v1932: 353 pgs: 353 active+clean; 88 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 7.7 KiB/s rd, 13 KiB/s wr, 10 op/s
Sep 30 18:42:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:42:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:51.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:42:51 compute-1 nova_compute[238822]: 2025-09-30 18:42:51.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:52 compute-1 nova_compute[238822]: 2025-09-30 18:42:52.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:42:52 compute-1 nova_compute[238822]: 2025-09-30 18:42:52.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:42:52 compute-1 nova_compute[238822]: 2025-09-30 18:42:52.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:42:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:42:52 compute-1 nova_compute[238822]: 2025-09-30 18:42:52.584 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:42:52 compute-1 nova_compute[238822]: 2025-09-30 18:42:52.584 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:42:52 compute-1 nova_compute[238822]: 2025-09-30 18:42:52.585 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:42:52 compute-1 nova_compute[238822]: 2025-09-30 18:42:52.585 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:42:52 compute-1 nova_compute[238822]: 2025-09-30 18:42:52.585 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:42:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:52.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:42:53 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/222232711' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:42:53 compute-1 nova_compute[238822]: 2025-09-30 18:42:53.035 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:42:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:53 compute-1 nova_compute[238822]: 2025-09-30 18:42:53.298 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:42:53 compute-1 nova_compute[238822]: 2025-09-30 18:42:53.300 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:42:53 compute-1 nova_compute[238822]: 2025-09-30 18:42:53.332 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:42:53 compute-1 nova_compute[238822]: 2025-09-30 18:42:53.333 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4654MB free_disk=39.971275329589844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:42:53 compute-1 nova_compute[238822]: 2025-09-30 18:42:53.334 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:42:53 compute-1 nova_compute[238822]: 2025-09-30 18:42:53.334 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:42:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:53.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:53 compute-1 ceph-mon[75484]: pgmap v1933: 353 pgs: 353 active+clean; 88 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 7.7 KiB/s rd, 13 KiB/s wr, 10 op/s
Sep 30 18:42:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/222232711' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:42:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:42:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:54.416 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:42:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:54.417 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:42:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:42:54.417 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:42:54 compute-1 nova_compute[238822]: 2025-09-30 18:42:54.473 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:42:54 compute-1 nova_compute[238822]: 2025-09-30 18:42:54.474 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:42:53 up  4:20,  0 user,  load average: 0.42, 0.48, 0.55\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:42:54 compute-1 nova_compute[238822]: 2025-09-30 18:42:54.500 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:42:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1660351972' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:42:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:54.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:42:54 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/166605120' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:42:54 compute-1 nova_compute[238822]: 2025-09-30 18:42:54.972 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:42:54 compute-1 nova_compute[238822]: 2025-09-30 18:42:54.980 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:42:55 compute-1 nova_compute[238822]: 2025-09-30 18:42:55.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:55 compute-1 nova_compute[238822]: 2025-09-30 18:42:55.492 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:42:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:55.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:55 compute-1 ceph-mon[75484]: pgmap v1934: 353 pgs: 353 active+clean; 88 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:42:55 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/166605120' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:42:56 compute-1 nova_compute[238822]: 2025-09-30 18:42:56.007 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:42:56 compute-1 nova_compute[238822]: 2025-09-30 18:42:56.008 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.673s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:42:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:56 compute-1 nova_compute[238822]: 2025-09-30 18:42:56.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:42:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:42:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:56.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:42:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:57.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:57 compute-1 ceph-mon[75484]: pgmap v1935: 353 pgs: 353 active+clean; 88 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:42:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3436248168' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:42:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3436248168' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:42:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:58 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 18:42:58 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 18:42:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2625569122' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:42:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:42:58.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:59 compute-1 nova_compute[238822]: 2025-09-30 18:42:59.008 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:42:59 compute-1 nova_compute[238822]: 2025-09-30 18:42:59.008 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:42:59 compute-1 nova_compute[238822]: 2025-09-30 18:42:59.009 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:42:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:42:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:42:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:42:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:42:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:42:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:42:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:42:59.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:42:59 compute-1 ceph-mon[75484]: pgmap v1936: 353 pgs: 353 active+clean; 88 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 75 op/s
Sep 30 18:43:00 compute-1 nova_compute[238822]: 2025-09-30 18:43:00.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:00 compute-1 sudo[299266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:43:00 compute-1 sudo[299266]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:43:00 compute-1 sudo[299266]: pam_unix(sudo:session): session closed for user root
Sep 30 18:43:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/124713597' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:43:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:00.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:01.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:01 compute-1 sudo[299295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:43:01 compute-1 sudo[299295]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:43:01 compute-1 sudo[299295]: pam_unix(sudo:session): session closed for user root
Sep 30 18:43:01 compute-1 nova_compute[238822]: 2025-09-30 18:43:01.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:01 compute-1 sudo[299320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:43:01 compute-1 sudo[299320]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:43:01 compute-1 ceph-mon[75484]: pgmap v1937: 353 pgs: 353 active+clean; 88 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 65 op/s
Sep 30 18:43:01 compute-1 unix_chkpwd[299352]: password check failed for user (root)
Sep 30 18:43:01 compute-1 sshd-session[299265]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:43:02 compute-1 nova_compute[238822]: 2025-09-30 18:43:02.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:43:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:02 compute-1 sudo[299320]: pam_unix(sudo:session): session closed for user root
Sep 30 18:43:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:43:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:43:02 compute-1 ceph-mon[75484]: pgmap v1938: 353 pgs: 353 active+clean; 88 MiB data, 374 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 65 op/s
Sep 30 18:43:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:43:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:43:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:43:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:43:02 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:43:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:02.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:03.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:03 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:03.545 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:43:03 compute-1 nova_compute[238822]: 2025-09-30 18:43:03.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:03 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:03.546 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:43:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:43:04 compute-1 sshd-session[299265]: Failed password for root from 192.210.160.141 port 36308 ssh2
Sep 30 18:43:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:04.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:05 compute-1 sshd-session[299265]: Connection closed by authenticating user root 192.210.160.141 port 36308 [preauth]
Sep 30 18:43:05 compute-1 nova_compute[238822]: 2025-09-30 18:43:05.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:43:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:05 compute-1 nova_compute[238822]: 2025-09-30 18:43:05.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:05 compute-1 ceph-mon[75484]: pgmap v1939: 353 pgs: 353 active+clean; 121 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 120 op/s
Sep 30 18:43:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:05.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:05 compute-1 podman[249638]: time="2025-09-30T18:43:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:43:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:43:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:43:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:43:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8363 "" "Go-http-client/1.1"
Sep 30 18:43:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:06 compute-1 podman[299385]: 2025-09-30 18:43:06.564492243 +0000 UTC m=+0.096989097 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:43:06 compute-1 nova_compute[238822]: 2025-09-30 18:43:06.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:06 compute-1 podman[299384]: 2025-09-30 18:43:06.618296738 +0000 UTC m=+0.154067720 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Sep 30 18:43:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:06.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:07 compute-1 nova_compute[238822]: 2025-09-30 18:43:07.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:43:07 compute-1 sudo[299435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:43:07 compute-1 sudo[299435]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:43:07 compute-1 sudo[299435]: pam_unix(sudo:session): session closed for user root
Sep 30 18:43:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:07 compute-1 ceph-mon[75484]: pgmap v1940: 353 pgs: 353 active+clean; 121 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 146 KiB/s rd, 2.1 MiB/s wr, 56 op/s
Sep 30 18:43:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:43:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:43:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:43:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:07.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:08.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:43:09 compute-1 ceph-mon[75484]: pgmap v1941: 353 pgs: 353 active+clean; 167 MiB data, 420 MiB used, 40 GiB / 40 GiB avail; 163 KiB/s rd, 3.9 MiB/s wr, 83 op/s
Sep 30 18:43:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1509196512' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:43:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2973940694' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:43:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:43:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:09.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:43:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:10 compute-1 nova_compute[238822]: 2025-09-30 18:43:10.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:10 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:10.548 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:43:10 compute-1 podman[299463]: 2025-09-30 18:43:10.560823465 +0000 UTC m=+0.096852174 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 18:43:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:10.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:11 compute-1 ceph-mon[75484]: pgmap v1942: 353 pgs: 353 active+clean; 167 MiB data, 420 MiB used, 40 GiB / 40 GiB avail; 160 KiB/s rd, 3.9 MiB/s wr, 82 op/s
Sep 30 18:43:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:11.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:11 compute-1 nova_compute[238822]: 2025-09-30 18:43:11.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:12 compute-1 sshd-session[299484]: Invalid user ubuntu from 103.153.190.105 port 40464
Sep 30 18:43:12 compute-1 sshd-session[299484]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:43:12 compute-1 sshd-session[299484]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:43:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:12.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:13 compute-1 ceph-mon[75484]: pgmap v1943: 353 pgs: 353 active+clean; 167 MiB data, 420 MiB used, 40 GiB / 40 GiB avail; 160 KiB/s rd, 3.9 MiB/s wr, 82 op/s
Sep 30 18:43:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:13.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:43:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:14.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:14 compute-1 sshd-session[299484]: Failed password for invalid user ubuntu from 103.153.190.105 port 40464 ssh2
Sep 30 18:43:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:15 compute-1 nova_compute[238822]: 2025-09-30 18:43:15.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:15 compute-1 ceph-mon[75484]: pgmap v1944: 353 pgs: 353 active+clean; 167 MiB data, 420 MiB used, 40 GiB / 40 GiB avail; 162 KiB/s rd, 3.9 MiB/s wr, 87 op/s
Sep 30 18:43:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:15.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:16 compute-1 sshd-session[299484]: Received disconnect from 103.153.190.105 port 40464:11: Bye Bye [preauth]
Sep 30 18:43:16 compute-1 sshd-session[299484]: Disconnected from invalid user ubuntu 103.153.190.105 port 40464 [preauth]
Sep 30 18:43:16 compute-1 nova_compute[238822]: 2025-09-30 18:43:16.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:16.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:17 compute-1 ceph-mon[75484]: pgmap v1945: 353 pgs: 353 active+clean; 167 MiB data, 420 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Sep 30 18:43:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:17.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:18.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:43:19 compute-1 openstack_network_exporter[251957]: ERROR   18:43:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:43:19 compute-1 openstack_network_exporter[251957]: ERROR   18:43:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:43:19 compute-1 openstack_network_exporter[251957]: ERROR   18:43:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:43:19 compute-1 openstack_network_exporter[251957]: ERROR   18:43:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:43:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:43:19 compute-1 openstack_network_exporter[251957]: ERROR   18:43:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:43:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:43:19 compute-1 ceph-mon[75484]: pgmap v1946: 353 pgs: 353 active+clean; 167 MiB data, 420 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Sep 30 18:43:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:19.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:19 compute-1 podman[299494]: 2025-09-30 18:43:19.593958224 +0000 UTC m=+0.105912317 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:43:19 compute-1 podman[299495]: 2025-09-30 18:43:19.601083446 +0000 UTC m=+0.109251297 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Sep 30 18:43:19 compute-1 podman[299496]: 2025-09-30 18:43:19.602602487 +0000 UTC m=+0.104336955 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:43:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:20 compute-1 nova_compute[238822]: 2025-09-30 18:43:20.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:20 compute-1 sudo[299551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:43:20 compute-1 sudo[299551]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:43:20 compute-1 sudo[299551]: pam_unix(sudo:session): session closed for user root
Sep 30 18:43:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:20.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:21 compute-1 ceph-mon[75484]: pgmap v1947: 353 pgs: 353 active+clean; 167 MiB data, 420 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:43:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:43:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:21.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:43:21 compute-1 nova_compute[238822]: 2025-09-30 18:43:21.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:43:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:22.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:23.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:23 compute-1 ceph-mon[75484]: pgmap v1948: 353 pgs: 353 active+clean; 167 MiB data, 420 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:43:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:43:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:24.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:25 compute-1 nova_compute[238822]: 2025-09-30 18:43:25.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:25.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:25 compute-1 ceph-mon[75484]: pgmap v1949: 353 pgs: 353 active+clean; 167 MiB data, 420 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:43:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:26 compute-1 nova_compute[238822]: 2025-09-30 18:43:26.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:26.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:27.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:27 compute-1 ceph-mon[75484]: pgmap v1950: 353 pgs: 353 active+clean; 167 MiB data, 420 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1023 B/s wr, 69 op/s
Sep 30 18:43:27 compute-1 sshd-session[299583]: Invalid user hadoop from 192.210.160.141 port 50404
Sep 30 18:43:27 compute-1 sshd-session[299583]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:43:27 compute-1 sshd-session[299583]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:43:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:28.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:43:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:29.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:29 compute-1 sshd-session[299583]: Failed password for invalid user hadoop from 192.210.160.141 port 50404 ssh2
Sep 30 18:43:29 compute-1 ceph-mon[75484]: pgmap v1951: 353 pgs: 353 active+clean; 200 MiB data, 445 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 125 op/s
Sep 30 18:43:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:30 compute-1 nova_compute[238822]: 2025-09-30 18:43:30.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:30 compute-1 sshd-session[299583]: Connection closed by invalid user hadoop 192.210.160.141 port 50404 [preauth]
Sep 30 18:43:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:30.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:31.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:31 compute-1 nova_compute[238822]: 2025-09-30 18:43:31.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:31 compute-1 ceph-mon[75484]: pgmap v1952: 353 pgs: 353 active+clean; 200 MiB data, 445 MiB used, 40 GiB / 40 GiB avail; 145 KiB/s rd, 2.1 MiB/s wr, 56 op/s
Sep 30 18:43:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:32 compute-1 ceph-mon[75484]: pgmap v1953: 353 pgs: 353 active+clean; 200 MiB data, 445 MiB used, 40 GiB / 40 GiB avail; 145 KiB/s rd, 2.1 MiB/s wr, 56 op/s
Sep 30 18:43:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:43:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:32.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:43:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:43:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:33.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:43:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:43:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:34.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:35 compute-1 nova_compute[238822]: 2025-09-30 18:43:35.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:35 compute-1 ceph-mon[75484]: pgmap v1954: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 146 KiB/s rd, 2.1 MiB/s wr, 57 op/s
Sep 30 18:43:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:35.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:35 compute-1 podman[249638]: time="2025-09-30T18:43:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:43:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:43:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:43:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:43:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8355 "" "Go-http-client/1.1"
Sep 30 18:43:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:36 compute-1 nova_compute[238822]: 2025-09-30 18:43:36.153 2 DEBUG nova.virt.libvirt.driver [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Creating tmpfile /var/lib/nova/instances/tmpgbao7i09 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:43:36 compute-1 nova_compute[238822]: 2025-09-30 18:43:36.154 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:43:36 compute-1 nova_compute[238822]: 2025-09-30 18:43:36.161 2 DEBUG nova.compute.manager [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgbao7i09',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:43:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1721697830' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:43:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1721697830' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:43:36 compute-1 nova_compute[238822]: 2025-09-30 18:43:36.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:36.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:37 compute-1 ceph-mon[75484]: pgmap v1955: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 145 KiB/s rd, 2.1 MiB/s wr, 57 op/s
Sep 30 18:43:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:43:37 compute-1 podman[299598]: 2025-09-30 18:43:37.565274632 +0000 UTC m=+0.099166546 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:43:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:37.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:37 compute-1 podman[299597]: 2025-09-30 18:43:37.635438098 +0000 UTC m=+0.175695173 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 18:43:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:38 compute-1 nova_compute[238822]: 2025-09-30 18:43:38.205 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:43:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:43:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:38.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:43:38 compute-1 unix_chkpwd[299653]: password check failed for user (root)
Sep 30 18:43:38 compute-1 sshd-session[299651]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201  user=root
Sep 30 18:43:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:43:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:39.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:39 compute-1 ceph-mon[75484]: pgmap v1956: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 146 KiB/s rd, 2.1 MiB/s wr, 57 op/s
Sep 30 18:43:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:40 compute-1 nova_compute[238822]: 2025-09-30 18:43:40.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:40 compute-1 sshd-session[299651]: Failed password for root from 8.243.64.201 port 39408 ssh2
Sep 30 18:43:40 compute-1 sshd-session[299651]: Received disconnect from 8.243.64.201 port 39408:11: Bye Bye [preauth]
Sep 30 18:43:40 compute-1 sshd-session[299651]: Disconnected from authenticating user root 8.243.64.201 port 39408 [preauth]
Sep 30 18:43:40 compute-1 sudo[299656]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:43:40 compute-1 sudo[299656]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:43:40 compute-1 sudo[299656]: pam_unix(sudo:session): session closed for user root
Sep 30 18:43:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:40.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:41.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:41 compute-1 nova_compute[238822]: 2025-09-30 18:43:41.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:41 compute-1 ceph-mon[75484]: pgmap v1957: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:43:41 compute-1 podman[299682]: 2025-09-30 18:43:41.609975294 +0000 UTC m=+0.151659686 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 18:43:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:42 compute-1 nova_compute[238822]: 2025-09-30 18:43:42.206 2 DEBUG nova.compute.manager [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgbao7i09',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='0bd62f93-1956-4b12-a38a-10deee907b16',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:43:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:42.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:43 compute-1 nova_compute[238822]: 2025-09-30 18:43:43.225 2 DEBUG oslo_concurrency.lockutils [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-0bd62f93-1956-4b12-a38a-10deee907b16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:43:43 compute-1 nova_compute[238822]: 2025-09-30 18:43:43.226 2 DEBUG oslo_concurrency.lockutils [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-0bd62f93-1956-4b12-a38a-10deee907b16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:43:43 compute-1 nova_compute[238822]: 2025-09-30 18:43:43.227 2 DEBUG nova.network.neutron [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:43:43 compute-1 ovn_controller[135204]: 2025-09-30T18:43:43Z|00268|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Sep 30 18:43:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:43.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:43 compute-1 ceph-mon[75484]: pgmap v1958: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:43:43 compute-1 nova_compute[238822]: 2025-09-30 18:43:43.744 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:43:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:43:44 compute-1 nova_compute[238822]: 2025-09-30 18:43:44.301 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:43:44 compute-1 nova_compute[238822]: 2025-09-30 18:43:44.500 2 DEBUG nova.network.neutron [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Updating instance_info_cache with network_info: [{"id": "a362175f-2dba-4a9f-bc07-4260625a8ce0", "address": "fa:16:3e:52:c0:eb", "network": {"id": "c8484b9b-b34e-4c32-b987-029d8fcb2a28", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-862913257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c9029a2856de43388bcee1a38d165449", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa362175f-2d", "ovs_interfaceid": "a362175f-2dba-4a9f-bc07-4260625a8ce0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:43:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:44.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.011 2 DEBUG oslo_concurrency.lockutils [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-0bd62f93-1956-4b12-a38a-10deee907b16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.033 2 DEBUG nova.virt.libvirt.driver [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgbao7i09',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='0bd62f93-1956-4b12-a38a-10deee907b16',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.035 2 DEBUG nova.virt.libvirt.driver [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Creating instance directory: /var/lib/nova/instances/0bd62f93-1956-4b12-a38a-10deee907b16 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.035 2 DEBUG nova.virt.libvirt.driver [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Ensure instance console log exists: /var/lib/nova/instances/0bd62f93-1956-4b12-a38a-10deee907b16/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.036 2 DEBUG nova.virt.libvirt.driver [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.037 2 DEBUG nova.virt.libvirt.vif [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:42:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1791586084',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1791586084',id=32,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:42:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='63c45bef63ef4b9f895b3bab865e1a84',ramdisk_id='',reservation_id='r-1jmywl2f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-134702932',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-134702932-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:42:49Z,user_data=None,user_id='5717e8cb8548429b948a23763350ab4a',uuid=0bd62f93-1956-4b12-a38a-10deee907b16,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a362175f-2dba-4a9f-bc07-4260625a8ce0", "address": "fa:16:3e:52:c0:eb", "network": {"id": "c8484b9b-b34e-4c32-b987-029d8fcb2a28", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-862913257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c9029a2856de43388bcee1a38d165449", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa362175f-2d", "ovs_interfaceid": "a362175f-2dba-4a9f-bc07-4260625a8ce0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.038 2 DEBUG nova.network.os_vif_util [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "a362175f-2dba-4a9f-bc07-4260625a8ce0", "address": "fa:16:3e:52:c0:eb", "network": {"id": "c8484b9b-b34e-4c32-b987-029d8fcb2a28", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-862913257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c9029a2856de43388bcee1a38d165449", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa362175f-2d", "ovs_interfaceid": "a362175f-2dba-4a9f-bc07-4260625a8ce0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.039 2 DEBUG nova.network.os_vif_util [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:c0:eb,bridge_name='br-int',has_traffic_filtering=True,id=a362175f-2dba-4a9f-bc07-4260625a8ce0,network=Network(c8484b9b-b34e-4c32-b987-029d8fcb2a28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa362175f-2d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.040 2 DEBUG os_vif [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:c0:eb,bridge_name='br-int',has_traffic_filtering=True,id=a362175f-2dba-4a9f-bc07-4260625a8ce0,network=Network(c8484b9b-b34e-4c32-b987-029d8fcb2a28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa362175f-2d') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.041 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.042 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.044 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '990b8b1d-9c6f-5c00-b165-dbb844e49ff5', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.083 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa362175f-2d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.084 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa362175f-2d, col_values=(('qos', UUID('7097f417-6e45-40ad-a1d7-e701083541e2')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.084 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa362175f-2d, col_values=(('external_ids', {'iface-id': 'a362175f-2dba-4a9f-bc07-4260625a8ce0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:c0:eb', 'vm-uuid': '0bd62f93-1956-4b12-a38a-10deee907b16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:45 compute-1 NetworkManager[45549]: <info>  [1759257825.0880] manager: (tapa362175f-2d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.098 2 INFO os_vif [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:c0:eb,bridge_name='br-int',has_traffic_filtering=True,id=a362175f-2dba-4a9f-bc07-4260625a8ce0,network=Network(c8484b9b-b34e-4c32-b987-029d8fcb2a28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa362175f-2d')
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.099 2 DEBUG nova.virt.libvirt.driver [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.099 2 DEBUG nova.compute.manager [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgbao7i09',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='0bd62f93-1956-4b12-a38a-10deee907b16',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:43:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.100 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:45.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:45 compute-1 ceph-mon[75484]: pgmap v1959: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 23 KiB/s wr, 2 op/s
Sep 30 18:43:45 compute-1 nova_compute[238822]: 2025-09-30 18:43:45.941 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:43:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:46.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:47.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:47 compute-1 ceph-mon[75484]: pgmap v1960: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 8.1 KiB/s wr, 1 op/s
Sep 30 18:43:47 compute-1 nova_compute[238822]: 2025-09-30 18:43:47.822 2 DEBUG nova.network.neutron [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Port a362175f-2dba-4a9f-bc07-4260625a8ce0 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:43:47 compute-1 nova_compute[238822]: 2025-09-30 18:43:47.839 2 DEBUG nova.compute.manager [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgbao7i09',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='0bd62f93-1956-4b12-a38a-10deee907b16',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:43:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:48.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:43:49 compute-1 openstack_network_exporter[251957]: ERROR   18:43:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:43:49 compute-1 openstack_network_exporter[251957]: ERROR   18:43:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:43:49 compute-1 openstack_network_exporter[251957]: ERROR   18:43:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:43:49 compute-1 openstack_network_exporter[251957]: ERROR   18:43:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:43:49 compute-1 openstack_network_exporter[251957]: ERROR   18:43:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:43:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:49.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:49 compute-1 ceph-mon[75484]: pgmap v1961: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 9.1 KiB/s wr, 2 op/s
Sep 30 18:43:49 compute-1 systemd[1]: Starting dnf makecache...
Sep 30 18:43:49 compute-1 kernel: tapa362175f-2d: entered promiscuous mode
Sep 30 18:43:49 compute-1 nova_compute[238822]: 2025-09-30 18:43:49.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:49 compute-1 NetworkManager[45549]: <info>  [1759257829.9783] manager: (tapa362175f-2d): new Tun device (/org/freedesktop/NetworkManager/Devices/100)
Sep 30 18:43:49 compute-1 ovn_controller[135204]: 2025-09-30T18:43:49Z|00269|binding|INFO|Claiming lport a362175f-2dba-4a9f-bc07-4260625a8ce0 for this additional chassis.
Sep 30 18:43:49 compute-1 ovn_controller[135204]: 2025-09-30T18:43:49Z|00270|binding|INFO|a362175f-2dba-4a9f-bc07-4260625a8ce0: Claiming fa:16:3e:52:c0:eb 10.100.0.8
Sep 30 18:43:50 compute-1 systemd-udevd[299764]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:43:50 compute-1 ovn_controller[135204]: 2025-09-30T18:43:50Z|00271|binding|INFO|Setting lport a362175f-2dba-4a9f-bc07-4260625a8ce0 ovn-installed in OVS
Sep 30 18:43:50 compute-1 nova_compute[238822]: 2025-09-30 18:43:50.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:50 compute-1 nova_compute[238822]: 2025-09-30 18:43:50.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.025 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:c0:eb 10.100.0.8'], port_security=['fa:16:3e:52:c0:eb 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0bd62f93-1956-4b12-a38a-10deee907b16', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63c45bef63ef4b9f895b3bab865e1a84', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'a9025550-4c18-4f21-a560-5b6f52684803', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55e305e6-0f4d-40bc-a70b-ac91f882ec57, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=a362175f-2dba-4a9f-bc07-4260625a8ce0) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.026 144543 INFO neutron.agent.ovn.metadata.agent [-] Port a362175f-2dba-4a9f-bc07-4260625a8ce0 in datapath c8484b9b-b34e-4c32-b987-029d8fcb2a28 unbound from our chassis
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.028 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c8484b9b-b34e-4c32-b987-029d8fcb2a28
Sep 30 18:43:50 compute-1 NetworkManager[45549]: <info>  [1759257830.0299] device (tapa362175f-2d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:43:50 compute-1 NetworkManager[45549]: <info>  [1759257830.0309] device (tapa362175f-2d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:43:50 compute-1 systemd-machined[195911]: New machine qemu-25-instance-00000020.
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.043 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[fb99d124-8813-4b78-9400-aa531b7b8320]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.044 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc8484b9b-b1 in ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:43:50 compute-1 systemd[1]: Started Virtual Machine qemu-25-instance-00000020.
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.047 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc8484b9b-b0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.047 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f805b18a-f972-499a-a309-f14eda0b2107]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.048 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c61ba3a4-db4a-4c26-93be-570aa4ea2c33]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 podman[299712]: 2025-09-30 18:43:50.063985378 +0000 UTC m=+0.135125653 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.063 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[4688ec72-6ac4-4c0f-8ed8-91963695e1fa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 podman[299713]: 2025-09-30 18:43:50.073902015 +0000 UTC m=+0.139173402 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.080 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[39b51498-2130-46ca-b8e7-956d9cd1d112]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 podman[299710]: 2025-09-30 18:43:50.081976882 +0000 UTC m=+0.164940624 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:43:50 compute-1 nova_compute[238822]: 2025-09-30 18:43:50.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.136 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[9a709e52-9d57-49ff-9515-282b8c248715]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.147 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[9a53272b-c63b-48a6-ab73-13ce65fed387]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 NetworkManager[45549]: <info>  [1759257830.1541] manager: (tapc8484b9b-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/101)
Sep 30 18:43:50 compute-1 systemd-udevd[299776]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:43:50 compute-1 nova_compute[238822]: 2025-09-30 18:43:50.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.187 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[89fb1fb8-e5dd-4b12-8a56-4e4afb5e97c1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.190 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[9396e27e-584e-40a1-979e-5b9afc6e3beb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 dnf[299719]: Repository 'gating-repo' is missing name in configuration, using id.
Sep 30 18:43:50 compute-1 NetworkManager[45549]: <info>  [1759257830.2282] device (tapc8484b9b-b0): carrier: link connected
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.230 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[dd648b72-a410-499f-a857-8fa5bd2cb43b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.257 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[713d0edc-a26d-46a0-b288-73b05d78284e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8484b9b-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:bc:cf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1567180, 'reachable_time': 35943, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299815, 'error': None, 'target': 'ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.275 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[cca015cf-f804-457e-bd52-eee432bd9558]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe48:bccf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1567180, 'tstamp': 1567180}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 299817, 'error': None, 'target': 'ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-openstack-barbican-42b4c41831408a8e323  78 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.302 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ba503de2-fcd3-435c-8cb8-3e68175b5c73]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8484b9b-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:bc:cf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1567180, 'reachable_time': 35943, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 299818, 'error': None, 'target': 'ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 132 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-openstack-cinder-1c00d6490d88e436f26ef 134 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.359 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[18ea4cd6-ea37-4f0d-b9c8-d306ae1c5620]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-python-stevedore-c4acc5639fd2329372142 119 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-python-cloudkitty-tests-tempest-3961dc  77 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-os-net-config-a7aafa88064e25852eddee77  81 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.465 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d96e14c0-1512-4e32-86c0-ffdfffb450e7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.468 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8484b9b-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.469 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.470 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8484b9b-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:43:50 compute-1 kernel: tapc8484b9b-b0: entered promiscuous mode
Sep 30 18:43:50 compute-1 NetworkManager[45549]: <info>  [1759257830.4741] manager: (tapc8484b9b-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Sep 30 18:43:50 compute-1 nova_compute[238822]: 2025-09-30 18:43:50.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:50 compute-1 nova_compute[238822]: 2025-09-30 18:43:50.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.479 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc8484b9b-b0, col_values=(('external_ids', {'iface-id': 'd2e69f29-6b3a-46dc-9ed7-12031e1b7d2b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:43:50 compute-1 nova_compute[238822]: 2025-09-30 18:43:50.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:50 compute-1 ovn_controller[135204]: 2025-09-30T18:43:50Z|00272|binding|INFO|Releasing lport d2e69f29-6b3a-46dc-9ed7-12031e1b7d2b from this chassis (sb_readonly=0)
Sep 30 18:43:50 compute-1 nova_compute[238822]: 2025-09-30 18:43:50.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.488 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[54baebbc-e729-4db8-a7c7-4d1efa035947]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.488 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.489 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.489 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for c8484b9b-b34e-4c32-b987-029d8fcb2a28 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.489 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.490 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c526882d-f9a1-49ca-beb4-daeb1d973270]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.491 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.491 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[789872e7-cbb6-4161-a787-994d9baf213f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.492 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-c8484b9b-b34e-4c32-b987-029d8fcb2a28
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID c8484b9b-b34e-4c32-b987-029d8fcb2a28
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:43:50 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:50.493 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'env', 'PROCESS_TAG=haproxy-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c8484b9b-b34e-4c32-b987-029d8fcb2a28.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:43:50 compute-1 nova_compute[238822]: 2025-09-30 18:43:50.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6  66 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-python-designate-tests-tempest-347fdbc  78 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-openstack-glance-1fd12c29b339f30fe823e  88 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 102 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-openstack-manila-3c01b7181572c95dac462 109 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-python-whitebox-neutron-tests-tempest- 109 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-openstack-octavia-ba397f07a7331190208c 113 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-openstack-watcher-c014f81a8647287f6dcc 115 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-python-tcib-c895740e59940c0bad2e206b0f 117 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:50.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-puppet-ceph-b0c245ccde541a63fde0564366 112 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-openstack-swift-dc98a8463506ac520c469a 108 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-python-tempestconf-8515371b7cceebd4282 121 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: delorean-openstack-heat-ui-013accbfd179753bc3f0 122 kB/s | 3.0 kB     00:00
Sep 30 18:43:50 compute-1 dnf[299719]: gating-repo                                     411 kB/s | 1.5 kB     00:00
Sep 30 18:43:51 compute-1 podman[299910]: 2025-09-30 18:43:51.005780289 +0000 UTC m=+0.094603493 container create f3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Sep 30 18:43:51 compute-1 podman[299910]: 2025-09-30 18:43:50.961005456 +0000 UTC m=+0.049828710 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:43:51 compute-1 dnf[299719]: CentOS Stream 9 - BaseOS                         54 kB/s | 7.0 kB     00:00
Sep 30 18:43:51 compute-1 systemd[1]: Started libpod-conmon-f3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42.scope.
Sep 30 18:43:51 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:43:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58add3e71cf95a0bb8aec75369e259938adba19d9a5cc2e25688d72649a79cdc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:43:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:51 compute-1 podman[299910]: 2025-09-30 18:43:51.123989546 +0000 UTC m=+0.212812810 container init f3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Sep 30 18:43:51 compute-1 podman[299910]: 2025-09-30 18:43:51.1341892 +0000 UTC m=+0.223012404 container start f3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 18:43:51 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[299929]: [NOTICE]   (299934) : New worker (299936) forked
Sep 30 18:43:51 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[299929]: [NOTICE]   (299934) : Loading success.
Sep 30 18:43:51 compute-1 dnf[299719]: CentOS Stream 9 - BaseOS                         12 kB/s | 3.9 kB     00:00
Sep 30 18:43:51 compute-1 dnf[299719]: Errors during downloading metadata for repository 'baseos':
Sep 30 18:43:51 compute-1 dnf[299719]:   - Downloading successful, but checksum doesn't match. Calculated: c33587e16099063711748728ddc86ab9a79cc317439cb6505b08596cab68833db56125e364221ebf02cbd6e811e95a16b6342660429b1420843c6481886c67df(sha512)  Expected: 560fcf5558314ba5fd3ab618810735643afd0e6cd44e2ddea354a4e6343cfa9a67c773dae3000fe6d96f56eb046582e723eeae98dc6121fc4692400eee660278(sha512)
Sep 30 18:43:51 compute-1 dnf[299719]: Error: Failed to download metadata for repo 'baseos': Cannot download repomd.xml: Downloading successful, but checksum doesn't match. Calculated: c33587e16099063711748728ddc86ab9a79cc317439cb6505b08596cab68833db56125e364221ebf02cbd6e811e95a16b6342660429b1420843c6481886c67df(sha512)  Expected: 560fcf5558314ba5fd3ab618810735643afd0e6cd44e2ddea354a4e6343cfa9a67c773dae3000fe6d96f56eb046582e723eeae98dc6121fc4692400eee660278(sha512)
Sep 30 18:43:51 compute-1 systemd[1]: dnf-makecache.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 18:43:51 compute-1 systemd[1]: dnf-makecache.service: Failed with result 'exit-code'.
Sep 30 18:43:51 compute-1 systemd[1]: Failed to start dnf makecache.
Sep 30 18:43:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:51.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:51 compute-1 ceph-mon[75484]: pgmap v1962: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 9.1 KiB/s wr, 1 op/s
Sep 30 18:43:52 compute-1 nova_compute[238822]: 2025-09-30 18:43:52.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:43:52 compute-1 nova_compute[238822]: 2025-09-30 18:43:52.056 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:43:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:43:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:52.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:53 compute-1 nova_compute[238822]: 2025-09-30 18:43:53.052 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:43:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:53 compute-1 ovn_controller[135204]: 2025-09-30T18:43:53Z|00273|binding|INFO|Claiming lport a362175f-2dba-4a9f-bc07-4260625a8ce0 for this chassis.
Sep 30 18:43:53 compute-1 ovn_controller[135204]: 2025-09-30T18:43:53Z|00274|binding|INFO|a362175f-2dba-4a9f-bc07-4260625a8ce0: Claiming fa:16:3e:52:c0:eb 10.100.0.8
Sep 30 18:43:53 compute-1 ovn_controller[135204]: 2025-09-30T18:43:53Z|00275|binding|INFO|Setting lport a362175f-2dba-4a9f-bc07-4260625a8ce0 up in Southbound
Sep 30 18:43:53 compute-1 unix_chkpwd[299950]: password check failed for user (root)
Sep 30 18:43:53 compute-1 sshd-session[299946]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:43:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:53.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:53 compute-1 ceph-mon[75484]: pgmap v1963: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 9.1 KiB/s wr, 1 op/s
Sep 30 18:43:54 compute-1 nova_compute[238822]: 2025-09-30 18:43:54.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:43:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:43:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:54.418 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:43:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:54.419 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:43:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:43:54.419 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:43:54 compute-1 nova_compute[238822]: 2025-09-30 18:43:54.516 2 INFO nova.compute.manager [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Post operation of migration started
Sep 30 18:43:54 compute-1 nova_compute[238822]: 2025-09-30 18:43:54.517 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:43:54 compute-1 nova_compute[238822]: 2025-09-30 18:43:54.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:43:54 compute-1 nova_compute[238822]: 2025-09-30 18:43:54.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:43:54 compute-1 nova_compute[238822]: 2025-09-30 18:43:54.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:43:54 compute-1 nova_compute[238822]: 2025-09-30 18:43:54.573 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:43:54 compute-1 nova_compute[238822]: 2025-09-30 18:43:54.573 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:43:54 compute-1 ceph-mon[75484]: pgmap v1964: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 9.2 KiB/s wr, 7 op/s
Sep 30 18:43:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/505879848' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:43:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:54.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:54 compute-1 nova_compute[238822]: 2025-09-30 18:43:54.935 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:43:54 compute-1 nova_compute[238822]: 2025-09-30 18:43:54.936 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:43:55 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:43:55 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3433659692' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:43:55 compute-1 nova_compute[238822]: 2025-09-30 18:43:55.043 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:43:55 compute-1 nova_compute[238822]: 2025-09-30 18:43:55.058 2 DEBUG oslo_concurrency.lockutils [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-0bd62f93-1956-4b12-a38a-10deee907b16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:43:55 compute-1 nova_compute[238822]: 2025-09-30 18:43:55.058 2 DEBUG oslo_concurrency.lockutils [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-0bd62f93-1956-4b12-a38a-10deee907b16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:43:55 compute-1 nova_compute[238822]: 2025-09-30 18:43:55.058 2 DEBUG nova.network.neutron [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:43:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:55 compute-1 nova_compute[238822]: 2025-09-30 18:43:55.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:55 compute-1 nova_compute[238822]: 2025-09-30 18:43:55.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:43:55 compute-1 nova_compute[238822]: 2025-09-30 18:43:55.568 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:43:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:55.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:55 compute-1 sshd-session[299946]: Failed password for root from 192.210.160.141 port 34186 ssh2
Sep 30 18:43:55 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3433659692' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:43:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:56 compute-1 nova_compute[238822]: 2025-09-30 18:43:56.102 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000020 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:43:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:56 compute-1 nova_compute[238822]: 2025-09-30 18:43:56.103 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000020 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:43:56 compute-1 nova_compute[238822]: 2025-09-30 18:43:56.321 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:43:56 compute-1 nova_compute[238822]: 2025-09-30 18:43:56.356 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:43:56 compute-1 nova_compute[238822]: 2025-09-30 18:43:56.358 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:43:56 compute-1 nova_compute[238822]: 2025-09-30 18:43:56.401 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:43:56 compute-1 nova_compute[238822]: 2025-09-30 18:43:56.402 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4406MB free_disk=39.901153564453125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:43:56 compute-1 nova_compute[238822]: 2025-09-30 18:43:56.402 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:43:56 compute-1 nova_compute[238822]: 2025-09-30 18:43:56.403 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:43:56 compute-1 nova_compute[238822]: 2025-09-30 18:43:56.514 2 DEBUG nova.network.neutron [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Updating instance_info_cache with network_info: [{"id": "a362175f-2dba-4a9f-bc07-4260625a8ce0", "address": "fa:16:3e:52:c0:eb", "network": {"id": "c8484b9b-b34e-4c32-b987-029d8fcb2a28", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-862913257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c9029a2856de43388bcee1a38d165449", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa362175f-2d", "ovs_interfaceid": "a362175f-2dba-4a9f-bc07-4260625a8ce0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:43:56 compute-1 ceph-mon[75484]: pgmap v1965: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 1.2 KiB/s wr, 6 op/s
Sep 30 18:43:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:56.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:56 compute-1 sshd-session[299946]: Connection closed by authenticating user root 192.210.160.141 port 34186 [preauth]
Sep 30 18:43:57 compute-1 nova_compute[238822]: 2025-09-30 18:43:57.033 2 DEBUG oslo_concurrency.lockutils [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-0bd62f93-1956-4b12-a38a-10deee907b16" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:43:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:57 compute-1 nova_compute[238822]: 2025-09-30 18:43:57.425 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Migration for instance 0bd62f93-1956-4b12-a38a-10deee907b16 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 18:43:57 compute-1 nova_compute[238822]: 2025-09-30 18:43:57.561 2 DEBUG oslo_concurrency.lockutils [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:43:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:57.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/323189244' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:43:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3052451571' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:43:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3052451571' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:43:57 compute-1 nova_compute[238822]: 2025-09-30 18:43:57.933 2 INFO nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Updating resource usage from migration 5921f771-62ce-4f71-be1d-1e67d936f2cc
Sep 30 18:43:57 compute-1 nova_compute[238822]: 2025-09-30 18:43:57.934 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Starting to track incoming migration 5921f771-62ce-4f71-be1d-1e67d936f2cc with flavor dc3a14e6-3544-428c-a856-1da19a12bf48 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 18:43:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:58 compute-1 ceph-mon[75484]: pgmap v1966: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 3.5 KiB/s wr, 6 op/s
Sep 30 18:43:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:43:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:43:58.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:43:58 compute-1 nova_compute[238822]: 2025-09-30 18:43:58.972 2 WARNING nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 0bd62f93-1956-4b12-a38a-10deee907b16 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}.
Sep 30 18:43:58 compute-1 nova_compute[238822]: 2025-09-30 18:43:58.973 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:43:58 compute-1 nova_compute[238822]: 2025-09-30 18:43:58.974 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1663MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:43:56 up  4:21,  0 user,  load average: 0.51, 0.50, 0.55\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:43:59 compute-1 nova_compute[238822]: 2025-09-30 18:43:59.015 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:43:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:43:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:43:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:43:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:43:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:43:59 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/285807711' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:43:59 compute-1 nova_compute[238822]: 2025-09-30 18:43:59.557 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.542s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:43:59 compute-1 nova_compute[238822]: 2025-09-30 18:43:59.564 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:43:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:43:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:43:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:43:59.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:43:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/285807711' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:44:00 compute-1 nova_compute[238822]: 2025-09-30 18:44:00.073 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:44:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:00 compute-1 nova_compute[238822]: 2025-09-30 18:44:00.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:00 compute-1 nova_compute[238822]: 2025-09-30 18:44:00.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:00 compute-1 sudo[300005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:44:00 compute-1 sudo[300005]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:44:00 compute-1 sudo[300005]: pam_unix(sudo:session): session closed for user root
Sep 30 18:44:00 compute-1 nova_compute[238822]: 2025-09-30 18:44:00.587 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:44:00 compute-1 nova_compute[238822]: 2025-09-30 18:44:00.589 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.186s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:44:00 compute-1 nova_compute[238822]: 2025-09-30 18:44:00.590 2 DEBUG oslo_concurrency.lockutils [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 3.028s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:44:00 compute-1 nova_compute[238822]: 2025-09-30 18:44:00.590 2 DEBUG oslo_concurrency.lockutils [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:44:00 compute-1 nova_compute[238822]: 2025-09-30 18:44:00.599 2 INFO nova.virt.libvirt.driver [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:44:00 compute-1 virtqemud[239124]: Domain id=25 name='instance-00000020' uuid=0bd62f93-1956-4b12-a38a-10deee907b16 is tainted: custom-monitor
Sep 30 18:44:00 compute-1 ceph-mon[75484]: pgmap v1967: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 2.5 KiB/s wr, 6 op/s
Sep 30 18:44:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:00.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:01 compute-1 nova_compute[238822]: 2025-09-30 18:44:01.592 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:44:01 compute-1 nova_compute[238822]: 2025-09-30 18:44:01.593 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:44:01 compute-1 nova_compute[238822]: 2025-09-30 18:44:01.593 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:44:01 compute-1 nova_compute[238822]: 2025-09-30 18:44:01.611 2 INFO nova.virt.libvirt.driver [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:44:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:01.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:02 compute-1 nova_compute[238822]: 2025-09-30 18:44:02.619 2 INFO nova.virt.libvirt.driver [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:44:02 compute-1 nova_compute[238822]: 2025-09-30 18:44:02.626 2 DEBUG nova.compute.manager [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:44:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:02.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:03 compute-1 nova_compute[238822]: 2025-09-30 18:44:03.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:44:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:03 compute-1 nova_compute[238822]: 2025-09-30 18:44:03.139 2 DEBUG nova.objects.instance [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:44:03 compute-1 ceph-mon[75484]: pgmap v1968: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 2.5 KiB/s wr, 6 op/s
Sep 30 18:44:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:03.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:44:04 compute-1 nova_compute[238822]: 2025-09-30 18:44:04.164 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:44:04 compute-1 nova_compute[238822]: 2025-09-30 18:44:04.284 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:44:04 compute-1 nova_compute[238822]: 2025-09-30 18:44:04.285 2 WARNING neutronclient.v2_0.client [None req-76722a76-1096-43b9-8cfa-ba0e0265636b 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:44:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:04.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:05 compute-1 nova_compute[238822]: 2025-09-30 18:44:05.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:44:05 compute-1 nova_compute[238822]: 2025-09-30 18:44:05.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:44:05 compute-1 nova_compute[238822]: 2025-09-30 18:44:05.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:44:05 compute-1 nova_compute[238822]: 2025-09-30 18:44:05.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:44:05 compute-1 nova_compute[238822]: 2025-09-30 18:44:05.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:05 compute-1 nova_compute[238822]: 2025-09-30 18:44:05.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:44:05 compute-1 ceph-mon[75484]: pgmap v1969: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 2.5 KiB/s wr, 6 op/s
Sep 30 18:44:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:05.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:05 compute-1 podman[249638]: time="2025-09-30T18:44:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:44:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:44:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:44:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:44:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8826 "" "Go-http-client/1.1"
Sep 30 18:44:06 compute-1 nova_compute[238822]: 2025-09-30 18:44:06.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:44:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3120071903' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:44:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:06.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:07 compute-1 nova_compute[238822]: 2025-09-30 18:44:07.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:44:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:07 compute-1 sudo[300038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:44:07 compute-1 sudo[300038]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:44:07 compute-1 sudo[300038]: pam_unix(sudo:session): session closed for user root
Sep 30 18:44:07 compute-1 sudo[300063]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:44:07 compute-1 sudo[300063]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:44:07 compute-1 ceph-mon[75484]: pgmap v1970: 353 pgs: 353 active+clean; 200 MiB data, 446 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:44:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:44:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:44:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:07.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:44:08 compute-1 sudo[300063]: pam_unix(sudo:session): session closed for user root
Sep 30 18:44:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:08.015 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:44:08 compute-1 nova_compute[238822]: 2025-09-30 18:44:08.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:08.017 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:44:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:08 compute-1 podman[300123]: 2025-09-30 18:44:08.557066035 +0000 UTC m=+0.087541544 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:44:08 compute-1 podman[300122]: 2025-09-30 18:44:08.607018198 +0000 UTC m=+0.142096510 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:44:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:08.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:09.019 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:44:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:44:09 compute-1 ceph-mon[75484]: pgmap v1971: 353 pgs: 353 active+clean; 200 MiB data, 450 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 2.3 KiB/s wr, 1 op/s
Sep 30 18:44:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3944740418' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:44:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:09.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:10 compute-1 nova_compute[238822]: 2025-09-30 18:44:10.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:10 compute-1 nova_compute[238822]: 2025-09-30 18:44:10.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:10.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:11 compute-1 ceph-mon[75484]: pgmap v1972: 353 pgs: 353 active+clean; 200 MiB data, 450 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:44:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:44:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:44:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:44:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:44:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:44:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:44:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:44:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:44:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:44:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:11.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:12 compute-1 podman[300177]: 2025-09-30 18:44:12.545212719 +0000 UTC m=+0.077234527 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 18:44:12 compute-1 ceph-mon[75484]: pgmap v1973: 353 pgs: 353 active+clean; 200 MiB data, 450 MiB used, 40 GiB / 40 GiB avail; 747 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:44:12 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/470898929' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:44:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:12.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:13.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:44:14 compute-1 ceph-mon[75484]: pgmap v1974: 353 pgs: 353 active+clean; 121 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 1.3 KiB/s wr, 31 op/s
Sep 30 18:44:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:14.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:15 compute-1 nova_compute[238822]: 2025-09-30 18:44:15.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:44:15 compute-1 nova_compute[238822]: 2025-09-30 18:44:15.265 2 DEBUG oslo_concurrency.lockutils [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Acquiring lock "0bd62f93-1956-4b12-a38a-10deee907b16" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:44:15 compute-1 nova_compute[238822]: 2025-09-30 18:44:15.265 2 DEBUG oslo_concurrency.lockutils [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lock "0bd62f93-1956-4b12-a38a-10deee907b16" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:44:15 compute-1 nova_compute[238822]: 2025-09-30 18:44:15.266 2 DEBUG oslo_concurrency.lockutils [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Acquiring lock "0bd62f93-1956-4b12-a38a-10deee907b16-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:44:15 compute-1 nova_compute[238822]: 2025-09-30 18:44:15.266 2 DEBUG oslo_concurrency.lockutils [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lock "0bd62f93-1956-4b12-a38a-10deee907b16-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:44:15 compute-1 nova_compute[238822]: 2025-09-30 18:44:15.267 2 DEBUG oslo_concurrency.lockutils [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lock "0bd62f93-1956-4b12-a38a-10deee907b16-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:44:15 compute-1 nova_compute[238822]: 2025-09-30 18:44:15.281 2 INFO nova.compute.manager [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Terminating instance
Sep 30 18:44:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:15.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:15 compute-1 nova_compute[238822]: 2025-09-30 18:44:15.802 2 DEBUG nova.compute.manager [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:44:15 compute-1 kernel: tapa362175f-2d (unregistering): left promiscuous mode
Sep 30 18:44:15 compute-1 NetworkManager[45549]: <info>  [1759257855.8451] device (tapa362175f-2d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:44:15 compute-1 nova_compute[238822]: 2025-09-30 18:44:15.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:15 compute-1 ovn_controller[135204]: 2025-09-30T18:44:15Z|00276|binding|INFO|Releasing lport a362175f-2dba-4a9f-bc07-4260625a8ce0 from this chassis (sb_readonly=0)
Sep 30 18:44:15 compute-1 ovn_controller[135204]: 2025-09-30T18:44:15Z|00277|binding|INFO|Setting lport a362175f-2dba-4a9f-bc07-4260625a8ce0 down in Southbound
Sep 30 18:44:15 compute-1 ovn_controller[135204]: 2025-09-30T18:44:15Z|00278|binding|INFO|Removing iface tapa362175f-2d ovn-installed in OVS
Sep 30 18:44:15 compute-1 nova_compute[238822]: 2025-09-30 18:44:15.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:15 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:15.863 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:c0:eb 10.100.0.8'], port_security=['fa:16:3e:52:c0:eb 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0bd62f93-1956-4b12-a38a-10deee907b16', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63c45bef63ef4b9f895b3bab865e1a84', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'a9025550-4c18-4f21-a560-5b6f52684803', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55e305e6-0f4d-40bc-a70b-ac91f882ec57, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=a362175f-2dba-4a9f-bc07-4260625a8ce0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:44:15 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:15.864 144543 INFO neutron.agent.ovn.metadata.agent [-] Port a362175f-2dba-4a9f-bc07-4260625a8ce0 in datapath c8484b9b-b34e-4c32-b987-029d8fcb2a28 unbound from our chassis
Sep 30 18:44:15 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:15.865 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c8484b9b-b34e-4c32-b987-029d8fcb2a28, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:44:15 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:15.868 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ef689f30-ae4e-4f19-ab02-4eb9c8ebdff8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:44:15 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:15.868 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28 namespace which is not needed anymore
Sep 30 18:44:15 compute-1 nova_compute[238822]: 2025-09-30 18:44:15.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:15 compute-1 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000020.scope: Deactivated successfully.
Sep 30 18:44:15 compute-1 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000020.scope: Consumed 3.346s CPU time.
Sep 30 18:44:15 compute-1 systemd-machined[195911]: Machine qemu-25-instance-00000020 terminated.
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.048 2 INFO nova.virt.libvirt.driver [-] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Instance destroyed successfully.
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.049 2 DEBUG nova.objects.instance [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lazy-loading 'resources' on Instance uuid 0bd62f93-1956-4b12-a38a-10deee907b16 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:44:16 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[299929]: [NOTICE]   (299934) : haproxy version is 3.0.5-8e879a5
Sep 30 18:44:16 compute-1 podman[300225]: 2025-09-30 18:44:16.088103896 +0000 UTC m=+0.055040850 container kill f3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true)
Sep 30 18:44:16 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[299929]: [NOTICE]   (299934) : path to executable is /usr/sbin/haproxy
Sep 30 18:44:16 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[299929]: [WARNING]  (299934) : Exiting Master process...
Sep 30 18:44:16 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[299929]: [ALERT]    (299934) : Current worker (299936) exited with code 143 (Terminated)
Sep 30 18:44:16 compute-1 neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28[299929]: [WARNING]  (299934) : All workers exited. Exiting... (0)
Sep 30 18:44:16 compute-1 systemd[1]: libpod-f3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42.scope: Deactivated successfully.
Sep 30 18:44:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:16 compute-1 podman[300250]: 2025-09-30 18:44:16.154593223 +0000 UTC m=+0.039634166 container died f3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:44:16 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42-userdata-shm.mount: Deactivated successfully.
Sep 30 18:44:16 compute-1 systemd[1]: var-lib-containers-storage-overlay-58add3e71cf95a0bb8aec75369e259938adba19d9a5cc2e25688d72649a79cdc-merged.mount: Deactivated successfully.
Sep 30 18:44:16 compute-1 podman[300250]: 2025-09-30 18:44:16.216881657 +0000 UTC m=+0.101922500 container cleanup f3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:44:16 compute-1 systemd[1]: libpod-conmon-f3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42.scope: Deactivated successfully.
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #106. Immutable memtables: 0.
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.241493) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 65] Flushing memtable with next log file: 106
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257856241524, "job": 65, "event": "flush_started", "num_memtables": 1, "num_entries": 2317, "num_deletes": 256, "total_data_size": 5902711, "memory_usage": 5984200, "flush_reason": "Manual Compaction"}
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 65] Level-0 flush table #107: started
Sep 30 18:44:16 compute-1 podman[300252]: 2025-09-30 18:44:16.245174288 +0000 UTC m=+0.117381376 container remove f3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Sep 30 18:44:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:16.253 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c61aed08-6a20-4dac-9181-af6c1c779970]: (4, ("Tue Sep 30 06:44:16 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28 (f3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42)\nf3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42\nTue Sep 30 06:44:16 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28 (f3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42)\nf3255bb0c60e98dbc86b2708afce939c43ad77f24f1958e85fdb2b3c56aacc42\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:44:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:16.256 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e61da2ed-96a2-47e8-bea1-8d6f98b68705]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:44:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:16.256 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8484b9b-b34e-4c32-b987-029d8fcb2a28.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:44:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:16.257 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d4de8560-2c1f-4e23-b1b3-b1ddb036f75c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:44:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:16.258 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8484b9b-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:16 compute-1 kernel: tapc8484b9b-b0: left promiscuous mode
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257856266132, "cf_name": "default", "job": 65, "event": "table_file_creation", "file_number": 107, "file_size": 3796768, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 52118, "largest_seqno": 54430, "table_properties": {"data_size": 3787484, "index_size": 5778, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19289, "raw_average_key_size": 20, "raw_value_size": 3768848, "raw_average_value_size": 3929, "num_data_blocks": 252, "num_entries": 959, "num_filter_entries": 959, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759257654, "oldest_key_time": 1759257654, "file_creation_time": 1759257856, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 107, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 65] Flush lasted 24725 microseconds, and 10071 cpu microseconds.
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.266205) [db/flush_job.cc:967] [default] [JOB 65] Level-0 flush table #107: 3796768 bytes OK
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.266238) [db/memtable_list.cc:519] [default] Level-0 commit table #107 started
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.268241) [db/memtable_list.cc:722] [default] Level-0 commit table #107: memtable #1 done
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.268266) EVENT_LOG_v1 {"time_micros": 1759257856268258, "job": 65, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.268291) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 65] Try to delete WAL files size 5892380, prev total WAL file size 5892380, number of live WAL files 2.
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000103.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.270561) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031373536' seq:72057594037927935, type:22 .. '6C6F676D0032303038' seq:0, type:0; will stop at (end)
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 66] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 65 Base level 0, inputs: [107(3707KB)], [105(11MB)]
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257856270660, "job": 66, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [107], "files_L6": [105], "score": -1, "input_data_size": 15655810, "oldest_snapshot_seqno": -1}
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:16.296 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2173fc31-d146-4e8e-9759-fc1d76476df1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:44:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:16.320 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[16eb3625-9ff3-490f-8de8-e0a804d0ead5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:44:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:16.322 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8729a256-c44b-4b24-a836-622c31044edd]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:44:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:16.349 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7abbfe-5b90-4a04-a5aa-1a3f35844933]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1567170, 'reachable_time': 31547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 300288, 'error': None, 'target': 'ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:44:16 compute-1 systemd[1]: run-netns-ovnmeta\x2dc8484b9b\x2db34e\x2d4c32\x2db987\x2d029d8fcb2a28.mount: Deactivated successfully.
Sep 30 18:44:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:16.355 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c8484b9b-b34e-4c32-b987-029d8fcb2a28 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:44:16 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:16.355 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[3baf6e31-e7fc-4cb4-b925-2177abac83b4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 66] Generated table #108: 7507 keys, 15511525 bytes, temperature: kUnknown
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257856369979, "cf_name": "default", "job": 66, "event": "table_file_creation", "file_number": 108, "file_size": 15511525, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15461615, "index_size": 30022, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18821, "raw_key_size": 196990, "raw_average_key_size": 26, "raw_value_size": 15327425, "raw_average_value_size": 2041, "num_data_blocks": 1188, "num_entries": 7507, "num_filter_entries": 7507, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759257856, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 108, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.370313) [db/compaction/compaction_job.cc:1663] [default] [JOB 66] Compacted 1@0 + 1@6 files to L6 => 15511525 bytes
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.371974) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 157.5 rd, 156.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 11.3 +0.0 blob) out(14.8 +0.0 blob), read-write-amplify(8.2) write-amplify(4.1) OK, records in: 8035, records dropped: 528 output_compression: NoCompression
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.372003) EVENT_LOG_v1 {"time_micros": 1759257856371990, "job": 66, "event": "compaction_finished", "compaction_time_micros": 99414, "compaction_time_cpu_micros": 61393, "output_level": 6, "num_output_files": 1, "total_output_size": 15511525, "num_input_records": 8035, "num_output_records": 7507, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000107.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257856373391, "job": 66, "event": "table_file_deletion", "file_number": 107}
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000105.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257856377339, "job": 66, "event": "table_file_deletion", "file_number": 105}
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.270446) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.377437) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.377445) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.377448) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.377451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:16.377453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.557 2 DEBUG nova.virt.libvirt.vif [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:42:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1791586084',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1791586084',id=32,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:42:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='63c45bef63ef4b9f895b3bab865e1a84',ramdisk_id='',reservation_id='r-1jmywl2f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-134702932',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-134702932-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:44:03Z,user_data=None,user_id='5717e8cb8548429b948a23763350ab4a',uuid=0bd62f93-1956-4b12-a38a-10deee907b16,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a362175f-2dba-4a9f-bc07-4260625a8ce0", "address": "fa:16:3e:52:c0:eb", "network": {"id": "c8484b9b-b34e-4c32-b987-029d8fcb2a28", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-862913257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c9029a2856de43388bcee1a38d165449", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa362175f-2d", "ovs_interfaceid": "a362175f-2dba-4a9f-bc07-4260625a8ce0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.557 2 DEBUG nova.network.os_vif_util [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Converting VIF {"id": "a362175f-2dba-4a9f-bc07-4260625a8ce0", "address": "fa:16:3e:52:c0:eb", "network": {"id": "c8484b9b-b34e-4c32-b987-029d8fcb2a28", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-862913257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c9029a2856de43388bcee1a38d165449", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa362175f-2d", "ovs_interfaceid": "a362175f-2dba-4a9f-bc07-4260625a8ce0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.559 2 DEBUG nova.network.os_vif_util [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:c0:eb,bridge_name='br-int',has_traffic_filtering=True,id=a362175f-2dba-4a9f-bc07-4260625a8ce0,network=Network(c8484b9b-b34e-4c32-b987-029d8fcb2a28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa362175f-2d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.559 2 DEBUG os_vif [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:c0:eb,bridge_name='br-int',has_traffic_filtering=True,id=a362175f-2dba-4a9f-bc07-4260625a8ce0,network=Network(c8484b9b-b34e-4c32-b987-029d8fcb2a28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa362175f-2d') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.562 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa362175f-2d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.568 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=7097f417-6e45-40ad-a1d7-e701083541e2) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:16 compute-1 nova_compute[238822]: 2025-09-30 18:44:16.574 2 INFO os_vif [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:c0:eb,bridge_name='br-int',has_traffic_filtering=True,id=a362175f-2dba-4a9f-bc07-4260625a8ce0,network=Network(c8484b9b-b34e-4c32-b987-029d8fcb2a28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa362175f-2d')
Sep 30 18:44:16 compute-1 ceph-mon[75484]: pgmap v1975: 353 pgs: 353 active+clean; 121 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 1.3 KiB/s wr, 30 op/s
Sep 30 18:44:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:44:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:16.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:44:16 compute-1 sudo[300308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:44:16 compute-1 sudo[300308]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:44:16 compute-1 sudo[300308]: pam_unix(sudo:session): session closed for user root
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.046 2 INFO nova.virt.libvirt.driver [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Deleting instance files /var/lib/nova/instances/0bd62f93-1956-4b12-a38a-10deee907b16_del
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.047 2 INFO nova.virt.libvirt.driver [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Deletion of /var/lib/nova/instances/0bd62f93-1956-4b12-a38a-10deee907b16_del complete
Sep 30 18:44:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.561 2 INFO nova.compute.manager [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Took 1.76 seconds to destroy the instance on the hypervisor.
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.562 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.563 2 DEBUG nova.compute.manager [-] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.563 2 DEBUG nova.network.neutron [-] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.564 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:44:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:17.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.651 2 DEBUG nova.compute.manager [req-d9251b3e-5b95-4d75-bb75-67733647e89c req-47cc0e4f-ad9b-48bb-82ed-00e3092190ac 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Received event network-vif-unplugged-a362175f-2dba-4a9f-bc07-4260625a8ce0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.651 2 DEBUG oslo_concurrency.lockutils [req-d9251b3e-5b95-4d75-bb75-67733647e89c req-47cc0e4f-ad9b-48bb-82ed-00e3092190ac 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "0bd62f93-1956-4b12-a38a-10deee907b16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.652 2 DEBUG oslo_concurrency.lockutils [req-d9251b3e-5b95-4d75-bb75-67733647e89c req-47cc0e4f-ad9b-48bb-82ed-00e3092190ac 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "0bd62f93-1956-4b12-a38a-10deee907b16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.652 2 DEBUG oslo_concurrency.lockutils [req-d9251b3e-5b95-4d75-bb75-67733647e89c req-47cc0e4f-ad9b-48bb-82ed-00e3092190ac 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "0bd62f93-1956-4b12-a38a-10deee907b16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.653 2 DEBUG nova.compute.manager [req-d9251b3e-5b95-4d75-bb75-67733647e89c req-47cc0e4f-ad9b-48bb-82ed-00e3092190ac 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] No waiting events found dispatching network-vif-unplugged-a362175f-2dba-4a9f-bc07-4260625a8ce0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.653 2 DEBUG nova.compute.manager [req-d9251b3e-5b95-4d75-bb75-67733647e89c req-47cc0e4f-ad9b-48bb-82ed-00e3092190ac 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Received event network-vif-unplugged-a362175f-2dba-4a9f-bc07-4260625a8ce0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:44:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:44:17 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:44:17 compute-1 ceph-mon[75484]: pgmap v1976: 353 pgs: 353 active+clean; 121 MiB data, 403 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 1.3 KiB/s wr, 30 op/s
Sep 30 18:44:17 compute-1 nova_compute[238822]: 2025-09-30 18:44:17.728 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #109. Immutable memtables: 0.
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.739138) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 67] Flushing memtable with next log file: 109
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257857739205, "job": 67, "event": "flush_started", "num_memtables": 1, "num_entries": 291, "num_deletes": 251, "total_data_size": 143938, "memory_usage": 150136, "flush_reason": "Manual Compaction"}
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 67] Level-0 flush table #110: started
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257857742852, "cf_name": "default", "job": 67, "event": "table_file_creation", "file_number": 110, "file_size": 94629, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 54435, "largest_seqno": 54721, "table_properties": {"data_size": 92695, "index_size": 162, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5046, "raw_average_key_size": 18, "raw_value_size": 88883, "raw_average_value_size": 326, "num_data_blocks": 6, "num_entries": 272, "num_filter_entries": 272, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759257856, "oldest_key_time": 1759257856, "file_creation_time": 1759257857, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 110, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 67] Flush lasted 3778 microseconds, and 1528 cpu microseconds.
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.742925) [db/flush_job.cc:967] [default] [JOB 67] Level-0 flush table #110: 94629 bytes OK
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.742949) [db/memtable_list.cc:519] [default] Level-0 commit table #110 started
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.745127) [db/memtable_list.cc:722] [default] Level-0 commit table #110: memtable #1 done
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.745154) EVENT_LOG_v1 {"time_micros": 1759257857745146, "job": 67, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.745180) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 67] Try to delete WAL files size 141770, prev total WAL file size 141770, number of live WAL files 2.
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000106.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.745783) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034353138' seq:72057594037927935, type:22 .. '7061786F730034373730' seq:0, type:0; will stop at (end)
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 68] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 67 Base level 0, inputs: [110(92KB)], [108(14MB)]
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257857745844, "job": 68, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [110], "files_L6": [108], "score": -1, "input_data_size": 15606154, "oldest_snapshot_seqno": -1}
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 68] Generated table #111: 7269 keys, 13647031 bytes, temperature: kUnknown
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257857835916, "cf_name": "default", "job": 68, "event": "table_file_creation", "file_number": 111, "file_size": 13647031, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13600256, "index_size": 27527, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18181, "raw_key_size": 192736, "raw_average_key_size": 26, "raw_value_size": 13471566, "raw_average_value_size": 1853, "num_data_blocks": 1076, "num_entries": 7269, "num_filter_entries": 7269, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759257857, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 111, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.836924) [db/compaction/compaction_job.cc:1663] [default] [JOB 68] Compacted 1@0 + 1@6 files to L6 => 13647031 bytes
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.838666) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 173.1 rd, 151.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 14.8 +0.0 blob) out(13.0 +0.0 blob), read-write-amplify(309.1) write-amplify(144.2) OK, records in: 7779, records dropped: 510 output_compression: NoCompression
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.838708) EVENT_LOG_v1 {"time_micros": 1759257857838691, "job": 68, "event": "compaction_finished", "compaction_time_micros": 90173, "compaction_time_cpu_micros": 57263, "output_level": 6, "num_output_files": 1, "total_output_size": 13647031, "num_input_records": 7779, "num_output_records": 7269, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000110.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257857838902, "job": 68, "event": "table_file_deletion", "file_number": 110}
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000108.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257857843828, "job": 68, "event": "table_file_deletion", "file_number": 108}
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.745595) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.843960) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.843972) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.843976) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.843979) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:17 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:17.843982) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:18.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:44:19 compute-1 nova_compute[238822]: 2025-09-30 18:44:19.195 2 DEBUG nova.network.neutron [-] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:44:19 compute-1 openstack_network_exporter[251957]: ERROR   18:44:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:44:19 compute-1 openstack_network_exporter[251957]: ERROR   18:44:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:44:19 compute-1 openstack_network_exporter[251957]: ERROR   18:44:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:44:19 compute-1 openstack_network_exporter[251957]: ERROR   18:44:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:44:19 compute-1 openstack_network_exporter[251957]: ERROR   18:44:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:44:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:19.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:19 compute-1 nova_compute[238822]: 2025-09-30 18:44:19.701 2 INFO nova.compute.manager [-] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Took 2.14 seconds to deallocate network for instance.
Sep 30 18:44:19 compute-1 nova_compute[238822]: 2025-09-30 18:44:19.749 2 DEBUG nova.compute.manager [req-bc1560fe-59d3-4d71-a8f7-9ecdfe057183 req-a7047533-c884-4753-8df0-5c928bf80dae 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Received event network-vif-unplugged-a362175f-2dba-4a9f-bc07-4260625a8ce0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:44:19 compute-1 nova_compute[238822]: 2025-09-30 18:44:19.749 2 DEBUG oslo_concurrency.lockutils [req-bc1560fe-59d3-4d71-a8f7-9ecdfe057183 req-a7047533-c884-4753-8df0-5c928bf80dae 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "0bd62f93-1956-4b12-a38a-10deee907b16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:44:19 compute-1 nova_compute[238822]: 2025-09-30 18:44:19.750 2 DEBUG oslo_concurrency.lockutils [req-bc1560fe-59d3-4d71-a8f7-9ecdfe057183 req-a7047533-c884-4753-8df0-5c928bf80dae 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "0bd62f93-1956-4b12-a38a-10deee907b16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:44:19 compute-1 nova_compute[238822]: 2025-09-30 18:44:19.750 2 DEBUG oslo_concurrency.lockutils [req-bc1560fe-59d3-4d71-a8f7-9ecdfe057183 req-a7047533-c884-4753-8df0-5c928bf80dae 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "0bd62f93-1956-4b12-a38a-10deee907b16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:44:19 compute-1 nova_compute[238822]: 2025-09-30 18:44:19.750 2 DEBUG nova.compute.manager [req-bc1560fe-59d3-4d71-a8f7-9ecdfe057183 req-a7047533-c884-4753-8df0-5c928bf80dae 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] No waiting events found dispatching network-vif-unplugged-a362175f-2dba-4a9f-bc07-4260625a8ce0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:44:19 compute-1 nova_compute[238822]: 2025-09-30 18:44:19.751 2 DEBUG nova.compute.manager [req-bc1560fe-59d3-4d71-a8f7-9ecdfe057183 req-a7047533-c884-4753-8df0-5c928bf80dae 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Received event network-vif-unplugged-a362175f-2dba-4a9f-bc07-4260625a8ce0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:44:19 compute-1 nova_compute[238822]: 2025-09-30 18:44:19.751 2 DEBUG nova.compute.manager [req-bc1560fe-59d3-4d71-a8f7-9ecdfe057183 req-a7047533-c884-4753-8df0-5c928bf80dae 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 0bd62f93-1956-4b12-a38a-10deee907b16] Received event network-vif-deleted-a362175f-2dba-4a9f-bc07-4260625a8ce0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:44:19 compute-1 unix_chkpwd[300338]: password check failed for user (root)
Sep 30 18:44:19 compute-1 sshd-session[300334]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:44:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:20 compute-1 nova_compute[238822]: 2025-09-30 18:44:20.229 2 DEBUG oslo_concurrency.lockutils [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:44:20 compute-1 nova_compute[238822]: 2025-09-30 18:44:20.230 2 DEBUG oslo_concurrency.lockutils [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:44:20 compute-1 nova_compute[238822]: 2025-09-30 18:44:20.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:20 compute-1 nova_compute[238822]: 2025-09-30 18:44:20.256 2 DEBUG oslo_concurrency.lockutils [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.026s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:44:20 compute-1 nova_compute[238822]: 2025-09-30 18:44:20.297 2 INFO nova.scheduler.client.report [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Deleted allocations for instance 0bd62f93-1956-4b12-a38a-10deee907b16
Sep 30 18:44:20 compute-1 ceph-mon[75484]: pgmap v1977: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 41 KiB/s rd, 2.6 KiB/s wr, 60 op/s
Sep 30 18:44:20 compute-1 sudo[300360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:44:20 compute-1 sudo[300360]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:44:20 compute-1 sudo[300360]: pam_unix(sudo:session): session closed for user root
Sep 30 18:44:20 compute-1 podman[300340]: 2025-09-30 18:44:20.603395207 +0000 UTC m=+0.131927426 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:44:20 compute-1 podman[300341]: 2025-09-30 18:44:20.609905222 +0000 UTC m=+0.130895269 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Sep 30 18:44:20 compute-1 podman[300342]: 2025-09-30 18:44:20.614965848 +0000 UTC m=+0.128067163 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 18:44:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:20.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:21 compute-1 nova_compute[238822]: 2025-09-30 18:44:21.339 2 DEBUG oslo_concurrency.lockutils [None req-308470bb-5509-4edc-9646-cf0256463dc9 5717e8cb8548429b948a23763350ab4a 63c45bef63ef4b9f895b3bab865e1a84 - - default default] Lock "0bd62f93-1956-4b12-a38a-10deee907b16" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.074s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:44:21 compute-1 nova_compute[238822]: 2025-09-30 18:44:21.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000053s ======
Sep 30 18:44:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:21.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Sep 30 18:44:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:22 compute-1 sshd-session[300334]: Failed password for root from 192.210.160.141 port 34404 ssh2
Sep 30 18:44:22 compute-1 ceph-mon[75484]: pgmap v1978: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 41 KiB/s rd, 2.6 KiB/s wr, 60 op/s
Sep 30 18:44:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:44:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:22.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:23 compute-1 sshd-session[300334]: Connection closed by authenticating user root 192.210.160.141 port 34404 [preauth]
Sep 30 18:44:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:23.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:44:24 compute-1 ceph-mon[75484]: pgmap v1979: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Sep 30 18:44:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:24.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:25 compute-1 nova_compute[238822]: 2025-09-30 18:44:25.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:25.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:26 compute-1 ceph-mon[75484]: pgmap v1980: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:44:26 compute-1 nova_compute[238822]: 2025-09-30 18:44:26.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:26 compute-1 sshd-session[300281]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:44:26 compute-1 sshd-session[300281]: banner exchange: Connection from 110.42.70.108 port 51466: Connection timed out
Sep 30 18:44:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:26.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:27.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:27 compute-1 nova_compute[238822]: 2025-09-30 18:44:27.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:28 compute-1 ceph-mon[75484]: pgmap v1981: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Sep 30 18:44:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:28.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:44:29 compute-1 unix_chkpwd[300432]: password check failed for user (root)
Sep 30 18:44:29 compute-1 sshd-session[300430]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17  user=root
Sep 30 18:44:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:29.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:30 compute-1 nova_compute[238822]: 2025-09-30 18:44:30.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:30 compute-1 ceph-mon[75484]: pgmap v1982: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:44:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:30.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:31 compute-1 sshd-session[300430]: Failed password for root from 161.132.50.17 port 51802 ssh2
Sep 30 18:44:31 compute-1 nova_compute[238822]: 2025-09-30 18:44:31.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:31.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:32 compute-1 sshd-session[300430]: Received disconnect from 161.132.50.17 port 51802:11: Bye Bye [preauth]
Sep 30 18:44:32 compute-1 sshd-session[300430]: Disconnected from authenticating user root 161.132.50.17 port 51802 [preauth]
Sep 30 18:44:32 compute-1 ceph-mon[75484]: pgmap v1983: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:44:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:32.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:33.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:44:34 compute-1 ceph-mon[75484]: pgmap v1984: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:44:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:34.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:35 compute-1 nova_compute[238822]: 2025-09-30 18:44:35.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:35 compute-1 podman[249638]: time="2025-09-30T18:44:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:44:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:44:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:44:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:35.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:44:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8363 "" "Go-http-client/1.1"
Sep 30 18:44:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:36 compute-1 nova_compute[238822]: 2025-09-30 18:44:36.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:36 compute-1 ceph-mon[75484]: pgmap v1985: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:44:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2356262577' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:44:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2356262577' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:44:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:36.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:37.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:44:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:38 compute-1 ceph-mon[75484]: pgmap v1986: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:44:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:38.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:44:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:39.421 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:31:e0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4a724d83-7a03-449e-b06f-f9f1f1bb686e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a724d83-7a03-449e-b06f-f9f1f1bb686e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c994362300fc4b68b72392279f890ca7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=793ccd68-a96c-4ced-8449-bfb1c479c4b4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d979d9c2-641a-4559-b95a-f55833182093) old=Port_Binding(mac=['fa:16:3e:1e:31:e0'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4a724d83-7a03-449e-b06f-f9f1f1bb686e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a724d83-7a03-449e-b06f-f9f1f1bb686e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c994362300fc4b68b72392279f890ca7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:44:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:39.422 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d979d9c2-641a-4559-b95a-f55833182093 in datapath 4a724d83-7a03-449e-b06f-f9f1f1bb686e updated
Sep 30 18:44:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:39.423 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4a724d83-7a03-449e-b06f-f9f1f1bb686e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:44:39 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:39.424 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[34c5a78e-49b5-4b1a-8291-92aacfb7e9c1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:44:39 compute-1 podman[300444]: 2025-09-30 18:44:39.559037777 +0000 UTC m=+0.095954770 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:44:39 compute-1 podman[300443]: 2025-09-30 18:44:39.605825574 +0000 UTC m=+0.146255522 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 18:44:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:39.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:40 compute-1 nova_compute[238822]: 2025-09-30 18:44:40.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:40 compute-1 sudo[300496]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:44:40 compute-1 sudo[300496]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:44:40 compute-1 sudo[300496]: pam_unix(sudo:session): session closed for user root
Sep 30 18:44:40 compute-1 ceph-mon[75484]: pgmap v1987: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 766 B/s rd, 0 op/s
Sep 30 18:44:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:40.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:41 compute-1 nova_compute[238822]: 2025-09-30 18:44:41.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:41.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:41 compute-1 ceph-mon[75484]: pgmap v1988: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:44:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:42.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:43 compute-1 podman[300528]: 2025-09-30 18:44:43.542056991 +0000 UTC m=+0.075183892 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 18:44:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:43.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:43 compute-1 unix_chkpwd[300547]: password check failed for user (root)
Sep 30 18:44:43 compute-1 sshd-session[300523]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245  user=root
Sep 30 18:44:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:44:44 compute-1 ceph-mon[75484]: pgmap v1989: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 766 B/s rd, 0 op/s
Sep 30 18:44:44 compute-1 unix_chkpwd[300551]: password check failed for user (root)
Sep 30 18:44:44 compute-1 sshd-session[300525]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:44:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:44.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:45 compute-1 unix_chkpwd[300553]: password check failed for user (root)
Sep 30 18:44:45 compute-1 sshd-session[300549]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201  user=root
Sep 30 18:44:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:45 compute-1 nova_compute[238822]: 2025-09-30 18:44:45.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:45.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:46 compute-1 sshd-session[300523]: Failed password for root from 49.49.32.245 port 42532 ssh2
Sep 30 18:44:46 compute-1 ceph-mon[75484]: pgmap v1990: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:44:46 compute-1 sshd-session[300525]: Failed password for root from 192.210.160.141 port 55724 ssh2
Sep 30 18:44:46 compute-1 nova_compute[238822]: 2025-09-30 18:44:46.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:46 compute-1 sshd-session[300523]: Received disconnect from 49.49.32.245 port 42532:11: Bye Bye [preauth]
Sep 30 18:44:46 compute-1 sshd-session[300523]: Disconnected from authenticating user root 49.49.32.245 port 42532 [preauth]
Sep 30 18:44:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:46.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:46 compute-1 sshd-session[300549]: Failed password for root from 8.243.64.201 port 45604 ssh2
Sep 30 18:44:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:47.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:47 compute-1 sshd-session[300549]: Received disconnect from 8.243.64.201 port 45604:11: Bye Bye [preauth]
Sep 30 18:44:47 compute-1 sshd-session[300549]: Disconnected from authenticating user root 8.243.64.201 port 45604 [preauth]
Sep 30 18:44:47 compute-1 sshd-session[300525]: Connection closed by authenticating user root 192.210.160.141 port 55724 [preauth]
Sep 30 18:44:47 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:47.971 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:44:38 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f866cf0f-e793-4238-baa0-f0c3f17801af', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f866cf0f-e793-4238-baa0-f0c3f17801af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e1882e8f3e74aa3840e38f2ce263f25', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d7288d4-137d-4796-adf2-791325730c64, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2539ca47-e3f3-40f9-9a18-3cc4160bcb4f) old=Port_Binding(mac=['fa:16:3e:65:44:38'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f866cf0f-e793-4238-baa0-f0c3f17801af', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f866cf0f-e793-4238-baa0-f0c3f17801af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e1882e8f3e74aa3840e38f2ce263f25', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:44:47 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:47.972 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2539ca47-e3f3-40f9-9a18-3cc4160bcb4f in datapath f866cf0f-e793-4238-baa0-f0c3f17801af updated
Sep 30 18:44:47 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:47.973 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f866cf0f-e793-4238-baa0-f0c3f17801af, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:44:47 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:47.974 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[407d2f56-6f1c-45b9-8202-658957fc8159]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:44:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:48 compute-1 ceph-mon[75484]: pgmap v1991: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:44:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:48.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:44:49 compute-1 openstack_network_exporter[251957]: ERROR   18:44:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:44:49 compute-1 openstack_network_exporter[251957]: ERROR   18:44:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:44:49 compute-1 openstack_network_exporter[251957]: ERROR   18:44:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:44:49 compute-1 openstack_network_exporter[251957]: ERROR   18:44:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:44:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:44:49 compute-1 openstack_network_exporter[251957]: ERROR   18:44:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:44:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:44:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:49.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:50 compute-1 nova_compute[238822]: 2025-09-30 18:44:50.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:50 compute-1 ceph-mon[75484]: pgmap v1992: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:44:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:50.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:51 compute-1 podman[300560]: 2025-09-30 18:44:51.545916686 +0000 UTC m=+0.087951245 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Sep 30 18:44:51 compute-1 podman[300561]: 2025-09-30 18:44:51.550434117 +0000 UTC m=+0.088585752 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=)
Sep 30 18:44:51 compute-1 podman[300562]: 2025-09-30 18:44:51.577245518 +0000 UTC m=+0.101470378 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 18:44:51 compute-1 nova_compute[238822]: 2025-09-30 18:44:51.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:51.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:52 compute-1 ceph-mon[75484]: pgmap v1993: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:44:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:44:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:52.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:53.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:54 compute-1 nova_compute[238822]: 2025-09-30 18:44:54.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:44:54 compute-1 nova_compute[238822]: 2025-09-30 18:44:54.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:44:54 compute-1 nova_compute[238822]: 2025-09-30 18:44:54.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:44:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:44:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:54.421 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:44:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:54.421 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:44:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:44:54.421 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:44:54 compute-1 ceph-mon[75484]: pgmap v1994: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:44:54 compute-1 nova_compute[238822]: 2025-09-30 18:44:54.575 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:44:54 compute-1 nova_compute[238822]: 2025-09-30 18:44:54.576 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:44:54 compute-1 nova_compute[238822]: 2025-09-30 18:44:54.576 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:44:54 compute-1 nova_compute[238822]: 2025-09-30 18:44:54.576 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:44:54 compute-1 nova_compute[238822]: 2025-09-30 18:44:54.577 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:44:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:54.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:55 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:44:55 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3303240435' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:44:55 compute-1 nova_compute[238822]: 2025-09-30 18:44:55.043 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:44:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:55 compute-1 nova_compute[238822]: 2025-09-30 18:44:55.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:55 compute-1 nova_compute[238822]: 2025-09-30 18:44:55.295 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:44:55 compute-1 nova_compute[238822]: 2025-09-30 18:44:55.296 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:44:55 compute-1 nova_compute[238822]: 2025-09-30 18:44:55.338 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:44:55 compute-1 nova_compute[238822]: 2025-09-30 18:44:55.340 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4672MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:44:55 compute-1 nova_compute[238822]: 2025-09-30 18:44:55.340 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:44:55 compute-1 nova_compute[238822]: 2025-09-30 18:44:55.341 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:44:55 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3303240435' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:44:55 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1140021245' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:44:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:55.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #112. Immutable memtables: 0.
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.254951) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 69] Flushing memtable with next log file: 112
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257896255012, "job": 69, "event": "flush_started", "num_memtables": 1, "num_entries": 587, "num_deletes": 252, "total_data_size": 1013092, "memory_usage": 1024512, "flush_reason": "Manual Compaction"}
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 69] Level-0 flush table #113: started
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257896259486, "cf_name": "default", "job": 69, "event": "table_file_creation", "file_number": 113, "file_size": 449251, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 54726, "largest_seqno": 55308, "table_properties": {"data_size": 446580, "index_size": 707, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 7169, "raw_average_key_size": 20, "raw_value_size": 441126, "raw_average_value_size": 1253, "num_data_blocks": 32, "num_entries": 352, "num_filter_entries": 352, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759257857, "oldest_key_time": 1759257857, "file_creation_time": 1759257896, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 113, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 69] Flush lasted 4578 microseconds, and 2000 cpu microseconds.
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.259535) [db/flush_job.cc:967] [default] [JOB 69] Level-0 flush table #113: 449251 bytes OK
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.259560) [db/memtable_list.cc:519] [default] Level-0 commit table #113 started
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.261692) [db/memtable_list.cc:722] [default] Level-0 commit table #113: memtable #1 done
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.261714) EVENT_LOG_v1 {"time_micros": 1759257896261707, "job": 69, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.261736) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 69] Try to delete WAL files size 1009763, prev total WAL file size 1009763, number of live WAL files 2.
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000109.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.262717) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031373535' seq:72057594037927935, type:22 .. '6D6772737461740032303038' seq:0, type:0; will stop at (end)
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 70] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 69 Base level 0, inputs: [113(438KB)], [111(13MB)]
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257896262757, "job": 70, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [113], "files_L6": [111], "score": -1, "input_data_size": 14096282, "oldest_snapshot_seqno": -1}
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 70] Generated table #114: 7123 keys, 10457300 bytes, temperature: kUnknown
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257896325249, "cf_name": "default", "job": 70, "event": "table_file_creation", "file_number": 114, "file_size": 10457300, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10415856, "index_size": 22516, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17861, "raw_key_size": 189885, "raw_average_key_size": 26, "raw_value_size": 10294116, "raw_average_value_size": 1445, "num_data_blocks": 868, "num_entries": 7123, "num_filter_entries": 7123, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759257896, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 114, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.325467) [db/compaction/compaction_job.cc:1663] [default] [JOB 70] Compacted 1@0 + 1@6 files to L6 => 10457300 bytes
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.326696) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 225.3 rd, 167.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 13.0 +0.0 blob) out(10.0 +0.0 blob), read-write-amplify(54.7) write-amplify(23.3) OK, records in: 7621, records dropped: 498 output_compression: NoCompression
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.326730) EVENT_LOG_v1 {"time_micros": 1759257896326714, "job": 70, "event": "compaction_finished", "compaction_time_micros": 62555, "compaction_time_cpu_micros": 40826, "output_level": 6, "num_output_files": 1, "total_output_size": 10457300, "num_input_records": 7621, "num_output_records": 7123, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000113.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257896327048, "job": 70, "event": "table_file_deletion", "file_number": 113}
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000111.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759257896331270, "job": 70, "event": "table_file_deletion", "file_number": 111}
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.262493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.331327) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.331333) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.331336) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.331339) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:44:56.331342) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:44:56 compute-1 nova_compute[238822]: 2025-09-30 18:44:56.397 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:44:56 compute-1 nova_compute[238822]: 2025-09-30 18:44:56.398 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:44:55 up  4:22,  0 user,  load average: 0.39, 0.47, 0.54\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:44:56 compute-1 nova_compute[238822]: 2025-09-30 18:44:56.413 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:44:56 compute-1 ceph-mon[75484]: pgmap v1995: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:44:56 compute-1 nova_compute[238822]: 2025-09-30 18:44:56.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:44:56 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:44:56 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2466129853' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:44:56 compute-1 nova_compute[238822]: 2025-09-30 18:44:56.903 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:44:56 compute-1 nova_compute[238822]: 2025-09-30 18:44:56.908 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:44:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:56.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:57 compute-1 nova_compute[238822]: 2025-09-30 18:44:57.417 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:44:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2466129853' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:44:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3713725337' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:44:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:57.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:57 compute-1 nova_compute[238822]: 2025-09-30 18:44:57.930 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:44:57 compute-1 nova_compute[238822]: 2025-09-30 18:44:57.931 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.590s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:44:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:58 compute-1 ceph-mon[75484]: pgmap v1996: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:44:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/488943288' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:44:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/488943288' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:44:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:44:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:44:58.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:44:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:44:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:44:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:44:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:44:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:44:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:44:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:44:59.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:44:59 compute-1 nova_compute[238822]: 2025-09-30 18:44:59.932 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:44:59 compute-1 nova_compute[238822]: 2025-09-30 18:44:59.933 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:44:59 compute-1 nova_compute[238822]: 2025-09-30 18:44:59.933 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:45:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:00 compute-1 nova_compute[238822]: 2025-09-30 18:45:00.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:00 compute-1 ceph-mon[75484]: pgmap v1997: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:45:00 compute-1 sudo[300676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:45:00 compute-1 sudo[300676]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:45:00 compute-1 sudo[300676]: pam_unix(sudo:session): session closed for user root
Sep 30 18:45:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:00.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:01 compute-1 nova_compute[238822]: 2025-09-30 18:45:01.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:01.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:02 compute-1 ovn_controller[135204]: 2025-09-30T18:45:02Z|00279|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Sep 30 18:45:02 compute-1 ceph-mon[75484]: pgmap v1998: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:45:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1412715386' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:45:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:02.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:03.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:04 compute-1 nova_compute[238822]: 2025-09-30 18:45:04.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:45:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:45:04 compute-1 ceph-mon[75484]: pgmap v1999: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:45:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:04.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:05 compute-1 nova_compute[238822]: 2025-09-30 18:45:05.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:05 compute-1 podman[249638]: time="2025-09-30T18:45:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:45:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:45:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:45:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:45:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8359 "" "Go-http-client/1.1"
Sep 30 18:45:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:05.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:06 compute-1 nova_compute[238822]: 2025-09-30 18:45:06.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:06 compute-1 ceph-mon[75484]: pgmap v2000: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:45:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:06.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:07 compute-1 nova_compute[238822]: 2025-09-30 18:45:07.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:45:07 compute-1 nova_compute[238822]: 2025-09-30 18:45:07.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:45:07 compute-1 nova_compute[238822]: 2025-09-30 18:45:07.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 18:45:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:07 compute-1 nova_compute[238822]: 2025-09-30 18:45:07.566 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:45:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:45:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:07.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:08 compute-1 ceph-mon[75484]: pgmap v2001: 353 pgs: 353 active+clean; 41 MiB data, 356 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:45:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:08.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:09 compute-1 nova_compute[238822]: 2025-09-30 18:45:09.072 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:45:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3958410412' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:45:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2119052121' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:45:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:09.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:10 compute-1 nova_compute[238822]: 2025-09-30 18:45:10.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:10 compute-1 podman[300714]: 2025-09-30 18:45:10.567991719 +0000 UTC m=+0.100149033 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:45:10 compute-1 podman[300713]: 2025-09-30 18:45:10.607404878 +0000 UTC m=+0.144836983 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 18:45:10 compute-1 sshd-session[300709]: Invalid user user from 192.210.160.141 port 35482
Sep 30 18:45:10 compute-1 sshd-session[300709]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:45:10 compute-1 sshd-session[300709]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:45:10 compute-1 ceph-mon[75484]: pgmap v2002: 353 pgs: 353 active+clean; 88 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:45:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:10.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:11 compute-1 nova_compute[238822]: 2025-09-30 18:45:11.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:11.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:12 compute-1 sshd-session[300709]: Failed password for invalid user user from 192.210.160.141 port 35482 ssh2
Sep 30 18:45:12 compute-1 ceph-mon[75484]: pgmap v2003: 353 pgs: 353 active+clean; 88 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:45:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:12.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:12 compute-1 sshd-session[300709]: Connection closed by invalid user user 192.210.160.141 port 35482 [preauth]
Sep 30 18:45:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:13.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:13 compute-1 ceph-mon[75484]: pgmap v2004: 353 pgs: 353 active+clean; 88 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:45:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:45:14.049 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:45:14 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:45:14.051 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:45:14 compute-1 nova_compute[238822]: 2025-09-30 18:45:14.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:45:14 compute-1 podman[300768]: 2025-09-30 18:45:14.257083046 +0000 UTC m=+0.093208896 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Sep 30 18:45:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:14.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:15 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:45:15.055 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:45:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:15 compute-1 nova_compute[238822]: 2025-09-30 18:45:15.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:15.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:16 compute-1 ceph-mon[75484]: pgmap v2005: 353 pgs: 353 active+clean; 88 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:45:16 compute-1 nova_compute[238822]: 2025-09-30 18:45:16.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:16.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:17 compute-1 sudo[300791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:45:17 compute-1 sudo[300791]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:45:17 compute-1 sudo[300791]: pam_unix(sudo:session): session closed for user root
Sep 30 18:45:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:17 compute-1 sudo[300816]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:45:17 compute-1 sudo[300816]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:45:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:17.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:17 compute-1 sudo[300816]: pam_unix(sudo:session): session closed for user root
Sep 30 18:45:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:18 compute-1 ceph-mon[75484]: pgmap v2006: 353 pgs: 353 active+clean; 88 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:45:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:45:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:45:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:45:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:45:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:45:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:45:18 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:45:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:18.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:45:19 compute-1 openstack_network_exporter[251957]: ERROR   18:45:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:45:19 compute-1 openstack_network_exporter[251957]: ERROR   18:45:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:45:19 compute-1 openstack_network_exporter[251957]: ERROR   18:45:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:45:19 compute-1 openstack_network_exporter[251957]: ERROR   18:45:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:45:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:45:19 compute-1 openstack_network_exporter[251957]: ERROR   18:45:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:45:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:45:19 compute-1 ceph-mon[75484]: pgmap v2007: 353 pgs: 353 active+clean; 88 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 2.0 MiB/s wr, 31 op/s
Sep 30 18:45:19 compute-1 ceph-mon[75484]: pgmap v2008: 353 pgs: 353 active+clean; 88 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 362 B/s rd, 0 op/s
Sep 30 18:45:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:19.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:20 compute-1 nova_compute[238822]: 2025-09-30 18:45:20.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:20 compute-1 sudo[300876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:45:20 compute-1 sudo[300876]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:45:20 compute-1 sudo[300876]: pam_unix(sudo:session): session closed for user root
Sep 30 18:45:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:20.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:21 compute-1 ceph-mon[75484]: pgmap v2009: 353 pgs: 353 active+clean; 88 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 2.7 MiB/s rd, 19 KiB/s wr, 105 op/s
Sep 30 18:45:21 compute-1 nova_compute[238822]: 2025-09-30 18:45:21.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:21.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:45:22 compute-1 podman[300903]: 2025-09-30 18:45:22.566890376 +0000 UTC m=+0.102899397 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:45:22 compute-1 podman[300904]: 2025-09-30 18:45:22.586449112 +0000 UTC m=+0.118334632 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:45:22 compute-1 podman[300905]: 2025-09-30 18:45:22.602104242 +0000 UTC m=+0.126342926 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 18:45:22 compute-1 sudo[300964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:45:22 compute-1 sudo[300964]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:45:22 compute-1 sudo[300964]: pam_unix(sudo:session): session closed for user root
Sep 30 18:45:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:22.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:23 compute-1 ceph-mon[75484]: pgmap v2010: 353 pgs: 353 active+clean; 88 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 2.7 MiB/s rd, 19 KiB/s wr, 104 op/s
Sep 30 18:45:23 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2493896786' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:45:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:45:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:45:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:23.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:45:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:24.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:25 compute-1 nova_compute[238822]: 2025-09-30 18:45:25.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:25 compute-1 ceph-mon[75484]: pgmap v2011: 353 pgs: 353 active+clean; 88 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 2.7 MiB/s rd, 19 KiB/s wr, 105 op/s
Sep 30 18:45:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:25.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:26 compute-1 nova_compute[238822]: 2025-09-30 18:45:26.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:45:26 compute-1 nova_compute[238822]: 2025-09-30 18:45:26.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 18:45:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:26 compute-1 nova_compute[238822]: 2025-09-30 18:45:26.583 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 18:45:26 compute-1 nova_compute[238822]: 2025-09-30 18:45:26.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:26.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:27 compute-1 ceph-mon[75484]: pgmap v2012: 353 pgs: 353 active+clean; 88 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 2.7 MiB/s rd, 19 KiB/s wr, 105 op/s
Sep 30 18:45:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:27.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:28.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:45:29 compute-1 ceph-mon[75484]: pgmap v2013: 353 pgs: 353 active+clean; 88 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 2.3 MiB/s rd, 16 KiB/s wr, 89 op/s
Sep 30 18:45:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:29.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:30 compute-1 nova_compute[238822]: 2025-09-30 18:45:30.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:30 compute-1 sshd-session[300997]: Invalid user infra from 103.153.190.105 port 45729
Sep 30 18:45:30 compute-1 sshd-session[300997]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:45:30 compute-1 sshd-session[300997]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:45:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:30.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:31 compute-1 nova_compute[238822]: 2025-09-30 18:45:31.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:31 compute-1 ceph-mon[75484]: pgmap v2014: 353 pgs: 353 active+clean; 167 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 165 op/s
Sep 30 18:45:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:31.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:32.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:33 compute-1 sshd-session[300997]: Failed password for invalid user infra from 103.153.190.105 port 45729 ssh2
Sep 30 18:45:33 compute-1 ceph-mon[75484]: pgmap v2015: 353 pgs: 353 active+clean; 167 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 342 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:45:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3814742906' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:45:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/336090798' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:45:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:33.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:45:34 compute-1 sshd-session[300997]: Received disconnect from 103.153.190.105 port 45729:11: Bye Bye [preauth]
Sep 30 18:45:34 compute-1 sshd-session[300997]: Disconnected from invalid user infra 103.153.190.105 port 45729 [preauth]
Sep 30 18:45:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:34.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:35 compute-1 nova_compute[238822]: 2025-09-30 18:45:35.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:35 compute-1 podman[249638]: time="2025-09-30T18:45:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:45:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:45:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:45:35 compute-1 ceph-mon[75484]: pgmap v2016: 353 pgs: 353 active+clean; 167 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 343 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:45:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:45:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8374 "" "Go-http-client/1.1"
Sep 30 18:45:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:35.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:35 compute-1 unix_chkpwd[301007]: password check failed for user (root)
Sep 30 18:45:35 compute-1 sshd-session[301004]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:45:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:36 compute-1 nova_compute[238822]: 2025-09-30 18:45:36.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1833220988' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:45:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1833220988' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:45:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:36.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:37 compute-1 ceph-mon[75484]: pgmap v2017: 353 pgs: 353 active+clean; 167 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 342 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:45:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:45:37 compute-1 sshd-session[301004]: Failed password for root from 192.210.160.141 port 52610 ssh2
Sep 30 18:45:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:37.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:38 compute-1 ceph-mon[75484]: pgmap v2018: 353 pgs: 353 active+clean; 167 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 342 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:45:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:38.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:45:39 compute-1 sshd-session[301004]: Connection closed by authenticating user root 192.210.160.141 port 52610 [preauth]
Sep 30 18:45:39 compute-1 unix_chkpwd[301014]: password check failed for user (root)
Sep 30 18:45:39 compute-1 sshd-session[301011]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17  user=root
Sep 30 18:45:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:39.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:40 compute-1 nova_compute[238822]: 2025-09-30 18:45:40.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:40 compute-1 ceph-mon[75484]: pgmap v2019: 353 pgs: 353 active+clean; 167 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 350 KiB/s rd, 3.9 MiB/s wr, 101 op/s
Sep 30 18:45:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:40.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:41 compute-1 sudo[301017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:45:41 compute-1 sudo[301017]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:45:41 compute-1 sudo[301017]: pam_unix(sudo:session): session closed for user root
Sep 30 18:45:41 compute-1 podman[301042]: 2025-09-30 18:45:41.094214864 +0000 UTC m=+0.055245586 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:45:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:41 compute-1 podman[301041]: 2025-09-30 18:45:41.139209853 +0000 UTC m=+0.115559317 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 18:45:41 compute-1 sshd-session[301011]: Failed password for root from 161.132.50.17 port 53340 ssh2
Sep 30 18:45:41 compute-1 nova_compute[238822]: 2025-09-30 18:45:41.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:41.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:42 compute-1 sshd-session[301011]: Received disconnect from 161.132.50.17 port 53340:11: Bye Bye [preauth]
Sep 30 18:45:42 compute-1 sshd-session[301011]: Disconnected from authenticating user root 161.132.50.17 port 53340 [preauth]
Sep 30 18:45:42 compute-1 ceph-mon[75484]: pgmap v2020: 353 pgs: 353 active+clean; 167 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 7.7 KiB/s rd, 26 KiB/s wr, 10 op/s
Sep 30 18:45:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:42.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:43.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:45:44 compute-1 podman[301100]: 2025-09-30 18:45:44.552211939 +0000 UTC m=+0.092072475 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 18:45:44 compute-1 ceph-mon[75484]: pgmap v2021: 353 pgs: 353 active+clean; 167 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 26 KiB/s wr, 74 op/s
Sep 30 18:45:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:44.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:45 compute-1 sshd-session[301095]: Invalid user titu from 14.103.105.56 port 61180
Sep 30 18:45:45 compute-1 sshd-session[301095]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:45:45 compute-1 sshd-session[301095]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.103.105.56
Sep 30 18:45:45 compute-1 nova_compute[238822]: 2025-09-30 18:45:45.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:45.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:46 compute-1 nova_compute[238822]: 2025-09-30 18:45:46.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:46 compute-1 ceph-mon[75484]: pgmap v2022: 353 pgs: 353 active+clean; 167 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:45:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:47.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:47 compute-1 sshd-session[301095]: Failed password for invalid user titu from 14.103.105.56 port 61180 ssh2
Sep 30 18:45:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:47.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:48 compute-1 sshd-session[301095]: Received disconnect from 14.103.105.56 port 61180:11: Bye Bye [preauth]
Sep 30 18:45:48 compute-1 sshd-session[301095]: Disconnected from invalid user titu 14.103.105.56 port 61180 [preauth]
Sep 30 18:45:48 compute-1 ceph-mon[75484]: pgmap v2023: 353 pgs: 353 active+clean; 167 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:45:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:49.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:45:49 compute-1 openstack_network_exporter[251957]: ERROR   18:45:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:45:49 compute-1 openstack_network_exporter[251957]: ERROR   18:45:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:45:49 compute-1 openstack_network_exporter[251957]: ERROR   18:45:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:45:49 compute-1 openstack_network_exporter[251957]: ERROR   18:45:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:45:49 compute-1 openstack_network_exporter[251957]: ERROR   18:45:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:45:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:49.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:49 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Sep 30 18:45:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:50 compute-1 nova_compute[238822]: 2025-09-30 18:45:50.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:50 compute-1 sshd-session[301127]: Invalid user asag from 8.243.64.201 port 39234
Sep 30 18:45:50 compute-1 sshd-session[301127]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:45:50 compute-1 sshd-session[301127]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:45:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:51.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:51 compute-1 ceph-mon[75484]: pgmap v2024: 353 pgs: 353 active+clean; 167 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 74 op/s
Sep 30 18:45:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:51 compute-1 nova_compute[238822]: 2025-09-30 18:45:51.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:51.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:52 compute-1 sshd-session[301127]: Failed password for invalid user asag from 8.243.64.201 port 39234 ssh2
Sep 30 18:45:52 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 18:45:52 compute-1 podman[301132]: 2025-09-30 18:45:52.88253918 +0000 UTC m=+0.091391258 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:45:52 compute-1 podman[301134]: 2025-09-30 18:45:52.898233511 +0000 UTC m=+0.094195522 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=watcher_latest)
Sep 30 18:45:52 compute-1 podman[301133]: 2025-09-30 18:45:52.898525039 +0000 UTC m=+0.099973148 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=edpm, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 18:45:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:53.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:53 compute-1 ceph-mon[75484]: pgmap v2025: 353 pgs: 353 active+clean; 167 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1023 B/s wr, 64 op/s
Sep 30 18:45:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:45:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:53.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:45:54 compute-1 sshd-session[301127]: Received disconnect from 8.243.64.201 port 39234:11: Bye Bye [preauth]
Sep 30 18:45:54 compute-1 sshd-session[301127]: Disconnected from invalid user asag 8.243.64.201 port 39234 [preauth]
Sep 30 18:45:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:45:54.422 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:45:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:45:54.422 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:45:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:45:54.423 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:45:54 compute-1 nova_compute[238822]: 2025-09-30 18:45:54.580 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:45:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:45:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:55.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:45:55 compute-1 ceph-mon[75484]: pgmap v2026: 353 pgs: 353 active+clean; 121 MiB data, 467 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 156 op/s
Sep 30 18:45:55 compute-1 nova_compute[238822]: 2025-09-30 18:45:55.093 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:45:55 compute-1 nova_compute[238822]: 2025-09-30 18:45:55.093 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:45:55 compute-1 nova_compute[238822]: 2025-09-30 18:45:55.094 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:45:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:55 compute-1 nova_compute[238822]: 2025-09-30 18:45:55.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:55 compute-1 nova_compute[238822]: 2025-09-30 18:45:55.608 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:45:55 compute-1 nova_compute[238822]: 2025-09-30 18:45:55.609 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:45:55 compute-1 nova_compute[238822]: 2025-09-30 18:45:55.609 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:45:55 compute-1 nova_compute[238822]: 2025-09-30 18:45:55.609 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:45:55 compute-1 nova_compute[238822]: 2025-09-30 18:45:55.610 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:45:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:55.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:55 compute-1 sshd-session[301190]: Invalid user old from 49.49.32.245 port 37730
Sep 30 18:45:56 compute-1 sshd-session[301190]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:45:56 compute-1 sshd-session[301190]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245
Sep 30 18:45:56 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:45:56 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4265865493' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:45:56 compute-1 nova_compute[238822]: 2025-09-30 18:45:56.096 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:45:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:56 compute-1 nova_compute[238822]: 2025-09-30 18:45:56.326 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:45:56 compute-1 nova_compute[238822]: 2025-09-30 18:45:56.329 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:45:56 compute-1 nova_compute[238822]: 2025-09-30 18:45:56.358 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:45:56 compute-1 nova_compute[238822]: 2025-09-30 18:45:56.359 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4694MB free_disk=39.94667053222656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:45:56 compute-1 nova_compute[238822]: 2025-09-30 18:45:56.360 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:45:56 compute-1 nova_compute[238822]: 2025-09-30 18:45:56.360 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:45:56 compute-1 nova_compute[238822]: 2025-09-30 18:45:56.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:45:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:57.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:57 compute-1 ceph-mon[75484]: pgmap v2027: 353 pgs: 353 active+clean; 121 MiB data, 467 MiB used, 40 GiB / 40 GiB avail; 346 KiB/s rd, 2.1 MiB/s wr, 92 op/s
Sep 30 18:45:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4265865493' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:45:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/258217513' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:45:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:57 compute-1 nova_compute[238822]: 2025-09-30 18:45:57.477 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:45:57 compute-1 nova_compute[238822]: 2025-09-30 18:45:57.477 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:45:56 up  4:23,  0 user,  load average: 0.39, 0.46, 0.53\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:45:57 compute-1 nova_compute[238822]: 2025-09-30 18:45:57.525 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing inventories for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 18:45:57 compute-1 nova_compute[238822]: 2025-09-30 18:45:57.570 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating ProviderTree inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 18:45:57 compute-1 nova_compute[238822]: 2025-09-30 18:45:57.570 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 18:45:57 compute-1 nova_compute[238822]: 2025-09-30 18:45:57.589 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing aggregate associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 18:45:57 compute-1 nova_compute[238822]: 2025-09-30 18:45:57.613 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing trait associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE41,HW_CPU_X86_ABM,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_TIS,HW_ARCH_X86_64,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 18:45:57 compute-1 nova_compute[238822]: 2025-09-30 18:45:57.628 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:45:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:57.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:57 compute-1 sshd-session[301190]: Failed password for invalid user old from 49.49.32.245 port 37730 ssh2
Sep 30 18:45:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:45:58 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4201903888' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:45:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4248379773' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:45:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4248379773' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:45:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1949252793' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:45:58 compute-1 nova_compute[238822]: 2025-09-30 18:45:58.092 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:45:58 compute-1 nova_compute[238822]: 2025-09-30 18:45:58.102 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:45:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:58 compute-1 nova_compute[238822]: 2025-09-30 18:45:58.615 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:45:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:45:59.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:59 compute-1 ceph-mon[75484]: pgmap v2028: 353 pgs: 353 active+clean; 121 MiB data, 467 MiB used, 40 GiB / 40 GiB avail; 346 KiB/s rd, 2.1 MiB/s wr, 92 op/s
Sep 30 18:45:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4201903888' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:45:59 compute-1 nova_compute[238822]: 2025-09-30 18:45:59.127 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:45:59 compute-1 nova_compute[238822]: 2025-09-30 18:45:59.128 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.767s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:45:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:45:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:45:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:45:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:45:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:45:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:45:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:45:59.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:45:59 compute-1 sshd-session[301190]: Received disconnect from 49.49.32.245 port 37730:11: Bye Bye [preauth]
Sep 30 18:45:59 compute-1 sshd-session[301190]: Disconnected from invalid user old 49.49.32.245 port 37730 [preauth]
Sep 30 18:46:00 compute-1 nova_compute[238822]: 2025-09-30 18:46:00.092 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:46:00 compute-1 nova_compute[238822]: 2025-09-30 18:46:00.093 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:46:00 compute-1 nova_compute[238822]: 2025-09-30 18:46:00.093 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:46:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:00 compute-1 nova_compute[238822]: 2025-09-30 18:46:00.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:01.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:01 compute-1 ceph-mon[75484]: pgmap v2029: 353 pgs: 353 active+clean; 121 MiB data, 424 MiB used, 40 GiB / 40 GiB avail; 346 KiB/s rd, 2.1 MiB/s wr, 92 op/s
Sep 30 18:46:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/504335263' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:46:01 compute-1 sudo[301244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:46:01 compute-1 sudo[301244]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:46:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:01 compute-1 sudo[301244]: pam_unix(sudo:session): session closed for user root
Sep 30 18:46:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:01 compute-1 nova_compute[238822]: 2025-09-30 18:46:01.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:01.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:03.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:03 compute-1 ceph-mon[75484]: pgmap v2030: 353 pgs: 353 active+clean; 121 MiB data, 424 MiB used, 40 GiB / 40 GiB avail; 346 KiB/s rd, 2.1 MiB/s wr, 92 op/s
Sep 30 18:46:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:03.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:46:04 compute-1 unix_chkpwd[301275]: password check failed for user (root)
Sep 30 18:46:04 compute-1 sshd-session[301270]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:46:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:05.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:05 compute-1 nova_compute[238822]: 2025-09-30 18:46:05.054 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:46:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:05 compute-1 ceph-mon[75484]: pgmap v2031: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 365 KiB/s rd, 2.1 MiB/s wr, 120 op/s
Sep 30 18:46:05 compute-1 nova_compute[238822]: 2025-09-30 18:46:05.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:05 compute-1 podman[249638]: time="2025-09-30T18:46:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:46:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:46:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:46:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:46:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8368 "" "Go-http-client/1.1"
Sep 30 18:46:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:05.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/783028067' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:46:06 compute-1 sshd-session[301270]: Failed password for root from 192.210.160.141 port 36548 ssh2
Sep 30 18:46:06 compute-1 nova_compute[238822]: 2025-09-30 18:46:06.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:07.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:07 compute-1 ceph-mon[75484]: pgmap v2032: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 3.5 KiB/s wr, 28 op/s
Sep 30 18:46:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:07.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:08 compute-1 nova_compute[238822]: 2025-09-30 18:46:08.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:46:08 compute-1 sshd-session[301270]: Connection closed by authenticating user root 192.210.160.141 port 36548 [preauth]
Sep 30 18:46:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:46:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:09.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:09 compute-1 ceph-mon[75484]: pgmap v2033: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 3.5 KiB/s wr, 28 op/s
Sep 30 18:46:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:46:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:09.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:10 compute-1 nova_compute[238822]: 2025-09-30 18:46:10.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:46:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:10 compute-1 nova_compute[238822]: 2025-09-30 18:46:10.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:46:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:11.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:46:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:11 compute-1 ceph-mon[75484]: pgmap v2034: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 3.5 KiB/s wr, 28 op/s
Sep 30 18:46:11 compute-1 sshd-session[301280]: Invalid user mysql from 185.156.73.233 port 43654
Sep 30 18:46:11 compute-1 sshd-session[301280]: Failed none for invalid user mysql from 185.156.73.233 port 43654 ssh2
Sep 30 18:46:11 compute-1 podman[301285]: 2025-09-30 18:46:11.569003694 +0000 UTC m=+0.100030276 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:46:11 compute-1 podman[301284]: 2025-09-30 18:46:11.608074793 +0000 UTC m=+0.148217319 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:46:11 compute-1 sshd-session[301280]: Connection closed by invalid user mysql 185.156.73.233 port 43654 [preauth]
Sep 30 18:46:11 compute-1 nova_compute[238822]: 2025-09-30 18:46:11.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:11.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:13.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:13 compute-1 ceph-mon[75484]: pgmap v2035: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:46:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:13.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:46:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:15.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:15 compute-1 ceph-mon[75484]: pgmap v2036: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:46:15 compute-1 nova_compute[238822]: 2025-09-30 18:46:15.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:15 compute-1 podman[301340]: 2025-09-30 18:46:15.551207256 +0000 UTC m=+0.085506086 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Sep 30 18:46:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:15.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:16 compute-1 nova_compute[238822]: 2025-09-30 18:46:16.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:17.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:17 compute-1 ceph-mon[75484]: pgmap v2037: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:46:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:17.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:19.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:46:19 compute-1 ceph-mon[75484]: pgmap v2038: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:46:19 compute-1 openstack_network_exporter[251957]: ERROR   18:46:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:46:19 compute-1 openstack_network_exporter[251957]: ERROR   18:46:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:46:19 compute-1 openstack_network_exporter[251957]: ERROR   18:46:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:46:19 compute-1 openstack_network_exporter[251957]: ERROR   18:46:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:46:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:46:19 compute-1 openstack_network_exporter[251957]: ERROR   18:46:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:46:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:46:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:19.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:20 compute-1 nova_compute[238822]: 2025-09-30 18:46:20.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:21.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:21 compute-1 sudo[301366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:46:21 compute-1 sudo[301366]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:46:21 compute-1 sudo[301366]: pam_unix(sudo:session): session closed for user root
Sep 30 18:46:21 compute-1 ceph-mon[75484]: pgmap v2039: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:46:21 compute-1 nova_compute[238822]: 2025-09-30 18:46:21.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:21.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:23 compute-1 sudo[301393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:46:23 compute-1 sudo[301393]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:46:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:23.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:23 compute-1 sudo[301393]: pam_unix(sudo:session): session closed for user root
Sep 30 18:46:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:23 compute-1 podman[301419]: 2025-09-30 18:46:23.143759829 +0000 UTC m=+0.069540068 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:46:23 compute-1 podman[301418]: 2025-09-30 18:46:23.143789689 +0000 UTC m=+0.087618422 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Sep 30 18:46:23 compute-1 podman[301417]: 2025-09-30 18:46:23.144036386 +0000 UTC m=+0.086743309 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:46:23 compute-1 sudo[301432]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:46:23 compute-1 sudo[301432]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:46:23 compute-1 ceph-mon[75484]: pgmap v2040: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:46:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:46:23 compute-1 sudo[301432]: pam_unix(sudo:session): session closed for user root
Sep 30 18:46:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:23.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:46:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:46:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:46:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:46:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:46:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:46:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:46:24 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:46:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:25.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:25 compute-1 nova_compute[238822]: 2025-09-30 18:46:25.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:25 compute-1 ceph-mon[75484]: pgmap v2041: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 776 B/s rd, 0 op/s
Sep 30 18:46:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:25.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:26 compute-1 nova_compute[238822]: 2025-09-30 18:46:26.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:26.293 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:46:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:26.296 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:46:26 compute-1 nova_compute[238822]: 2025-09-30 18:46:26.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:27.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:27 compute-1 ceph-mon[75484]: pgmap v2042: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 517 B/s rd, 0 op/s
Sep 30 18:46:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:27.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:29.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:46:29 compute-1 sudo[301543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:46:29 compute-1 sudo[301543]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:46:29 compute-1 sudo[301543]: pam_unix(sudo:session): session closed for user root
Sep 30 18:46:29 compute-1 ceph-mon[75484]: pgmap v2043: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 517 B/s rd, 0 op/s
Sep 30 18:46:29 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:46:29 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:46:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:29.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:30 compute-1 nova_compute[238822]: 2025-09-30 18:46:30.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:31.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:31 compute-1 ceph-mon[75484]: pgmap v2044: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 776 B/s rd, 0 op/s
Sep 30 18:46:31 compute-1 sshd-session[301542]: Invalid user iptv from 192.210.160.141 port 34072
Sep 30 18:46:31 compute-1 sshd-session[301542]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:46:31 compute-1 sshd-session[301542]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:46:31 compute-1 nova_compute[238822]: 2025-09-30 18:46:31.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:31.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:33.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:33 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:33.300 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:46:33 compute-1 ceph-mon[75484]: pgmap v2045: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 517 B/s rd, 0 op/s
Sep 30 18:46:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:33.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:34 compute-1 sshd-session[301542]: Failed password for invalid user iptv from 192.210.160.141 port 34072 ssh2
Sep 30 18:46:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:46:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:35.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:35.222 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:a8:99 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4658d55-a8f9-48f1-846d-61df3d830821', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67cbb3b670e445a4b97abcc92749d126', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0884332-fe68-47c8-9c8c-5c6a7c53f7f5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=862fbe9e-132a-4b8a-83f6-7b020c6192ad) old=Port_Binding(mac=['fa:16:3e:37:a8:99'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4658d55-a8f9-48f1-846d-61df3d830821', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67cbb3b670e445a4b97abcc92749d126', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:46:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:35.223 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 862fbe9e-132a-4b8a-83f6-7b020c6192ad in datapath f4658d55-a8f9-48f1-846d-61df3d830821 updated
Sep 30 18:46:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:35.224 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f4658d55-a8f9-48f1-846d-61df3d830821, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:46:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:35.228 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[01cfa4c2-c9b3-4bc8-837a-12d13d25c726]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:46:35 compute-1 nova_compute[238822]: 2025-09-30 18:46:35.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:35 compute-1 podman[249638]: time="2025-09-30T18:46:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:46:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:46:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:46:35 compute-1 ceph-mon[75484]: pgmap v2046: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 776 B/s rd, 0 op/s
Sep 30 18:46:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:46:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8374 "" "Go-http-client/1.1"
Sep 30 18:46:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:35.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2367675944' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:46:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2367675944' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:46:36 compute-1 sshd-session[301542]: Connection closed by invalid user iptv 192.210.160.141 port 34072 [preauth]
Sep 30 18:46:36 compute-1 nova_compute[238822]: 2025-09-30 18:46:36.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:37.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:37 compute-1 ceph-mon[75484]: pgmap v2047: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:46:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:46:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:37.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:39.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:46:39 compute-1 ceph-mon[75484]: pgmap v2048: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:46:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:39.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:40 compute-1 nova_compute[238822]: 2025-09-30 18:46:40.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000053s ======
Sep 30 18:46:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:41.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Sep 30 18:46:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:41 compute-1 sudo[301581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:46:41 compute-1 sudo[301581]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:46:41 compute-1 sudo[301581]: pam_unix(sudo:session): session closed for user root
Sep 30 18:46:41 compute-1 ceph-mon[75484]: pgmap v2049: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:46:41 compute-1 nova_compute[238822]: 2025-09-30 18:46:41.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:41.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:42 compute-1 podman[301608]: 2025-09-30 18:46:42.554644816 +0000 UTC m=+0.085846065 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:46:42 compute-1 podman[301607]: 2025-09-30 18:46:42.598656058 +0000 UTC m=+0.132632951 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 18:46:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:43.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:43 compute-1 ceph-mon[75484]: pgmap v2050: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:46:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:46:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:43.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:46:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:44.153 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:ac:29 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d3db2eea-4a5b-4481-bccc-d3aa8bd5a6db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d3db2eea-4a5b-4481-bccc-d3aa8bd5a6db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3359c464e0344756a39ce5c7088b9eba', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49a549bb-c1b5-46c1-833a-6afe8a0d1bab, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=93d705df-d94d-4266-a1a5-9ccb90896904) old=Port_Binding(mac=['fa:16:3e:57:ac:29'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d3db2eea-4a5b-4481-bccc-d3aa8bd5a6db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d3db2eea-4a5b-4481-bccc-d3aa8bd5a6db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3359c464e0344756a39ce5c7088b9eba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:46:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:44.155 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 93d705df-d94d-4266-a1a5-9ccb90896904 in datapath d3db2eea-4a5b-4481-bccc-d3aa8bd5a6db updated
Sep 30 18:46:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:44.156 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d3db2eea-4a5b-4481-bccc-d3aa8bd5a6db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:46:44 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:44.158 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa58d52-76d4-44d3-8462-0887652c67a9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:46:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:46:44 compute-1 ceph-mon[75484]: pgmap v2051: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:46:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:45.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:45 compute-1 nova_compute[238822]: 2025-09-30 18:46:45.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:45.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:46 compute-1 podman[301661]: 2025-09-30 18:46:46.565988551 +0000 UTC m=+0.106798928 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 18:46:46 compute-1 nova_compute[238822]: 2025-09-30 18:46:46.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:46 compute-1 ceph-mon[75484]: pgmap v2052: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:46:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:47.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:47.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:48 compute-1 ceph-mon[75484]: pgmap v2053: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:46:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:49.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:46:49 compute-1 openstack_network_exporter[251957]: ERROR   18:46:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:46:49 compute-1 openstack_network_exporter[251957]: ERROR   18:46:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:46:49 compute-1 openstack_network_exporter[251957]: ERROR   18:46:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:46:49 compute-1 openstack_network_exporter[251957]: ERROR   18:46:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:46:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:46:49 compute-1 openstack_network_exporter[251957]: ERROR   18:46:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:46:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:46:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:49.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:50 compute-1 sshd-session[301683]: Invalid user dvs from 161.132.50.17 port 40468
Sep 30 18:46:50 compute-1 sshd-session[301683]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:46:50 compute-1 sshd-session[301683]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 18:46:50 compute-1 nova_compute[238822]: 2025-09-30 18:46:50.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:50 compute-1 ceph-mon[75484]: pgmap v2054: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:46:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:51.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:51 compute-1 nova_compute[238822]: 2025-09-30 18:46:51.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:46:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:51.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:46:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:52 compute-1 sshd-session[301683]: Failed password for invalid user dvs from 161.132.50.17 port 40468 ssh2
Sep 30 18:46:52 compute-1 ceph-mon[75484]: pgmap v2055: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:46:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:46:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:53.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:53 compute-1 podman[301689]: 2025-09-30 18:46:53.562335792 +0000 UTC m=+0.098376301 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 18:46:53 compute-1 podman[301690]: 2025-09-30 18:46:53.574864558 +0000 UTC m=+0.104075124 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1755695350, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, name=ubi9-minimal, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Sep 30 18:46:53 compute-1 sshd-session[301683]: Received disconnect from 161.132.50.17 port 40468:11: Bye Bye [preauth]
Sep 30 18:46:53 compute-1 sshd-session[301683]: Disconnected from invalid user dvs 161.132.50.17 port 40468 [preauth]
Sep 30 18:46:53 compute-1 podman[301691]: 2025-09-30 18:46:53.592669326 +0000 UTC m=+0.119679663 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:46:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:53.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:54 compute-1 nova_compute[238822]: 2025-09-30 18:46:54.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:46:54 compute-1 nova_compute[238822]: 2025-09-30 18:46:54.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:46:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:46:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:54.425 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:46:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:54.428 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:46:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:46:54.429 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:46:54 compute-1 ceph-mon[75484]: pgmap v2056: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:46:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:55.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:55 compute-1 nova_compute[238822]: 2025-09-30 18:46:55.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:55.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:56 compute-1 nova_compute[238822]: 2025-09-30 18:46:56.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:46:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:56 compute-1 nova_compute[238822]: 2025-09-30 18:46:56.578 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:46:56 compute-1 nova_compute[238822]: 2025-09-30 18:46:56.579 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:46:56 compute-1 nova_compute[238822]: 2025-09-30 18:46:56.579 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:46:56 compute-1 nova_compute[238822]: 2025-09-30 18:46:56.579 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:46:56 compute-1 nova_compute[238822]: 2025-09-30 18:46:56.580 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:46:56 compute-1 nova_compute[238822]: 2025-09-30 18:46:56.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:46:56 compute-1 ceph-mon[75484]: pgmap v2057: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:46:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:46:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1258656081' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:46:57 compute-1 nova_compute[238822]: 2025-09-30 18:46:57.085 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:46:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:57.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:57 compute-1 nova_compute[238822]: 2025-09-30 18:46:57.342 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:46:57 compute-1 nova_compute[238822]: 2025-09-30 18:46:57.345 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:46:57 compute-1 nova_compute[238822]: 2025-09-30 18:46:57.372 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:46:57 compute-1 nova_compute[238822]: 2025-09-30 18:46:57.373 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4697MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:46:57 compute-1 nova_compute[238822]: 2025-09-30 18:46:57.374 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:46:57 compute-1 nova_compute[238822]: 2025-09-30 18:46:57.375 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:46:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:46:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:57.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:46:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1258656081' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:46:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4111823747' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:46:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/402236633' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:46:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/402236633' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:46:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:58 compute-1 sshd-session[301780]: Invalid user user3 from 8.243.64.201 port 57688
Sep 30 18:46:58 compute-1 sshd-session[301780]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:46:58 compute-1 sshd-session[301780]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:46:58 compute-1 nova_compute[238822]: 2025-09-30 18:46:58.439 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:46:58 compute-1 nova_compute[238822]: 2025-09-30 18:46:58.439 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:46:57 up  4:24,  0 user,  load average: 0.30, 0.41, 0.51\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:46:58 compute-1 nova_compute[238822]: 2025-09-30 18:46:58.452 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:46:58 compute-1 ceph-mon[75484]: pgmap v2058: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:46:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3173180938' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:46:58 compute-1 nova_compute[238822]: 2025-09-30 18:46:58.954 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:46:58 compute-1 nova_compute[238822]: 2025-09-30 18:46:58.965 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:46:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:46:59.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:46:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:46:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:46:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:46:59 compute-1 nova_compute[238822]: 2025-09-30 18:46:59.476 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:46:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:46:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:46:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:46:59.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:46:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2989727732' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:46:59 compute-1 nova_compute[238822]: 2025-09-30 18:46:59.990 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:46:59 compute-1 nova_compute[238822]: 2025-09-30 18:46:59.991 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.616s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:47:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:00 compute-1 unix_chkpwd[301809]: password check failed for user (root)
Sep 30 18:47:00 compute-1 sshd-session[301783]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:47:00 compute-1 nova_compute[238822]: 2025-09-30 18:47:00.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:00 compute-1 sshd-session[301780]: Failed password for invalid user user3 from 8.243.64.201 port 57688 ssh2
Sep 30 18:47:00 compute-1 ceph-mon[75484]: pgmap v2059: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:47:00 compute-1 nova_compute[238822]: 2025-09-30 18:47:00.991 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:47:00 compute-1 nova_compute[238822]: 2025-09-30 18:47:00.991 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:47:00 compute-1 nova_compute[238822]: 2025-09-30 18:47:00.992 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:47:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:01.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:01 compute-1 sudo[301811]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:47:01 compute-1 sudo[301811]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:47:01 compute-1 sudo[301811]: pam_unix(sudo:session): session closed for user root
Sep 30 18:47:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:01.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:01 compute-1 nova_compute[238822]: 2025-09-30 18:47:01.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:02 compute-1 sshd-session[301783]: Failed password for root from 192.210.160.141 port 41904 ssh2
Sep 30 18:47:02 compute-1 sshd-session[301780]: Received disconnect from 8.243.64.201 port 57688:11: Bye Bye [preauth]
Sep 30 18:47:02 compute-1 sshd-session[301780]: Disconnected from invalid user user3 8.243.64.201 port 57688 [preauth]
Sep 30 18:47:02 compute-1 ceph-mon[75484]: pgmap v2060: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:47:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1250951644' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:47:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:03.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:03 compute-1 sshd-session[301783]: Connection closed by authenticating user root 192.210.160.141 port 41904 [preauth]
Sep 30 18:47:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:03.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:47:05 compute-1 ceph-mon[75484]: pgmap v2061: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:47:05 compute-1 nova_compute[238822]: 2025-09-30 18:47:05.054 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:47:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:05.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:05 compute-1 nova_compute[238822]: 2025-09-30 18:47:05.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:05 compute-1 podman[249638]: time="2025-09-30T18:47:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:47:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:47:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:47:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:47:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8363 "" "Go-http-client/1.1"
Sep 30 18:47:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:05.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:06 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Sep 30 18:47:06 compute-1 sshd-session[301840]: Invalid user k8s from 49.49.32.245 port 32936
Sep 30 18:47:06 compute-1 sshd-session[301840]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:47:06 compute-1 sshd-session[301840]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245
Sep 30 18:47:06 compute-1 nova_compute[238822]: 2025-09-30 18:47:06.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:07 compute-1 ceph-mon[75484]: pgmap v2062: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:47:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:07.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:07.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:47:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:09 compute-1 ceph-mon[75484]: pgmap v2063: 353 pgs: 353 active+clean; 41 MiB data, 378 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:47:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:09.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:09 compute-1 sshd-session[301840]: Failed password for invalid user k8s from 49.49.32.245 port 32936 ssh2
Sep 30 18:47:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:47:09 compute-1 sshd-session[301840]: Received disconnect from 49.49.32.245 port 32936:11: Bye Bye [preauth]
Sep 30 18:47:09 compute-1 sshd-session[301840]: Disconnected from invalid user k8s 49.49.32.245 port 32936 [preauth]
Sep 30 18:47:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:09.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:10 compute-1 nova_compute[238822]: 2025-09-30 18:47:10.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:47:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:10 compute-1 nova_compute[238822]: 2025-09-30 18:47:10.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:11 compute-1 ceph-mon[75484]: pgmap v2064: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:47:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/478917871' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:47:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1807568960' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:47:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:11.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:11.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:11 compute-1 nova_compute[238822]: 2025-09-30 18:47:11.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:12 compute-1 nova_compute[238822]: 2025-09-30 18:47:12.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:47:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:13 compute-1 ceph-mon[75484]: pgmap v2065: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:47:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:13.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:13 compute-1 podman[301851]: 2025-09-30 18:47:13.574629985 +0000 UTC m=+0.100882359 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:47:13 compute-1 podman[301850]: 2025-09-30 18:47:13.621966904 +0000 UTC m=+0.160168809 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:47:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:47:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:13.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:47:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:47:15 compute-1 ceph-mon[75484]: pgmap v2066: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:47:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:15.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:15 compute-1 nova_compute[238822]: 2025-09-30 18:47:15.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:15.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:16 compute-1 nova_compute[238822]: 2025-09-30 18:47:16.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:17 compute-1 ceph-mon[75484]: pgmap v2067: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:47:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:17.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:17 compute-1 podman[301904]: 2025-09-30 18:47:17.548365128 +0000 UTC m=+0.079509375 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 18:47:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:17.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:19 compute-1 ceph-mon[75484]: pgmap v2068: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:47:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:47:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:19.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:47:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:47:19 compute-1 openstack_network_exporter[251957]: ERROR   18:47:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:47:19 compute-1 openstack_network_exporter[251957]: ERROR   18:47:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:47:19 compute-1 openstack_network_exporter[251957]: ERROR   18:47:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:47:19 compute-1 openstack_network_exporter[251957]: ERROR   18:47:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:47:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:47:19 compute-1 openstack_network_exporter[251957]: ERROR   18:47:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:47:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:47:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:19.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:20 compute-1 nova_compute[238822]: 2025-09-30 18:47:20.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:20 compute-1 nova_compute[238822]: 2025-09-30 18:47:20.857 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:47:20 compute-1 nova_compute[238822]: 2025-09-30 18:47:20.858 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:47:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:21 compute-1 ceph-mon[75484]: pgmap v2069: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Sep 30 18:47:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:21.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:21 compute-1 nova_compute[238822]: 2025-09-30 18:47:21.363 2 DEBUG nova.compute.manager [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:47:21 compute-1 sudo[301928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:47:21 compute-1 sudo[301928]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:47:21 compute-1 sudo[301928]: pam_unix(sudo:session): session closed for user root
Sep 30 18:47:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:21.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:21 compute-1 nova_compute[238822]: 2025-09-30 18:47:21.936 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:47:21 compute-1 nova_compute[238822]: 2025-09-30 18:47:21.936 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:47:21 compute-1 nova_compute[238822]: 2025-09-30 18:47:21.945 2 DEBUG nova.virt.hardware [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:47:21 compute-1 nova_compute[238822]: 2025-09-30 18:47:21.946 2 INFO nova.compute.claims [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:47:21 compute-1 nova_compute[238822]: 2025-09-30 18:47:21.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:23 compute-1 nova_compute[238822]: 2025-09-30 18:47:23.009 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:47:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:23.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:23 compute-1 ceph-mon[75484]: pgmap v2070: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:47:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:47:23 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:47:23 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/774896620' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:47:23 compute-1 nova_compute[238822]: 2025-09-30 18:47:23.497 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:47:23 compute-1 nova_compute[238822]: 2025-09-30 18:47:23.507 2 DEBUG nova.compute.provider_tree [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:47:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:23.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:24 compute-1 nova_compute[238822]: 2025-09-30 18:47:24.020 2 DEBUG nova.scheduler.client.report [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:47:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:24 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/774896620' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:47:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:47:24 compute-1 nova_compute[238822]: 2025-09-30 18:47:24.535 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.598s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:47:24 compute-1 nova_compute[238822]: 2025-09-30 18:47:24.535 2 DEBUG nova.compute.manager [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:47:24 compute-1 podman[301978]: 2025-09-30 18:47:24.546096155 +0000 UTC m=+0.088050504 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 18:47:24 compute-1 podman[301981]: 2025-09-30 18:47:24.558218291 +0000 UTC m=+0.085203038 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 18:47:24 compute-1 podman[301980]: 2025-09-30 18:47:24.570142011 +0000 UTC m=+0.103023696 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 18:47:25 compute-1 nova_compute[238822]: 2025-09-30 18:47:25.049 2 DEBUG nova.compute.manager [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:47:25 compute-1 nova_compute[238822]: 2025-09-30 18:47:25.049 2 DEBUG nova.network.neutron [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:47:25 compute-1 nova_compute[238822]: 2025-09-30 18:47:25.050 2 WARNING neutronclient.v2_0.client [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:47:25 compute-1 nova_compute[238822]: 2025-09-30 18:47:25.050 2 WARNING neutronclient.v2_0.client [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:47:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:25.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:25 compute-1 ceph-mon[75484]: pgmap v2071: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:47:25 compute-1 nova_compute[238822]: 2025-09-30 18:47:25.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:25 compute-1 nova_compute[238822]: 2025-09-30 18:47:25.563 2 INFO nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:47:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:25.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:26 compute-1 nova_compute[238822]: 2025-09-30 18:47:26.076 2 DEBUG nova.compute.manager [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:47:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:26 compute-1 unix_chkpwd[302040]: password check failed for user (root)
Sep 30 18:47:26 compute-1 sshd-session[301979]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.109 2 DEBUG nova.compute.manager [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.111 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.112 2 INFO nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Creating image(s)
Sep 30 18:47:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.162 2 DEBUG nova.storage.rbd_utils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:47:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:27.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:27 compute-1 ceph-mon[75484]: pgmap v2072: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.210 2 DEBUG nova.storage.rbd_utils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.254 2 DEBUG nova.storage.rbd_utils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.260 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.278 2 DEBUG nova.network.neutron [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Successfully created port: 04db1ec4-f9a9-4209-8ff9-65cb658acd60 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.355 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.357 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.358 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.358 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.394 2 DEBUG nova.storage.rbd_utils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.399 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.715 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.316s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.804 2 DEBUG nova.storage.rbd_utils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] resizing rbd image cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:47:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:27.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.937 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.939 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Ensure instance console log exists: /var/lib/nova/instances/cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.939 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.940 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:47:27 compute-1 nova_compute[238822]: 2025-09-30 18:47:27.941 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:47:28 compute-1 nova_compute[238822]: 2025-09-30 18:47:28.147 2 DEBUG nova.network.neutron [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Successfully updated port: 04db1ec4-f9a9-4209-8ff9-65cb658acd60 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:47:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:28 compute-1 nova_compute[238822]: 2025-09-30 18:47:28.220 2 DEBUG nova.compute.manager [req-4bf28cec-9a7f-4119-8e4b-3b43d6f688c7 req-1ed24c57-41f2-4f60-94ca-784cffb5a248 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Received event network-changed-04db1ec4-f9a9-4209-8ff9-65cb658acd60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:47:28 compute-1 nova_compute[238822]: 2025-09-30 18:47:28.220 2 DEBUG nova.compute.manager [req-4bf28cec-9a7f-4119-8e4b-3b43d6f688c7 req-1ed24c57-41f2-4f60-94ca-784cffb5a248 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Refreshing instance network info cache due to event network-changed-04db1ec4-f9a9-4209-8ff9-65cb658acd60. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:47:28 compute-1 nova_compute[238822]: 2025-09-30 18:47:28.221 2 DEBUG oslo_concurrency.lockutils [req-4bf28cec-9a7f-4119-8e4b-3b43d6f688c7 req-1ed24c57-41f2-4f60-94ca-784cffb5a248 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:47:28 compute-1 nova_compute[238822]: 2025-09-30 18:47:28.221 2 DEBUG oslo_concurrency.lockutils [req-4bf28cec-9a7f-4119-8e4b-3b43d6f688c7 req-1ed24c57-41f2-4f60-94ca-784cffb5a248 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:47:28 compute-1 nova_compute[238822]: 2025-09-30 18:47:28.222 2 DEBUG nova.network.neutron [req-4bf28cec-9a7f-4119-8e4b-3b43d6f688c7 req-1ed24c57-41f2-4f60-94ca-784cffb5a248 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Refreshing network info cache for port 04db1ec4-f9a9-4209-8ff9-65cb658acd60 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:47:28 compute-1 sshd-session[301979]: Failed password for root from 192.210.160.141 port 58476 ssh2
Sep 30 18:47:28 compute-1 nova_compute[238822]: 2025-09-30 18:47:28.658 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "refresh_cache-cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:47:28 compute-1 nova_compute[238822]: 2025-09-30 18:47:28.730 2 WARNING neutronclient.v2_0.client [req-4bf28cec-9a7f-4119-8e4b-3b43d6f688c7 req-1ed24c57-41f2-4f60-94ca-784cffb5a248 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:47:29 compute-1 nova_compute[238822]: 2025-09-30 18:47:29.023 2 DEBUG nova.network.neutron [req-4bf28cec-9a7f-4119-8e4b-3b43d6f688c7 req-1ed24c57-41f2-4f60-94ca-784cffb5a248 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:47:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:29.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:29 compute-1 ceph-mon[75484]: pgmap v2073: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:47:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:47:29 compute-1 nova_compute[238822]: 2025-09-30 18:47:29.252 2 DEBUG nova.network.neutron [req-4bf28cec-9a7f-4119-8e4b-3b43d6f688c7 req-1ed24c57-41f2-4f60-94ca-784cffb5a248 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:47:29 compute-1 sudo[302210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:47:29 compute-1 sudo[302210]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:47:29 compute-1 sudo[302210]: pam_unix(sudo:session): session closed for user root
Sep 30 18:47:29 compute-1 nova_compute[238822]: 2025-09-30 18:47:29.763 2 DEBUG oslo_concurrency.lockutils [req-4bf28cec-9a7f-4119-8e4b-3b43d6f688c7 req-1ed24c57-41f2-4f60-94ca-784cffb5a248 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:47:29 compute-1 nova_compute[238822]: 2025-09-30 18:47:29.764 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquired lock "refresh_cache-cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:47:29 compute-1 nova_compute[238822]: 2025-09-30 18:47:29.764 2 DEBUG nova.network.neutron [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:47:29 compute-1 sudo[302235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:47:29 compute-1 sudo[302235]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:47:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:29.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:29 compute-1 sshd-session[301979]: Connection closed by authenticating user root 192.210.160.141 port 58476 [preauth]
Sep 30 18:47:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:30 compute-1 nova_compute[238822]: 2025-09-30 18:47:30.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:30 compute-1 sudo[302235]: pam_unix(sudo:session): session closed for user root
Sep 30 18:47:31 compute-1 nova_compute[238822]: 2025-09-30 18:47:31.029 2 DEBUG nova.network.neutron [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:47:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:31.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:31 compute-1 ceph-mon[75484]: pgmap v2074: 353 pgs: 353 active+clean; 167 MiB data, 456 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 163 op/s
Sep 30 18:47:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:47:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:47:31 compute-1 ceph-mon[75484]: pgmap v2075: 353 pgs: 353 active+clean; 167 MiB data, 456 MiB used, 40 GiB / 40 GiB avail; 375 KiB/s rd, 4.3 MiB/s wr, 98 op/s
Sep 30 18:47:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:47:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:47:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:47:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:47:31 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:47:31 compute-1 nova_compute[238822]: 2025-09-30 18:47:31.275 2 WARNING neutronclient.v2_0.client [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:47:31 compute-1 nova_compute[238822]: 2025-09-30 18:47:31.573 2 DEBUG nova.network.neutron [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Updating instance_info_cache with network_info: [{"id": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "address": "fa:16:3e:6b:fa:19", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04db1ec4-f9", "ovs_interfaceid": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:47:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:31.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.080 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Releasing lock "refresh_cache-cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.081 2 DEBUG nova.compute.manager [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Instance network_info: |[{"id": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "address": "fa:16:3e:6b:fa:19", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04db1ec4-f9", "ovs_interfaceid": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.085 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Start _get_guest_xml network_info=[{"id": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "address": "fa:16:3e:6b:fa:19", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04db1ec4-f9", "ovs_interfaceid": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.091 2 WARNING nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.093 2 DEBUG nova.virt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-1103328043', uuid='cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072'), owner=OwnerMeta(userid='f560266d133f4f1ba4a908e3cdcfa59d', username='tempest-TestExecuteZoneMigrationStrategy-613400940-project-admin', projectid='3359c464e0344756a39ce5c7088b9eba', projectname='tempest-TestExecuteZoneMigrationStrategy-613400940'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "address": "fa:16:3e:6b:fa:19", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04db1ec4-f9", "ovs_interfaceid": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759258052.0934963) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.099 2 DEBUG nova.virt.libvirt.host [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.100 2 DEBUG nova.virt.libvirt.host [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.104 2 DEBUG nova.virt.libvirt.host [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.104 2 DEBUG nova.virt.libvirt.host [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.105 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.105 2 DEBUG nova.virt.hardware [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.106 2 DEBUG nova.virt.hardware [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.107 2 DEBUG nova.virt.hardware [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.107 2 DEBUG nova.virt.hardware [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.108 2 DEBUG nova.virt.hardware [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.108 2 DEBUG nova.virt.hardware [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.109 2 DEBUG nova.virt.hardware [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.109 2 DEBUG nova.virt.hardware [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.110 2 DEBUG nova.virt.hardware [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.110 2 DEBUG nova.virt.hardware [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.111 2 DEBUG nova.virt.hardware [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.116 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:47:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:47:32 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3701305756' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.627 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:47:32 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3701305756' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.658 2 DEBUG nova.storage.rbd_utils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:47:32 compute-1 nova_compute[238822]: 2025-09-30 18:47:32.663 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:47:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:47:33 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2048103537' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.096 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.098 2 DEBUG nova.virt.libvirt.vif [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:47:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1103328043',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1103328043',id=37,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3359c464e0344756a39ce5c7088b9eba',ramdisk_id='',reservation_id='r-ca3ok9f2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-613400940',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-613400940-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:47:26Z,user_data=None,user_id='f560266d133f4f1ba4a908e3cdcfa59d',uuid=cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "address": "fa:16:3e:6b:fa:19", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04db1ec4-f9", "ovs_interfaceid": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.099 2 DEBUG nova.network.os_vif_util [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converting VIF {"id": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "address": "fa:16:3e:6b:fa:19", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04db1ec4-f9", "ovs_interfaceid": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.099 2 DEBUG nova.network.os_vif_util [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:fa:19,bridge_name='br-int',has_traffic_filtering=True,id=04db1ec4-f9a9-4209-8ff9-65cb658acd60,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04db1ec4-f9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.101 2 DEBUG nova.objects.instance [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lazy-loading 'pci_devices' on Instance uuid cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:47:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:33.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.610 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:47:33 compute-1 nova_compute[238822]:   <uuid>cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072</uuid>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   <name>instance-00000025</name>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1103328043</nova:name>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:47:32</nova:creationTime>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:47:33 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:47:33 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:user uuid="f560266d133f4f1ba4a908e3cdcfa59d">tempest-TestExecuteZoneMigrationStrategy-613400940-project-admin</nova:user>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:project uuid="3359c464e0344756a39ce5c7088b9eba">tempest-TestExecuteZoneMigrationStrategy-613400940</nova:project>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <nova:port uuid="04db1ec4-f9a9-4209-8ff9-65cb658acd60">
Sep 30 18:47:33 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <system>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <entry name="serial">cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072</entry>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <entry name="uuid">cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072</entry>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     </system>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   <os>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   </os>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   <features>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   </features>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk">
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       </source>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk.config">
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       </source>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:47:33 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:6b:fa:19"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <target dev="tap04db1ec4-f9"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072/console.log" append="off"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <video>
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     </video>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:47:33 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:47:33 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:47:33 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:47:33 compute-1 nova_compute[238822]: </domain>
Sep 30 18:47:33 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.611 2 DEBUG nova.compute.manager [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Preparing to wait for external event network-vif-plugged-04db1ec4-f9a9-4209-8ff9-65cb658acd60 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.611 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.612 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.612 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.613 2 DEBUG nova.virt.libvirt.vif [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:47:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1103328043',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1103328043',id=37,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3359c464e0344756a39ce5c7088b9eba',ramdisk_id='',reservation_id='r-ca3ok9f2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-613400940',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-613400940-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:47:26Z,user_data=None,user_id='f560266d133f4f1ba4a908e3cdcfa59d',uuid=cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "address": "fa:16:3e:6b:fa:19", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04db1ec4-f9", "ovs_interfaceid": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.613 2 DEBUG nova.network.os_vif_util [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converting VIF {"id": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "address": "fa:16:3e:6b:fa:19", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04db1ec4-f9", "ovs_interfaceid": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.614 2 DEBUG nova.network.os_vif_util [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:fa:19,bridge_name='br-int',has_traffic_filtering=True,id=04db1ec4-f9a9-4209-8ff9-65cb658acd60,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04db1ec4-f9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.614 2 DEBUG os_vif [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:fa:19,bridge_name='br-int',has_traffic_filtering=True,id=04db1ec4-f9a9-4209-8ff9-65cb658acd60,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04db1ec4-f9') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.616 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.616 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.617 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a94149ad-078b-5b33-a607-8798dd8f4b7e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.627 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04db1ec4-f9, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.628 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap04db1ec4-f9, col_values=(('qos', UUID('eceaf104-8f66-41bb-9f41-821d6dc0e91b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.628 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap04db1ec4-f9, col_values=(('external_ids', {'iface-id': '04db1ec4-f9a9-4209-8ff9-65cb658acd60', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:fa:19', 'vm-uuid': 'cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:47:33 compute-1 NetworkManager[45549]: <info>  [1759258053.6336] manager: (tap04db1ec4-f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:33 compute-1 nova_compute[238822]: 2025-09-30 18:47:33.645 2 INFO os_vif [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:fa:19,bridge_name='br-int',has_traffic_filtering=True,id=04db1ec4-f9a9-4209-8ff9-65cb658acd60,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04db1ec4-f9')
Sep 30 18:47:33 compute-1 ceph-mon[75484]: pgmap v2076: 353 pgs: 353 active+clean; 167 MiB data, 456 MiB used, 40 GiB / 40 GiB avail; 375 KiB/s rd, 4.3 MiB/s wr, 98 op/s
Sep 30 18:47:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2048103537' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:47:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:33.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:47:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:35.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:35 compute-1 nova_compute[238822]: 2025-09-30 18:47:35.213 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:47:35 compute-1 nova_compute[238822]: 2025-09-30 18:47:35.214 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:47:35 compute-1 nova_compute[238822]: 2025-09-30 18:47:35.214 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] No VIF found with MAC fa:16:3e:6b:fa:19, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:47:35 compute-1 nova_compute[238822]: 2025-09-30 18:47:35.214 2 INFO nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Using config drive
Sep 30 18:47:35 compute-1 nova_compute[238822]: 2025-09-30 18:47:35.252 2 DEBUG nova.storage.rbd_utils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:47:35 compute-1 nova_compute[238822]: 2025-09-30 18:47:35.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:35 compute-1 sudo[302380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:47:35 compute-1 sudo[302380]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:47:35 compute-1 sudo[302380]: pam_unix(sudo:session): session closed for user root
Sep 30 18:47:35 compute-1 podman[249638]: time="2025-09-30T18:47:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:47:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:47:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:47:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:47:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8368 "" "Go-http-client/1.1"
Sep 30 18:47:35 compute-1 ceph-mon[75484]: pgmap v2077: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 375 KiB/s rd, 4.3 MiB/s wr, 99 op/s
Sep 30 18:47:35 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:47:35 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:47:35 compute-1 nova_compute[238822]: 2025-09-30 18:47:35.776 2 WARNING neutronclient.v2_0.client [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:47:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:35.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:36 compute-1 nova_compute[238822]: 2025-09-30 18:47:36.136 2 INFO nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Creating config drive at /var/lib/nova/instances/cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072/disk.config
Sep 30 18:47:36 compute-1 nova_compute[238822]: 2025-09-30 18:47:36.149 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpm_kuf31a execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:47:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:36 compute-1 nova_compute[238822]: 2025-09-30 18:47:36.300 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpm_kuf31a" returned: 0 in 0.151s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:47:36 compute-1 nova_compute[238822]: 2025-09-30 18:47:36.349 2 DEBUG nova.storage.rbd_utils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:47:36 compute-1 nova_compute[238822]: 2025-09-30 18:47:36.355 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072/disk.config cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:47:36 compute-1 nova_compute[238822]: 2025-09-30 18:47:36.561 2 DEBUG oslo_concurrency.processutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072/disk.config cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.206s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:47:36 compute-1 nova_compute[238822]: 2025-09-30 18:47:36.562 2 INFO nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Deleting local config drive /var/lib/nova/instances/cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072/disk.config because it was imported into RBD.
Sep 30 18:47:36 compute-1 systemd[1]: Starting libvirt secret daemon...
Sep 30 18:47:36 compute-1 systemd[1]: Started libvirt secret daemon.
Sep 30 18:47:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3922656506' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:47:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3922656506' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:47:36 compute-1 kernel: tap04db1ec4-f9: entered promiscuous mode
Sep 30 18:47:36 compute-1 NetworkManager[45549]: <info>  [1759258056.7505] manager: (tap04db1ec4-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Sep 30 18:47:36 compute-1 ovn_controller[135204]: 2025-09-30T18:47:36Z|00280|binding|INFO|Claiming lport 04db1ec4-f9a9-4209-8ff9-65cb658acd60 for this chassis.
Sep 30 18:47:36 compute-1 ovn_controller[135204]: 2025-09-30T18:47:36Z|00281|binding|INFO|04db1ec4-f9a9-4209-8ff9-65cb658acd60: Claiming fa:16:3e:6b:fa:19 10.100.0.8
Sep 30 18:47:36 compute-1 nova_compute[238822]: 2025-09-30 18:47:36.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:36 compute-1 nova_compute[238822]: 2025-09-30 18:47:36.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:36.812 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:fa:19 10.100.0.8'], port_security=['fa:16:3e:6b:fa:19 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4658d55-a8f9-48f1-846d-61df3d830821', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3359c464e0344756a39ce5c7088b9eba', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3a57c776-d79c-4096-859e-411dcf78cfa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0884332-fe68-47c8-9c8c-5c6a7c53f7f5, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=04db1ec4-f9a9-4209-8ff9-65cb658acd60) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:47:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:36.813 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 04db1ec4-f9a9-4209-8ff9-65cb658acd60 in datapath f4658d55-a8f9-48f1-846d-61df3d830821 bound to our chassis
Sep 30 18:47:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:36.818 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4658d55-a8f9-48f1-846d-61df3d830821
Sep 30 18:47:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:36.839 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f705b7a5-cf8d-4a1c-925c-ad8ec4c4a8f2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:36.841 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf4658d55-a1 in ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:47:36 compute-1 systemd-udevd[302477]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:47:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:36.844 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf4658d55-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:47:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:36.845 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c1bb1e6b-5555-4b27-8385-7b1580a81cdc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:36.846 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[db5ec49a-d94a-434a-92fb-4d747e437f07]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:36 compute-1 systemd-machined[195911]: New machine qemu-26-instance-00000025.
Sep 30 18:47:36 compute-1 NetworkManager[45549]: <info>  [1759258056.8647] device (tap04db1ec4-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:47:36 compute-1 NetworkManager[45549]: <info>  [1759258056.8665] device (tap04db1ec4-f9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:47:36 compute-1 systemd[1]: Started Virtual Machine qemu-26-instance-00000025.
Sep 30 18:47:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:36.882 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[e16e760c-b02d-4ba2-88a8-b49434425402]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:36 compute-1 ovn_controller[135204]: 2025-09-30T18:47:36Z|00282|binding|INFO|Setting lport 04db1ec4-f9a9-4209-8ff9-65cb658acd60 ovn-installed in OVS
Sep 30 18:47:36 compute-1 ovn_controller[135204]: 2025-09-30T18:47:36Z|00283|binding|INFO|Setting lport 04db1ec4-f9a9-4209-8ff9-65cb658acd60 up in Southbound
Sep 30 18:47:36 compute-1 nova_compute[238822]: 2025-09-30 18:47:36.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:36.906 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[337ca053-3e13-40df-a250-da91e5233bcb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:36.958 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[53220357-2eb4-4c32-a157-19a9d2edbbc2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:36.967 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4fe760d2-8397-4682-b229-70236020894f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:36 compute-1 NetworkManager[45549]: <info>  [1759258056.9739] manager: (tapf4658d55-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/105)
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.028 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[94be4637-8998-4b65-8839-d882c5df5eb0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.034 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[060db776-71a2-4f0a-9d7c-4e18b5850a27]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:37 compute-1 NetworkManager[45549]: <info>  [1759258057.0798] device (tapf4658d55-a0): carrier: link connected
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.091 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce38552-f078-4383-b593-ce606d7d2d8c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.124 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4efc44e8-bb01-4a90-8789-c5419612bd3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4658d55-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:a8:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1589866, 'reachable_time': 15012, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302510, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.155 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b2936323-7a0f-444c-959d-53cc8445789b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:a899'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1589866, 'tstamp': 1589866}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 302511, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:37.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:37 compute-1 nova_compute[238822]: 2025-09-30 18:47:37.188 2 DEBUG nova.compute.manager [req-97a0f5d2-6631-4d55-88fd-747d78711b84 req-44c66633-9eb6-436b-8d0e-f03987a5bc93 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Received event network-vif-plugged-04db1ec4-f9a9-4209-8ff9-65cb658acd60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:47:37 compute-1 nova_compute[238822]: 2025-09-30 18:47:37.189 2 DEBUG oslo_concurrency.lockutils [req-97a0f5d2-6631-4d55-88fd-747d78711b84 req-44c66633-9eb6-436b-8d0e-f03987a5bc93 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:47:37 compute-1 nova_compute[238822]: 2025-09-30 18:47:37.190 2 DEBUG oslo_concurrency.lockutils [req-97a0f5d2-6631-4d55-88fd-747d78711b84 req-44c66633-9eb6-436b-8d0e-f03987a5bc93 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:47:37 compute-1 nova_compute[238822]: 2025-09-30 18:47:37.190 2 DEBUG oslo_concurrency.lockutils [req-97a0f5d2-6631-4d55-88fd-747d78711b84 req-44c66633-9eb6-436b-8d0e-f03987a5bc93 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:47:37 compute-1 nova_compute[238822]: 2025-09-30 18:47:37.191 2 DEBUG nova.compute.manager [req-97a0f5d2-6631-4d55-88fd-747d78711b84 req-44c66633-9eb6-436b-8d0e-f03987a5bc93 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Processing event network-vif-plugged-04db1ec4-f9a9-4209-8ff9-65cb658acd60 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.190 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4f53199c-7de4-4a2d-a5b1-7d55244d4aae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4658d55-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:a8:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1589866, 'reachable_time': 15012, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 302512, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.250 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6b884593-1163-4ee4-9013-5af4a32461ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:37 compute-1 nova_compute[238822]: 2025-09-30 18:47:37.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.271 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.361 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6bee0f6b-9874-48e3-ad34-80b07c825d8f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.369 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4658d55-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.369 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.370 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4658d55-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:47:37 compute-1 kernel: tapf4658d55-a0: entered promiscuous mode
Sep 30 18:47:37 compute-1 nova_compute[238822]: 2025-09-30 18:47:37.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:37 compute-1 NetworkManager[45549]: <info>  [1759258057.3738] manager: (tapf4658d55-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.377 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4658d55-a0, col_values=(('external_ids', {'iface-id': '862fbe9e-132a-4b8a-83f6-7b020c6192ad'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:47:37 compute-1 nova_compute[238822]: 2025-09-30 18:47:37.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:37 compute-1 ovn_controller[135204]: 2025-09-30T18:47:37Z|00284|binding|INFO|Releasing lport 862fbe9e-132a-4b8a-83f6-7b020c6192ad from this chassis (sb_readonly=0)
Sep 30 18:47:37 compute-1 nova_compute[238822]: 2025-09-30 18:47:37.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.382 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5431a7e1-1f97-431a-8ab9-7a93b7ee91f9]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.383 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.383 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.383 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f4658d55-a8f9-48f1-846d-61df3d830821 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.384 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.386 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[358cadbd-64d6-4e6f-a45f-225215e3df01]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.387 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.387 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c81bdf4a-f3bd-4755-b8e7-bbd2b48b5581]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.388 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-f4658d55-a8f9-48f1-846d-61df3d830821
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID f4658d55-a8f9-48f1-846d-61df3d830821
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:47:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:37.389 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'env', 'PROCESS_TAG=haproxy-f4658d55-a8f9-48f1-846d-61df3d830821', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f4658d55-a8f9-48f1-846d-61df3d830821.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:47:37 compute-1 nova_compute[238822]: 2025-09-30 18:47:37.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:37 compute-1 ceph-mon[75484]: pgmap v2078: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 375 KiB/s rd, 4.3 MiB/s wr, 99 op/s
Sep 30 18:47:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:47:37 compute-1 podman[302586]: 2025-09-30 18:47:37.876257818 +0000 UTC m=+0.058649485 container create db8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Sep 30 18:47:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:37.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:37 compute-1 systemd[1]: Started libpod-conmon-db8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781.scope.
Sep 30 18:47:37 compute-1 podman[302586]: 2025-09-30 18:47:37.845327268 +0000 UTC m=+0.027718965 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:47:37 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:47:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92514c849bc5c8e629ab50dcc4d98cf3ff384cabdbe94f09497b48bc5da688ae/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:47:38 compute-1 podman[302586]: 2025-09-30 18:47:38.004974933 +0000 UTC m=+0.187366680 container init db8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 18:47:38 compute-1 nova_compute[238822]: 2025-09-30 18:47:38.005 2 DEBUG nova.compute.manager [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:47:38 compute-1 nova_compute[238822]: 2025-09-30 18:47:38.013 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:47:38 compute-1 podman[302586]: 2025-09-30 18:47:38.018796634 +0000 UTC m=+0.201188341 container start db8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 18:47:38 compute-1 nova_compute[238822]: 2025-09-30 18:47:38.018 2 INFO nova.virt.libvirt.driver [-] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Instance spawned successfully.
Sep 30 18:47:38 compute-1 nova_compute[238822]: 2025-09-30 18:47:38.019 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:47:38 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[302602]: [NOTICE]   (302606) : New worker (302608) forked
Sep 30 18:47:38 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[302602]: [NOTICE]   (302606) : Loading success.
Sep 30 18:47:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:38.106 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:47:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:38.107 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:47:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:38 compute-1 nova_compute[238822]: 2025-09-30 18:47:38.540 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:47:38 compute-1 nova_compute[238822]: 2025-09-30 18:47:38.541 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:47:38 compute-1 nova_compute[238822]: 2025-09-30 18:47:38.541 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:47:38 compute-1 nova_compute[238822]: 2025-09-30 18:47:38.542 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:47:38 compute-1 nova_compute[238822]: 2025-09-30 18:47:38.542 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:47:38 compute-1 nova_compute[238822]: 2025-09-30 18:47:38.543 2 DEBUG nova.virt.libvirt.driver [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:47:38 compute-1 nova_compute[238822]: 2025-09-30 18:47:38.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:38 compute-1 ceph-mon[75484]: pgmap v2079: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 383 KiB/s rd, 4.3 MiB/s wr, 110 op/s
Sep 30 18:47:39 compute-1 nova_compute[238822]: 2025-09-30 18:47:39.054 2 INFO nova.compute.manager [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Took 11.94 seconds to spawn the instance on the hypervisor.
Sep 30 18:47:39 compute-1 nova_compute[238822]: 2025-09-30 18:47:39.055 2 DEBUG nova.compute.manager [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:47:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:39.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:47:39 compute-1 nova_compute[238822]: 2025-09-30 18:47:39.260 2 DEBUG nova.compute.manager [req-db1d4e24-17e4-414a-b31a-34779c234871 req-69fad614-faf7-488d-ace2-81e7392ef431 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Received event network-vif-plugged-04db1ec4-f9a9-4209-8ff9-65cb658acd60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:47:39 compute-1 nova_compute[238822]: 2025-09-30 18:47:39.261 2 DEBUG oslo_concurrency.lockutils [req-db1d4e24-17e4-414a-b31a-34779c234871 req-69fad614-faf7-488d-ace2-81e7392ef431 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:47:39 compute-1 nova_compute[238822]: 2025-09-30 18:47:39.262 2 DEBUG oslo_concurrency.lockutils [req-db1d4e24-17e4-414a-b31a-34779c234871 req-69fad614-faf7-488d-ace2-81e7392ef431 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:47:39 compute-1 nova_compute[238822]: 2025-09-30 18:47:39.262 2 DEBUG oslo_concurrency.lockutils [req-db1d4e24-17e4-414a-b31a-34779c234871 req-69fad614-faf7-488d-ace2-81e7392ef431 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:47:39 compute-1 nova_compute[238822]: 2025-09-30 18:47:39.263 2 DEBUG nova.compute.manager [req-db1d4e24-17e4-414a-b31a-34779c234871 req-69fad614-faf7-488d-ace2-81e7392ef431 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] No waiting events found dispatching network-vif-plugged-04db1ec4-f9a9-4209-8ff9-65cb658acd60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:47:39 compute-1 nova_compute[238822]: 2025-09-30 18:47:39.263 2 WARNING nova.compute.manager [req-db1d4e24-17e4-414a-b31a-34779c234871 req-69fad614-faf7-488d-ace2-81e7392ef431 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Received unexpected event network-vif-plugged-04db1ec4-f9a9-4209-8ff9-65cb658acd60 for instance with vm_state active and task_state None.
Sep 30 18:47:39 compute-1 nova_compute[238822]: 2025-09-30 18:47:39.607 2 INFO nova.compute.manager [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Took 17.73 seconds to build instance.
Sep 30 18:47:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:39.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:40 compute-1 nova_compute[238822]: 2025-09-30 18:47:40.115 2 DEBUG oslo_concurrency.lockutils [None req-5e63fd6f-26f3-4dba-8c3b-e482697c69b9 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.258s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:47:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:40 compute-1 nova_compute[238822]: 2025-09-30 18:47:40.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:41.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:41 compute-1 ceph-mon[75484]: pgmap v2080: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 8.6 KiB/s rd, 29 KiB/s wr, 11 op/s
Sep 30 18:47:41 compute-1 sudo[302622]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:47:41 compute-1 sudo[302622]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:47:41 compute-1 sudo[302622]: pam_unix(sudo:session): session closed for user root
Sep 30 18:47:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:41.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:43.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:43 compute-1 nova_compute[238822]: 2025-09-30 18:47:43.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:43 compute-1 ceph-mon[75484]: pgmap v2081: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 7.7 KiB/s rd, 26 KiB/s wr, 10 op/s
Sep 30 18:47:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:43.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:47:44 compute-1 podman[302651]: 2025-09-30 18:47:44.553063673 +0000 UTC m=+0.086748079 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:47:44 compute-1 podman[302650]: 2025-09-30 18:47:44.616056324 +0000 UTC m=+0.154002185 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:47:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:45.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:45 compute-1 nova_compute[238822]: 2025-09-30 18:47:45.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:45 compute-1 ceph-mon[75484]: pgmap v2082: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 26 KiB/s wr, 74 op/s
Sep 30 18:47:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:47:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:45.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:47:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:47.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:47 compute-1 ceph-mon[75484]: pgmap v2083: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:47:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:47.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:48 compute-1 podman[302702]: 2025-09-30 18:47:48.580956872 +0000 UTC m=+0.115538742 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Sep 30 18:47:48 compute-1 nova_compute[238822]: 2025-09-30 18:47:48.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:49.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:47:49 compute-1 openstack_network_exporter[251957]: ERROR   18:47:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:47:49 compute-1 openstack_network_exporter[251957]: ERROR   18:47:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:47:49 compute-1 openstack_network_exporter[251957]: ERROR   18:47:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:47:49 compute-1 openstack_network_exporter[251957]: ERROR   18:47:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:47:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:47:49 compute-1 openstack_network_exporter[251957]: ERROR   18:47:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:47:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:47:49 compute-1 ovn_controller[135204]: 2025-09-30T18:47:49Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6b:fa:19 10.100.0.8
Sep 30 18:47:49 compute-1 ovn_controller[135204]: 2025-09-30T18:47:49Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6b:fa:19 10.100.0.8
Sep 30 18:47:49 compute-1 ceph-mon[75484]: pgmap v2084: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 75 op/s
Sep 30 18:47:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:47:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:49.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:47:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:50 compute-1 nova_compute[238822]: 2025-09-30 18:47:50.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:50 compute-1 ceph-mon[75484]: pgmap v2085: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1023 B/s wr, 65 op/s
Sep 30 18:47:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:51.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:51.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:47:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:53 compute-1 nova_compute[238822]: 2025-09-30 18:47:53.203 2 DEBUG nova.virt.libvirt.driver [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Creating tmpfile /var/lib/nova/instances/tmpfvxa598s to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:47:53 compute-1 nova_compute[238822]: 2025-09-30 18:47:53.204 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:47:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:53.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:53 compute-1 nova_compute[238822]: 2025-09-30 18:47:53.219 2 DEBUG nova.compute.manager [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfvxa598s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:47:53 compute-1 ceph-mon[75484]: pgmap v2086: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1023 B/s wr, 65 op/s
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #115. Immutable memtables: 0.
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.435059) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 71] Flushing memtable with next log file: 115
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258073435198, "job": 71, "event": "flush_started", "num_memtables": 1, "num_entries": 2010, "num_deletes": 251, "total_data_size": 4940451, "memory_usage": 5010144, "flush_reason": "Manual Compaction"}
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 71] Level-0 flush table #116: started
Sep 30 18:47:53 compute-1 sshd-session[302724]: Invalid user debian from 103.153.190.105 port 46376
Sep 30 18:47:53 compute-1 sshd-session[302724]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:47:53 compute-1 sshd-session[302724]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258073460993, "cf_name": "default", "job": 71, "event": "table_file_creation", "file_number": 116, "file_size": 3197779, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 55313, "largest_seqno": 57318, "table_properties": {"data_size": 3189620, "index_size": 4909, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 17298, "raw_average_key_size": 20, "raw_value_size": 3173225, "raw_average_value_size": 3720, "num_data_blocks": 215, "num_entries": 853, "num_filter_entries": 853, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759257896, "oldest_key_time": 1759257896, "file_creation_time": 1759258073, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 116, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 71] Flush lasted 25980 microseconds, and 16399 cpu microseconds.
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.461060) [db/flush_job.cc:967] [default] [JOB 71] Level-0 flush table #116: 3197779 bytes OK
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.461095) [db/memtable_list.cc:519] [default] Level-0 commit table #116 started
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.463182) [db/memtable_list.cc:722] [default] Level-0 commit table #116: memtable #1 done
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.463205) EVENT_LOG_v1 {"time_micros": 1759258073463197, "job": 71, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.463230) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 71] Try to delete WAL files size 4931374, prev total WAL file size 4931374, number of live WAL files 2.
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000112.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.465390) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034373639' seq:72057594037927935, type:22 .. '7061786F730035303231' seq:0, type:0; will stop at (end)
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 72] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 71 Base level 0, inputs: [116(3122KB)], [114(10212KB)]
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258073465429, "job": 72, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [116], "files_L6": [114], "score": -1, "input_data_size": 13655079, "oldest_snapshot_seqno": -1}
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 72] Generated table #117: 7460 keys, 11599963 bytes, temperature: kUnknown
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258073521548, "cf_name": "default", "job": 72, "event": "table_file_creation", "file_number": 117, "file_size": 11599963, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11555290, "index_size": 24913, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18693, "raw_key_size": 197671, "raw_average_key_size": 26, "raw_value_size": 11426757, "raw_average_value_size": 1531, "num_data_blocks": 964, "num_entries": 7460, "num_filter_entries": 7460, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759258073, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 117, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.522045) [db/compaction/compaction_job.cc:1663] [default] [JOB 72] Compacted 1@0 + 1@6 files to L6 => 11599963 bytes
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.523458) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 242.5 rd, 206.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.0, 10.0 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(7.9) write-amplify(3.6) OK, records in: 7976, records dropped: 516 output_compression: NoCompression
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.523490) EVENT_LOG_v1 {"time_micros": 1759258073523473, "job": 72, "event": "compaction_finished", "compaction_time_micros": 56311, "compaction_time_cpu_micros": 31551, "output_level": 6, "num_output_files": 1, "total_output_size": 11599963, "num_input_records": 7976, "num_output_records": 7460, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000116.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258073524714, "job": 72, "event": "table_file_deletion", "file_number": 116}
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000114.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258073528562, "job": 72, "event": "table_file_deletion", "file_number": 114}
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.465266) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.528680) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.528691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.528694) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.528698) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:47:53 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:47:53.528701) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:47:53 compute-1 nova_compute[238822]: 2025-09-30 18:47:53.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:53.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:47:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:54.430 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:47:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:54.430 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:47:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:47:54.430 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:47:54 compute-1 unix_chkpwd[302733]: password check failed for user (root)
Sep 30 18:47:54 compute-1 sshd-session[302727]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:47:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:55.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:55 compute-1 nova_compute[238822]: 2025-09-30 18:47:55.268 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:47:55 compute-1 nova_compute[238822]: 2025-09-30 18:47:55.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:55 compute-1 podman[302735]: 2025-09-30 18:47:55.563747559 +0000 UTC m=+0.094787325 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=iscsid, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2)
Sep 30 18:47:55 compute-1 podman[302736]: 2025-09-30 18:47:55.580061297 +0000 UTC m=+0.107471375 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, 
build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Sep 30 18:47:55 compute-1 podman[302737]: 2025-09-30 18:47:55.603160017 +0000 UTC m=+0.117237608 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4)
Sep 30 18:47:55 compute-1 ceph-mon[75484]: pgmap v2087: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 125 op/s
Sep 30 18:47:55 compute-1 sshd-session[302724]: Failed password for invalid user debian from 103.153.190.105 port 46376 ssh2
Sep 30 18:47:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:55.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:56 compute-1 nova_compute[238822]: 2025-09-30 18:47:56.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:47:56 compute-1 nova_compute[238822]: 2025-09-30 18:47:56.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:47:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:56 compute-1 sshd-session[302727]: Failed password for root from 192.210.160.141 port 39580 ssh2
Sep 30 18:47:57 compute-1 nova_compute[238822]: 2025-09-30 18:47:57.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:47:57 compute-1 nova_compute[238822]: 2025-09-30 18:47:57.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:47:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:57.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:57 compute-1 nova_compute[238822]: 2025-09-30 18:47:57.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:47:57 compute-1 nova_compute[238822]: 2025-09-30 18:47:57.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:47:57 compute-1 nova_compute[238822]: 2025-09-30 18:47:57.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:47:57 compute-1 nova_compute[238822]: 2025-09-30 18:47:57.573 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:47:57 compute-1 nova_compute[238822]: 2025-09-30 18:47:57.574 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:47:57 compute-1 ceph-mon[75484]: pgmap v2088: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 289 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Sep 30 18:47:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3877174777' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:47:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3877174777' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:47:57 compute-1 sshd-session[302727]: Connection closed by authenticating user root 192.210.160.141 port 39580 [preauth]
Sep 30 18:47:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:57.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:47:58 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2680587084' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:47:58 compute-1 nova_compute[238822]: 2025-09-30 18:47:58.037 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:47:58 compute-1 sshd-session[302724]: Received disconnect from 103.153.190.105 port 46376:11: Bye Bye [preauth]
Sep 30 18:47:58 compute-1 sshd-session[302724]: Disconnected from invalid user debian 103.153.190.105 port 46376 [preauth]
Sep 30 18:47:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:58 compute-1 nova_compute[238822]: 2025-09-30 18:47:58.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:47:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2680587084' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:47:59 compute-1 sshd-session[302701]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:47:59 compute-1 sshd-session[302701]: banner exchange: Connection from 110.42.70.108 port 59592: Connection timed out
Sep 30 18:47:59 compute-1 nova_compute[238822]: 2025-09-30 18:47:59.088 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000025 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:47:59 compute-1 nova_compute[238822]: 2025-09-30 18:47:59.089 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000025 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:47:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:47:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:47:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:47:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:47:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:47:59.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:47:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:47:59 compute-1 nova_compute[238822]: 2025-09-30 18:47:59.313 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:47:59 compute-1 nova_compute[238822]: 2025-09-30 18:47:59.315 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:47:59 compute-1 nova_compute[238822]: 2025-09-30 18:47:59.345 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:47:59 compute-1 nova_compute[238822]: 2025-09-30 18:47:59.346 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4447MB free_disk=39.90130615234375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:47:59 compute-1 nova_compute[238822]: 2025-09-30 18:47:59.346 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:47:59 compute-1 nova_compute[238822]: 2025-09-30 18:47:59.346 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:47:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:47:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 4200.0 total, 600.0 interval
                                           Cumulative writes: 11K writes, 57K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.12 GB, 0.03 MB/s
                                           Cumulative WAL: 11K writes, 11K syncs, 1.00 writes per sync, written: 0.12 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1506 writes, 7314 keys, 1506 commit groups, 1.0 writes per commit group, ingest: 16.53 MB, 0.03 MB/s
                                           Interval WAL: 1506 writes, 1506 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0    129.0      0.59              0.31        36    0.016       0      0       0.0       0.0
                                             L6      1/0   11.06 MB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   5.2    173.8    149.4      2.66              1.42        35    0.076    220K    18K       0.0       0.0
                                            Sum      1/0   11.06 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   6.2    142.3    145.7      3.25              1.73        71    0.046    220K    18K       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.5    138.2    141.3      0.50              0.27        10    0.050     39K   2570       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   0.0    173.8    149.4      2.66              1.42        35    0.076    220K    18K       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0    129.5      0.59              0.31        35    0.017       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.074, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.46 GB write, 0.11 MB/s write, 0.45 GB read, 0.11 MB/s read, 3.2 seconds
                                           Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.5 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55f2aa20b350#2 capacity: 304.00 MB usage: 45.31 MB table_size: 0 occupancy: 18446744073709551615 collections: 8 last_copies: 0 last_secs: 0.000332 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(2517,43.77 MB,14.3992%) FilterBlock(71,618.05 KB,0.19854%) IndexBlock(71,956.88 KB,0.307384%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Sep 30 18:47:59 compute-1 ceph-mon[75484]: pgmap v2089: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 289 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Sep 30 18:47:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2375457157' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:47:59 compute-1 nova_compute[238822]: 2025-09-30 18:47:59.842 2 DEBUG nova.compute.manager [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfvxa598s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='656a0137-3214-4992-a68a-cdbedf0336f6',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:47:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:47:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:47:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:47:59.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:47:59 compute-1 sshd-session[302820]: Invalid user steam from 161.132.50.17 port 42128
Sep 30 18:47:59 compute-1 sshd-session[302820]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:47:59 compute-1 sshd-session[302820]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 18:48:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:00 compute-1 nova_compute[238822]: 2025-09-30 18:48:00.369 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Migration for instance 656a0137-3214-4992-a68a-cdbedf0336f6 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 18:48:00 compute-1 nova_compute[238822]: 2025-09-30 18:48:00.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:00 compute-1 nova_compute[238822]: 2025-09-30 18:48:00.860 2 DEBUG oslo_concurrency.lockutils [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-656a0137-3214-4992-a68a-cdbedf0336f6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:48:00 compute-1 nova_compute[238822]: 2025-09-30 18:48:00.860 2 DEBUG oslo_concurrency.lockutils [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-656a0137-3214-4992-a68a-cdbedf0336f6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:48:00 compute-1 nova_compute[238822]: 2025-09-30 18:48:00.860 2 DEBUG nova.network.neutron [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:48:00 compute-1 nova_compute[238822]: 2025-09-30 18:48:00.880 2 INFO nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Updating resource usage from migration 0f90f65f-b699-4d58-94bd-ef4a6ad3b688
Sep 30 18:48:00 compute-1 nova_compute[238822]: 2025-09-30 18:48:00.880 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Starting to track incoming migration 0f90f65f-b699-4d58-94bd-ef4a6ad3b688 with flavor c83dc7f1-0795-47db-adcb-fb90be11684a _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 18:48:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:01.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:01 compute-1 nova_compute[238822]: 2025-09-30 18:48:01.368 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:01 compute-1 nova_compute[238822]: 2025-09-30 18:48:01.432 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:48:01 compute-1 ceph-mon[75484]: pgmap v2090: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 286 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Sep 30 18:48:01 compute-1 sudo[302824]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:48:01 compute-1 sudo[302824]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:48:01 compute-1 sudo[302824]: pam_unix(sudo:session): session closed for user root
Sep 30 18:48:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:48:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:01.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:48:01 compute-1 nova_compute[238822]: 2025-09-30 18:48:01.941 2 WARNING nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 656a0137-3214-4992-a68a-cdbedf0336f6 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 18:48:01 compute-1 nova_compute[238822]: 2025-09-30 18:48:01.942 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:48:01 compute-1 nova_compute[238822]: 2025-09-30 18:48:01.943 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=39GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:47:59 up  4:25,  0 user,  load average: 0.40, 0.41, 0.50\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_3359c464e0344756a39ce5c7088b9eba': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:48:01 compute-1 nova_compute[238822]: 2025-09-30 18:48:01.998 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:48:02 compute-1 nova_compute[238822]: 2025-09-30 18:48:02.014 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:02 compute-1 sshd-session[302820]: Failed password for invalid user steam from 161.132.50.17 port 42128 ssh2
Sep 30 18:48:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:48:02 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2449857165' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:48:02 compute-1 nova_compute[238822]: 2025-09-30 18:48:02.568 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.571s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:48:02 compute-1 nova_compute[238822]: 2025-09-30 18:48:02.577 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:48:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3397991696' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:48:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2449857165' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.086 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.121 2 DEBUG nova.network.neutron [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Updating instance_info_cache with network_info: [{"id": "d728eab4-88db-4811-b199-c75155b08c82", "address": "fa:16:3e:b3:57:7e", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd728eab4-88", "ovs_interfaceid": "d728eab4-88db-4811-b199-c75155b08c82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:48:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:48:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:03.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.601 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.602 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.255s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.628 2 DEBUG oslo_concurrency.lockutils [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-656a0137-3214-4992-a68a-cdbedf0336f6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.646 2 DEBUG nova.virt.libvirt.driver [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfvxa598s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='656a0137-3214-4992-a68a-cdbedf0336f6',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.647 2 DEBUG nova.virt.libvirt.driver [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Creating instance directory: /var/lib/nova/instances/656a0137-3214-4992-a68a-cdbedf0336f6 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.648 2 DEBUG nova.virt.libvirt.driver [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Ensure instance console log exists: /var/lib/nova/instances/656a0137-3214-4992-a68a-cdbedf0336f6/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.648 2 DEBUG nova.virt.libvirt.driver [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.649 2 DEBUG nova.virt.libvirt.vif [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:46:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-752609519',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-752609519',id=36,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:47:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3359c464e0344756a39ce5c7088b9eba',ramdisk_id='',reservation_id='r-9hll55u7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-613400940',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-613400940-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:47:16Z,user_data=None,user_id='f560266d133f4f1ba4a908e3cdcfa59d',uuid=656a0137-3214-4992-a68a-cdbedf0336f6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d728eab4-88db-4811-b199-c75155b08c82", "address": "fa:16:3e:b3:57:7e", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd728eab4-88", "ovs_interfaceid": "d728eab4-88db-4811-b199-c75155b08c82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.650 2 DEBUG nova.network.os_vif_util [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "d728eab4-88db-4811-b199-c75155b08c82", "address": "fa:16:3e:b3:57:7e", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd728eab4-88", "ovs_interfaceid": "d728eab4-88db-4811-b199-c75155b08c82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.651 2 DEBUG nova.network.os_vif_util [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:57:7e,bridge_name='br-int',has_traffic_filtering=True,id=d728eab4-88db-4811-b199-c75155b08c82,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd728eab4-88') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.652 2 DEBUG os_vif [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:57:7e,bridge_name='br-int',has_traffic_filtering=True,id=d728eab4-88db-4811-b199-c75155b08c82,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd728eab4-88') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.653 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.654 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.655 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5d2e0724-05f8-585c-95a8-128e7bd904b0', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.706 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd728eab4-88, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.707 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd728eab4-88, col_values=(('qos', UUID('b4818e22-d37c-4de7-b17d-af613c865afe')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.707 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd728eab4-88, col_values=(('external_ids', {'iface-id': 'd728eab4-88db-4811-b199-c75155b08c82', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b3:57:7e', 'vm-uuid': '656a0137-3214-4992-a68a-cdbedf0336f6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:03 compute-1 NetworkManager[45549]: <info>  [1759258083.7124] manager: (tapd728eab4-88): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.722 2 INFO os_vif [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:57:7e,bridge_name='br-int',has_traffic_filtering=True,id=d728eab4-88db-4811-b199-c75155b08c82,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd728eab4-88')
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.723 2 DEBUG nova.virt.libvirt.driver [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.723 2 DEBUG nova.compute.manager [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfvxa598s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='656a0137-3214-4992-a68a-cdbedf0336f6',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.724 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:03 compute-1 ceph-mon[75484]: pgmap v2091: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 286 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Sep 30 18:48:03 compute-1 nova_compute[238822]: 2025-09-30 18:48:03.841 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:03.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:48:04 compute-1 sshd-session[302820]: Received disconnect from 161.132.50.17 port 42128:11: Bye Bye [preauth]
Sep 30 18:48:04 compute-1 sshd-session[302820]: Disconnected from invalid user steam 161.132.50.17 port 42128 [preauth]
Sep 30 18:48:04 compute-1 sshd-session[302875]: Invalid user foundry from 8.243.64.201 port 53212
Sep 30 18:48:04 compute-1 sshd-session[302875]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:48:04 compute-1 sshd-session[302875]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:48:04 compute-1 nova_compute[238822]: 2025-09-30 18:48:04.565 2 DEBUG nova.network.neutron [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Port d728eab4-88db-4811-b199-c75155b08c82 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:48:04 compute-1 nova_compute[238822]: 2025-09-30 18:48:04.582 2 DEBUG nova.compute.manager [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfvxa598s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='656a0137-3214-4992-a68a-cdbedf0336f6',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:48:04 compute-1 ceph-mon[75484]: pgmap v2092: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 286 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Sep 30 18:48:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:48:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:05.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:48:05 compute-1 nova_compute[238822]: 2025-09-30 18:48:05.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:05 compute-1 nova_compute[238822]: 2025-09-30 18:48:05.597 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:48:05 compute-1 podman[249638]: time="2025-09-30T18:48:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:48:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:48:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:48:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:48:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8832 "" "Go-http-client/1.1"
Sep 30 18:48:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:05.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:06 compute-1 sshd-session[302875]: Failed password for invalid user foundry from 8.243.64.201 port 53212 ssh2
Sep 30 18:48:06 compute-1 nova_compute[238822]: 2025-09-30 18:48:06.110 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:48:06 compute-1 nova_compute[238822]: 2025-09-30 18:48:06.111 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:48:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:06 compute-1 sshd-session[302875]: Received disconnect from 8.243.64.201 port 53212:11: Bye Bye [preauth]
Sep 30 18:48:06 compute-1 sshd-session[302875]: Disconnected from invalid user foundry 8.243.64.201 port 53212 [preauth]
Sep 30 18:48:06 compute-1 nova_compute[238822]: 2025-09-30 18:48:06.566 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:48:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:07.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:07 compute-1 systemd[1]: Starting libvirt proxy daemon...
Sep 30 18:48:07 compute-1 systemd[1]: Started libvirt proxy daemon.
Sep 30 18:48:07 compute-1 ovn_controller[135204]: 2025-09-30T18:48:07Z|00285|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Sep 30 18:48:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Sep 30 18:48:07 compute-1 kernel: tapd728eab4-88: entered promiscuous mode
Sep 30 18:48:07 compute-1 NetworkManager[45549]: <info>  [1759258087.6028] manager: (tapd728eab4-88): new Tun device (/org/freedesktop/NetworkManager/Devices/108)
Sep 30 18:48:07 compute-1 ovn_controller[135204]: 2025-09-30T18:48:07Z|00286|binding|INFO|Claiming lport d728eab4-88db-4811-b199-c75155b08c82 for this additional chassis.
Sep 30 18:48:07 compute-1 ovn_controller[135204]: 2025-09-30T18:48:07Z|00287|binding|INFO|d728eab4-88db-4811-b199-c75155b08c82: Claiming fa:16:3e:b3:57:7e 10.100.0.14
Sep 30 18:48:07 compute-1 nova_compute[238822]: 2025-09-30 18:48:07.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.619 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:57:7e 10.100.0.14'], port_security=['fa:16:3e:b3:57:7e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '656a0137-3214-4992-a68a-cdbedf0336f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4658d55-a8f9-48f1-846d-61df3d830821', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3359c464e0344756a39ce5c7088b9eba', 'neutron:revision_number': '10', 'neutron:security_group_ids': '3a57c776-d79c-4096-859e-411dcf78cfa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0884332-fe68-47c8-9c8c-5c6a7c53f7f5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=d728eab4-88db-4811-b199-c75155b08c82) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.620 144543 INFO neutron.agent.ovn.metadata.agent [-] Port d728eab4-88db-4811-b199-c75155b08c82 in datapath f4658d55-a8f9-48f1-846d-61df3d830821 unbound from our chassis
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.622 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4658d55-a8f9-48f1-846d-61df3d830821
Sep 30 18:48:07 compute-1 ovn_controller[135204]: 2025-09-30T18:48:07Z|00288|binding|INFO|Setting lport d728eab4-88db-4811-b199-c75155b08c82 ovn-installed in OVS
Sep 30 18:48:07 compute-1 nova_compute[238822]: 2025-09-30 18:48:07.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:07 compute-1 nova_compute[238822]: 2025-09-30 18:48:07.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:07 compute-1 systemd-machined[195911]: New machine qemu-27-instance-00000024.
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.647 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e9c4d5e2-f846-40c1-9674-134ea4c4b656]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:07 compute-1 systemd[1]: Started Virtual Machine qemu-27-instance-00000024.
Sep 30 18:48:07 compute-1 systemd-udevd[302915]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.698 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[ff5b6b43-9ff0-48d3-9ae5-ed9dac8e84c0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:07 compute-1 ceph-mon[75484]: pgmap v2093: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 17 KiB/s wr, 1 op/s
Sep 30 18:48:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:48:07 compute-1 NetworkManager[45549]: <info>  [1759258087.7048] device (tapd728eab4-88): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:48:07 compute-1 NetworkManager[45549]: <info>  [1759258087.7061] device (tapd728eab4-88): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.709 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[814d6eed-2e3e-4176-9688-ee753e90f3ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.738 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[29eebb1f-8fff-47bd-aad6-5a0bc83df6c6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.753 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c9af9be5-ab81-422c-a5c7-0eec874e86df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4658d55-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:a8:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1589866, 'reachable_time': 15012, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302925, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.773 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[099848be-2105-4540-af7a-4408e74b1918]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf4658d55-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1589887, 'tstamp': 1589887}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 302927, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf4658d55-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1589892, 'tstamp': 1589892}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 302927, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.774 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4658d55-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:07 compute-1 nova_compute[238822]: 2025-09-30 18:48:07.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:07 compute-1 nova_compute[238822]: 2025-09-30 18:48:07.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.778 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4658d55-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.778 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.778 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4658d55-a0, col_values=(('external_ids', {'iface-id': '862fbe9e-132a-4b8a-83f6-7b020c6192ad'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.778 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:48:07 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:07.779 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[32a24dff-0a70-4953-a32e-1db1edc589e6]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f4658d55-a8f9-48f1-846d-61df3d830821\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f4658d55-a8f9-48f1-846d-61df3d830821\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:48:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:07.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:48:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:08 compute-1 nova_compute[238822]: 2025-09-30 18:48:08.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:09.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:48:09 compute-1 ceph-mon[75484]: pgmap v2094: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 5.0 KiB/s rd, 18 KiB/s wr, 7 op/s
Sep 30 18:48:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:09.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:10 compute-1 nova_compute[238822]: 2025-09-30 18:48:10.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:48:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:10 compute-1 nova_compute[238822]: 2025-09-30 18:48:10.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:48:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:11.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:48:11 compute-1 ceph-mon[75484]: pgmap v2095: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 4.7 KiB/s wr, 5 op/s
Sep 30 18:48:11 compute-1 ovn_controller[135204]: 2025-09-30T18:48:11Z|00289|binding|INFO|Claiming lport d728eab4-88db-4811-b199-c75155b08c82 for this chassis.
Sep 30 18:48:11 compute-1 ovn_controller[135204]: 2025-09-30T18:48:11Z|00290|binding|INFO|d728eab4-88db-4811-b199-c75155b08c82: Claiming fa:16:3e:b3:57:7e 10.100.0.14
Sep 30 18:48:11 compute-1 ovn_controller[135204]: 2025-09-30T18:48:11Z|00291|binding|INFO|Setting lport d728eab4-88db-4811-b199-c75155b08c82 up in Southbound
Sep 30 18:48:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:11.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:12 compute-1 nova_compute[238822]: 2025-09-30 18:48:12.977 2 INFO nova.compute.manager [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Post operation of migration started
Sep 30 18:48:12 compute-1 nova_compute[238822]: 2025-09-30 18:48:12.978 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:13 compute-1 nova_compute[238822]: 2025-09-30 18:48:13.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:48:13 compute-1 nova_compute[238822]: 2025-09-30 18:48:13.093 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:13 compute-1 nova_compute[238822]: 2025-09-30 18:48:13.094 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:13 compute-1 nova_compute[238822]: 2025-09-30 18:48:13.185 2 DEBUG oslo_concurrency.lockutils [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-656a0137-3214-4992-a68a-cdbedf0336f6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:48:13 compute-1 nova_compute[238822]: 2025-09-30 18:48:13.185 2 DEBUG oslo_concurrency.lockutils [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-656a0137-3214-4992-a68a-cdbedf0336f6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:48:13 compute-1 nova_compute[238822]: 2025-09-30 18:48:13.186 2 DEBUG nova.network.neutron [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:48:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:13.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:13 compute-1 unix_chkpwd[302978]: password check failed for user (root)
Sep 30 18:48:13 compute-1 sshd-session[302974]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245  user=root
Sep 30 18:48:13 compute-1 nova_compute[238822]: 2025-09-30 18:48:13.698 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:13 compute-1 ceph-mon[75484]: pgmap v2096: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 4.7 KiB/s rd, 4.7 KiB/s wr, 5 op/s
Sep 30 18:48:13 compute-1 nova_compute[238822]: 2025-09-30 18:48:13.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:48:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:13.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:48:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:48:14 compute-1 nova_compute[238822]: 2025-09-30 18:48:14.476 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:14 compute-1 nova_compute[238822]: 2025-09-30 18:48:14.627 2 DEBUG nova.network.neutron [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Updating instance_info_cache with network_info: [{"id": "d728eab4-88db-4811-b199-c75155b08c82", "address": "fa:16:3e:b3:57:7e", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd728eab4-88", "ovs_interfaceid": "d728eab4-88db-4811-b199-c75155b08c82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:48:14 compute-1 ceph-mon[75484]: pgmap v2097: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 4.7 KiB/s wr, 6 op/s
Sep 30 18:48:15 compute-1 nova_compute[238822]: 2025-09-30 18:48:15.133 2 DEBUG oslo_concurrency.lockutils [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-656a0137-3214-4992-a68a-cdbedf0336f6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:48:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:15 compute-1 sshd-session[302974]: Failed password for root from 49.49.32.245 port 56368 ssh2
Sep 30 18:48:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:48:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:15.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:48:15 compute-1 nova_compute[238822]: 2025-09-30 18:48:15.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:15 compute-1 podman[302982]: 2025-09-30 18:48:15.583923101 +0000 UTC m=+0.112597033 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:48:15 compute-1 podman[302981]: 2025-09-30 18:48:15.632913046 +0000 UTC m=+0.163922240 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller)
Sep 30 18:48:15 compute-1 nova_compute[238822]: 2025-09-30 18:48:15.656 2 DEBUG oslo_concurrency.lockutils [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:15 compute-1 nova_compute[238822]: 2025-09-30 18:48:15.657 2 DEBUG oslo_concurrency.lockutils [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:15 compute-1 nova_compute[238822]: 2025-09-30 18:48:15.658 2 DEBUG oslo_concurrency.lockutils [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:15 compute-1 nova_compute[238822]: 2025-09-30 18:48:15.665 2 INFO nova.virt.libvirt.driver [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:48:15 compute-1 virtqemud[239124]: Domain id=27 name='instance-00000024' uuid=656a0137-3214-4992-a68a-cdbedf0336f6 is tainted: custom-monitor
Sep 30 18:48:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:15.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:16 compute-1 sshd-session[302974]: Received disconnect from 49.49.32.245 port 56368:11: Bye Bye [preauth]
Sep 30 18:48:16 compute-1 sshd-session[302974]: Disconnected from authenticating user root 49.49.32.245 port 56368 [preauth]
Sep 30 18:48:16 compute-1 nova_compute[238822]: 2025-09-30 18:48:16.677 2 INFO nova.virt.libvirt.driver [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:48:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:17.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:17 compute-1 nova_compute[238822]: 2025-09-30 18:48:17.687 2 INFO nova.virt.libvirt.driver [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:48:17 compute-1 nova_compute[238822]: 2025-09-30 18:48:17.693 2 DEBUG nova.compute.manager [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:48:17 compute-1 ceph-mon[75484]: pgmap v2098: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 1.1 KiB/s wr, 6 op/s
Sep 30 18:48:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:17.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:18 compute-1 nova_compute[238822]: 2025-09-30 18:48:18.210 2 DEBUG nova.objects.instance [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:48:18 compute-1 nova_compute[238822]: 2025-09-30 18:48:18.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:19 compute-1 nova_compute[238822]: 2025-09-30 18:48:19.233 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:48:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:19.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:19 compute-1 nova_compute[238822]: 2025-09-30 18:48:19.328 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:19 compute-1 nova_compute[238822]: 2025-09-30 18:48:19.329 2 WARNING neutronclient.v2_0.client [None req-9c1f2a99-b127-49c4-a2e1-55cce65836c2 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:19 compute-1 openstack_network_exporter[251957]: ERROR   18:48:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:48:19 compute-1 openstack_network_exporter[251957]: ERROR   18:48:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:48:19 compute-1 openstack_network_exporter[251957]: ERROR   18:48:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:48:19 compute-1 openstack_network_exporter[251957]: ERROR   18:48:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:48:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:48:19 compute-1 openstack_network_exporter[251957]: ERROR   18:48:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:48:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:48:19 compute-1 podman[303039]: 2025-09-30 18:48:19.557869173 +0000 UTC m=+0.094513308 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 18:48:19 compute-1 ceph-mon[75484]: pgmap v2099: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 5.2 KiB/s rd, 1.1 KiB/s wr, 6 op/s
Sep 30 18:48:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:19.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:20 compute-1 nova_compute[238822]: 2025-09-30 18:48:20.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:21.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:21 compute-1 ceph-mon[75484]: pgmap v2100: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:48:21 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2994145042' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:48:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:48:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:21.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:48:21 compute-1 sudo[303063]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:48:21 compute-1 sudo[303063]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:48:21 compute-1 sudo[303063]: pam_unix(sudo:session): session closed for user root
Sep 30 18:48:22 compute-1 sshd-session[303060]: Invalid user www-data from 192.210.160.141 port 51606
Sep 30 18:48:22 compute-1 sshd-session[303060]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:48:22 compute-1 sshd-session[303060]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:48:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:48:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:48:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 4200.1 total, 600.0 interval
                                           Cumulative writes: 22K writes, 87K keys, 22K commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.02 MB/s
                                           Cumulative WAL: 22K writes, 7266 syncs, 3.10 writes per sync, written: 0.08 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 3577 writes, 13K keys, 3577 commit groups, 1.0 writes per commit group, ingest: 15.54 MB, 0.03 MB/s
                                           Interval WAL: 3577 writes, 1387 syncs, 2.58 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Sep 30 18:48:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:23.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:23 compute-1 ceph-mon[75484]: pgmap v2101: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:48:23 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/997633198' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:48:23 compute-1 nova_compute[238822]: 2025-09-30 18:48:23.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:23.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:24 compute-1 sshd-session[303060]: Failed password for invalid user www-data from 192.210.160.141 port 51606 ssh2
Sep 30 18:48:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:48:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:25.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:25 compute-1 nova_compute[238822]: 2025-09-30 18:48:25.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:25 compute-1 nova_compute[238822]: 2025-09-30 18:48:25.474 2 DEBUG oslo_concurrency.lockutils [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:25 compute-1 nova_compute[238822]: 2025-09-30 18:48:25.475 2 DEBUG oslo_concurrency.lockutils [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:25 compute-1 nova_compute[238822]: 2025-09-30 18:48:25.475 2 DEBUG oslo_concurrency.lockutils [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:25 compute-1 nova_compute[238822]: 2025-09-30 18:48:25.475 2 DEBUG oslo_concurrency.lockutils [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:25 compute-1 nova_compute[238822]: 2025-09-30 18:48:25.476 2 DEBUG oslo_concurrency.lockutils [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:25 compute-1 nova_compute[238822]: 2025-09-30 18:48:25.492 2 INFO nova.compute.manager [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Terminating instance
Sep 30 18:48:25 compute-1 ceph-mon[75484]: pgmap v2102: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 2.3 KiB/s wr, 1 op/s
Sep 30 18:48:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:25.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.019 2 DEBUG nova.compute.manager [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:48:26 compute-1 kernel: tap04db1ec4-f9 (unregistering): left promiscuous mode
Sep 30 18:48:26 compute-1 NetworkManager[45549]: <info>  [1759258106.0868] device (tap04db1ec4-f9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:48:26 compute-1 ovn_controller[135204]: 2025-09-30T18:48:26Z|00292|binding|INFO|Releasing lport 04db1ec4-f9a9-4209-8ff9-65cb658acd60 from this chassis (sb_readonly=0)
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:26 compute-1 ovn_controller[135204]: 2025-09-30T18:48:26Z|00293|binding|INFO|Setting lport 04db1ec4-f9a9-4209-8ff9-65cb658acd60 down in Southbound
Sep 30 18:48:26 compute-1 ovn_controller[135204]: 2025-09-30T18:48:26Z|00294|binding|INFO|Removing iface tap04db1ec4-f9 ovn-installed in OVS
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.173 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:fa:19 10.100.0.8'], port_security=['fa:16:3e:6b:fa:19 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4658d55-a8f9-48f1-846d-61df3d830821', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3359c464e0344756a39ce5c7088b9eba', 'neutron:revision_number': '5', 'neutron:security_group_ids': '3a57c776-d79c-4096-859e-411dcf78cfa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0884332-fe68-47c8-9c8c-5c6a7c53f7f5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=04db1ec4-f9a9-4209-8ff9-65cb658acd60) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.174 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 04db1ec4-f9a9-4209-8ff9-65cb658acd60 in datapath f4658d55-a8f9-48f1-846d-61df3d830821 unbound from our chassis
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.177 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4658d55-a8f9-48f1-846d-61df3d830821
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:26 compute-1 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000025.scope: Deactivated successfully.
Sep 30 18:48:26 compute-1 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000025.scope: Consumed 14.766s CPU time.
Sep 30 18:48:26 compute-1 systemd-machined[195911]: Machine qemu-26-instance-00000025 terminated.
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.218 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a4558c68-c965-4b22-a92e-876985c0d609]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.264 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[ed0698eb-144b-4eae-8606-2b6fccad182a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.268 2 INFO nova.virt.libvirt.driver [-] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Instance destroyed successfully.
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.269 2 DEBUG nova.objects.instance [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lazy-loading 'resources' on Instance uuid cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.269 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[edc6c9ad-4b42-423f-96cd-b485a33283dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:26 compute-1 podman[303092]: 2025-09-30 18:48:26.291835322 +0000 UTC m=+0.166402158 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Sep 30 18:48:26 compute-1 podman[303105]: 2025-09-30 18:48:26.296667551 +0000 UTC m=+0.119321193 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.301 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[51fa9618-50d7-4887-a168-3e6f464ee71b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.321 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d70dcca4-8d26-4b44-9221-914bc4f9a35e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4658d55-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:a8:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1589866, 'reachable_time': 15012, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 303170, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:26 compute-1 podman[303106]: 2025-09-30 18:48:26.324675383 +0000 UTC m=+0.146336719 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:48:26 compute-1 sshd-session[303060]: Connection closed by invalid user www-data 192.210.160.141 port 51606 [preauth]
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.342 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0b3ed5da-c1c6-43d0-ad6f-c8032c28c66e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf4658d55-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1589887, 'tstamp': 1589887}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 303172, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf4658d55-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1589892, 'tstamp': 1589892}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 303172, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.344 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4658d55-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.352 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4658d55-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.353 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.353 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4658d55-a0, col_values=(('external_ids', {'iface-id': '862fbe9e-132a-4b8a-83f6-7b020c6192ad'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.353 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:48:26 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:26.355 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[222db911-ecc2-463c-b515-fc510be2a1f3]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f4658d55-a8f9-48f1-846d-61df3d830821\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f4658d55-a8f9-48f1-846d-61df3d830821\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:26 compute-1 ceph-mon[75484]: pgmap v2103: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.787 2 DEBUG nova.virt.libvirt.vif [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:47:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1103328043',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1103328043',id=37,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:47:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3359c464e0344756a39ce5c7088b9eba',ramdisk_id='',reservation_id='r-ca3ok9f2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-613400940',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-613400940-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:47:39Z,user_data=None,user_id='f560266d133f4f1ba4a908e3cdcfa59d',uuid=cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "address": "fa:16:3e:6b:fa:19", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04db1ec4-f9", "ovs_interfaceid": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.788 2 DEBUG nova.network.os_vif_util [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converting VIF {"id": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "address": "fa:16:3e:6b:fa:19", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04db1ec4-f9", "ovs_interfaceid": "04db1ec4-f9a9-4209-8ff9-65cb658acd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.789 2 DEBUG nova.network.os_vif_util [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:fa:19,bridge_name='br-int',has_traffic_filtering=True,id=04db1ec4-f9a9-4209-8ff9-65cb658acd60,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04db1ec4-f9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.790 2 DEBUG os_vif [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:fa:19,bridge_name='br-int',has_traffic_filtering=True,id=04db1ec4-f9a9-4209-8ff9-65cb658acd60,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04db1ec4-f9') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.796 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04db1ec4-f9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.803 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=eceaf104-8f66-41bb-9f41-821d6dc0e91b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:26 compute-1 nova_compute[238822]: 2025-09-30 18:48:26.809 2 INFO os_vif [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:fa:19,bridge_name='br-int',has_traffic_filtering=True,id=04db1ec4-f9a9-4209-8ff9-65cb658acd60,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04db1ec4-f9')
Sep 30 18:48:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.236 2 DEBUG nova.compute.manager [req-595e50ae-5864-4ced-9e8d-259cf69b6bc5 req-88f055f4-ff8e-4e6f-b74a-0c7bcbccbb4b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Received event network-vif-unplugged-04db1ec4-f9a9-4209-8ff9-65cb658acd60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.237 2 DEBUG oslo_concurrency.lockutils [req-595e50ae-5864-4ced-9e8d-259cf69b6bc5 req-88f055f4-ff8e-4e6f-b74a-0c7bcbccbb4b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.237 2 DEBUG oslo_concurrency.lockutils [req-595e50ae-5864-4ced-9e8d-259cf69b6bc5 req-88f055f4-ff8e-4e6f-b74a-0c7bcbccbb4b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.238 2 DEBUG oslo_concurrency.lockutils [req-595e50ae-5864-4ced-9e8d-259cf69b6bc5 req-88f055f4-ff8e-4e6f-b74a-0c7bcbccbb4b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.238 2 DEBUG nova.compute.manager [req-595e50ae-5864-4ced-9e8d-259cf69b6bc5 req-88f055f4-ff8e-4e6f-b74a-0c7bcbccbb4b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] No waiting events found dispatching network-vif-unplugged-04db1ec4-f9a9-4209-8ff9-65cb658acd60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.239 2 DEBUG nova.compute.manager [req-595e50ae-5864-4ced-9e8d-259cf69b6bc5 req-88f055f4-ff8e-4e6f-b74a-0c7bcbccbb4b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Received event network-vif-unplugged-04db1ec4-f9a9-4209-8ff9-65cb658acd60 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:48:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:27.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.303 2 INFO nova.virt.libvirt.driver [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Deleting instance files /var/lib/nova/instances/cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_del
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.304 2 INFO nova.virt.libvirt.driver [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Deletion of /var/lib/nova/instances/cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072_del complete
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.821 2 INFO nova.compute.manager [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Took 1.80 seconds to destroy the instance on the hypervisor.
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.822 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.822 2 DEBUG nova.compute.manager [-] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.822 2 DEBUG nova.network.neutron [-] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.823 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:48:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:27.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:48:27 compute-1 nova_compute[238822]: 2025-09-30 18:48:27.972 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:28 compute-1 nova_compute[238822]: 2025-09-30 18:48:28.942 2 DEBUG nova.compute.manager [req-41ac338f-de6e-4ca6-9520-893ac2b1c7b3 req-9961241e-3263-49bb-b7ec-45b89295885c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Received event network-vif-deleted-04db1ec4-f9a9-4209-8ff9-65cb658acd60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:48:28 compute-1 nova_compute[238822]: 2025-09-30 18:48:28.943 2 INFO nova.compute.manager [req-41ac338f-de6e-4ca6-9520-893ac2b1c7b3 req-9961241e-3263-49bb-b7ec-45b89295885c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Neutron deleted interface 04db1ec4-f9a9-4209-8ff9-65cb658acd60; detaching it from the instance and deleting it from the info cache
Sep 30 18:48:28 compute-1 nova_compute[238822]: 2025-09-30 18:48:28.944 2 DEBUG nova.network.neutron [req-41ac338f-de6e-4ca6-9520-893ac2b1c7b3 req-9961241e-3263-49bb-b7ec-45b89295885c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:48:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:48:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:29.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:29 compute-1 nova_compute[238822]: 2025-09-30 18:48:29.330 2 DEBUG nova.compute.manager [req-2bcb0f4a-1ccb-4569-9034-718a84ece3af req-408c96eb-a2a7-42c6-9937-4cc5c68f5c16 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Received event network-vif-unplugged-04db1ec4-f9a9-4209-8ff9-65cb658acd60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:48:29 compute-1 nova_compute[238822]: 2025-09-30 18:48:29.330 2 DEBUG oslo_concurrency.lockutils [req-2bcb0f4a-1ccb-4569-9034-718a84ece3af req-408c96eb-a2a7-42c6-9937-4cc5c68f5c16 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:29 compute-1 nova_compute[238822]: 2025-09-30 18:48:29.331 2 DEBUG oslo_concurrency.lockutils [req-2bcb0f4a-1ccb-4569-9034-718a84ece3af req-408c96eb-a2a7-42c6-9937-4cc5c68f5c16 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:29 compute-1 nova_compute[238822]: 2025-09-30 18:48:29.331 2 DEBUG oslo_concurrency.lockutils [req-2bcb0f4a-1ccb-4569-9034-718a84ece3af req-408c96eb-a2a7-42c6-9937-4cc5c68f5c16 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:29 compute-1 nova_compute[238822]: 2025-09-30 18:48:29.331 2 DEBUG nova.compute.manager [req-2bcb0f4a-1ccb-4569-9034-718a84ece3af req-408c96eb-a2a7-42c6-9937-4cc5c68f5c16 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] No waiting events found dispatching network-vif-unplugged-04db1ec4-f9a9-4209-8ff9-65cb658acd60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:48:29 compute-1 nova_compute[238822]: 2025-09-30 18:48:29.332 2 DEBUG nova.compute.manager [req-2bcb0f4a-1ccb-4569-9034-718a84ece3af req-408c96eb-a2a7-42c6-9937-4cc5c68f5c16 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Received event network-vif-unplugged-04db1ec4-f9a9-4209-8ff9-65cb658acd60 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:48:29 compute-1 nova_compute[238822]: 2025-09-30 18:48:29.380 2 DEBUG nova.network.neutron [-] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:48:29 compute-1 nova_compute[238822]: 2025-09-30 18:48:29.454 2 DEBUG nova.compute.manager [req-41ac338f-de6e-4ca6-9520-893ac2b1c7b3 req-9961241e-3263-49bb-b7ec-45b89295885c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Detach interface failed, port_id=04db1ec4-f9a9-4209-8ff9-65cb658acd60, reason: Instance cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:48:29 compute-1 ceph-mon[75484]: pgmap v2104: 353 pgs: 353 active+clean; 121 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 12 KiB/s wr, 30 op/s
Sep 30 18:48:29 compute-1 nova_compute[238822]: 2025-09-30 18:48:29.889 2 INFO nova.compute.manager [-] [instance: cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072] Took 2.07 seconds to deallocate network for instance.
Sep 30 18:48:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:48:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:29.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:48:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:30 compute-1 nova_compute[238822]: 2025-09-30 18:48:30.418 2 DEBUG oslo_concurrency.lockutils [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:30 compute-1 nova_compute[238822]: 2025-09-30 18:48:30.419 2 DEBUG oslo_concurrency.lockutils [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:30 compute-1 nova_compute[238822]: 2025-09-30 18:48:30.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:30 compute-1 nova_compute[238822]: 2025-09-30 18:48:30.492 2 DEBUG oslo_concurrency.processutils [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:48:30 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:48:30 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1388053340' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:48:31 compute-1 nova_compute[238822]: 2025-09-30 18:48:31.002 2 DEBUG oslo_concurrency.processutils [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:48:31 compute-1 nova_compute[238822]: 2025-09-30 18:48:31.010 2 DEBUG nova.compute.provider_tree [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:48:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:31.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:31 compute-1 nova_compute[238822]: 2025-09-30 18:48:31.520 2 DEBUG nova.scheduler.client.report [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:48:31 compute-1 ceph-mon[75484]: pgmap v2105: 353 pgs: 353 active+clean; 121 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 12 KiB/s wr, 30 op/s
Sep 30 18:48:31 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1388053340' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:48:31 compute-1 nova_compute[238822]: 2025-09-30 18:48:31.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:31.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:32 compute-1 nova_compute[238822]: 2025-09-30 18:48:32.035 2 DEBUG oslo_concurrency.lockutils [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.616s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:32 compute-1 nova_compute[238822]: 2025-09-30 18:48:32.282 2 INFO nova.scheduler.client.report [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Deleted allocations for instance cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072
Sep 30 18:48:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:48:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:33.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:48:33 compute-1 nova_compute[238822]: 2025-09-30 18:48:33.322 2 DEBUG oslo_concurrency.lockutils [None req-19827d8c-765e-4f29-9ac0-9eae9671e66d f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "cb00548c-e53f-4d4e-b1b7-c3a1ae2f5072" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.847s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:33 compute-1 ceph-mon[75484]: pgmap v2106: 353 pgs: 353 active+clean; 121 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 12 KiB/s wr, 30 op/s
Sep 30 18:48:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:48:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:33.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:48:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:48:34 compute-1 nova_compute[238822]: 2025-09-30 18:48:34.866 2 DEBUG oslo_concurrency.lockutils [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "656a0137-3214-4992-a68a-cdbedf0336f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:34 compute-1 nova_compute[238822]: 2025-09-30 18:48:34.867 2 DEBUG oslo_concurrency.lockutils [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "656a0137-3214-4992-a68a-cdbedf0336f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:34 compute-1 nova_compute[238822]: 2025-09-30 18:48:34.867 2 DEBUG oslo_concurrency.lockutils [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "656a0137-3214-4992-a68a-cdbedf0336f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:34 compute-1 nova_compute[238822]: 2025-09-30 18:48:34.868 2 DEBUG oslo_concurrency.lockutils [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "656a0137-3214-4992-a68a-cdbedf0336f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:34 compute-1 nova_compute[238822]: 2025-09-30 18:48:34.868 2 DEBUG oslo_concurrency.lockutils [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "656a0137-3214-4992-a68a-cdbedf0336f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:34 compute-1 nova_compute[238822]: 2025-09-30 18:48:34.885 2 INFO nova.compute.manager [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Terminating instance
Sep 30 18:48:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:35.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.412 2 DEBUG nova.compute.manager [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:35 compute-1 kernel: tapd728eab4-88 (unregistering): left promiscuous mode
Sep 30 18:48:35 compute-1 NetworkManager[45549]: <info>  [1759258115.4883] device (tapd728eab4-88): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:35 compute-1 ovn_controller[135204]: 2025-09-30T18:48:35Z|00295|binding|INFO|Releasing lport d728eab4-88db-4811-b199-c75155b08c82 from this chassis (sb_readonly=0)
Sep 30 18:48:35 compute-1 ovn_controller[135204]: 2025-09-30T18:48:35Z|00296|binding|INFO|Setting lport d728eab4-88db-4811-b199-c75155b08c82 down in Southbound
Sep 30 18:48:35 compute-1 ovn_controller[135204]: 2025-09-30T18:48:35Z|00297|binding|INFO|Removing iface tapd728eab4-88 ovn-installed in OVS
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.514 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:57:7e 10.100.0.14'], port_security=['fa:16:3e:b3:57:7e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '656a0137-3214-4992-a68a-cdbedf0336f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4658d55-a8f9-48f1-846d-61df3d830821', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3359c464e0344756a39ce5c7088b9eba', 'neutron:revision_number': '14', 'neutron:security_group_ids': '3a57c776-d79c-4096-859e-411dcf78cfa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0884332-fe68-47c8-9c8c-5c6a7c53f7f5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=d728eab4-88db-4811-b199-c75155b08c82) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.515 144543 INFO neutron.agent.ovn.metadata.agent [-] Port d728eab4-88db-4811-b199-c75155b08c82 in datapath f4658d55-a8f9-48f1-846d-61df3d830821 unbound from our chassis
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.517 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f4658d55-a8f9-48f1-846d-61df3d830821, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.518 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ee1a98dc-0f5b-4b12-bdfa-a5db80628711]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.518 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821 namespace which is not needed anymore
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:35 compute-1 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000024.scope: Deactivated successfully.
Sep 30 18:48:35 compute-1 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000024.scope: Consumed 3.490s CPU time.
Sep 30 18:48:35 compute-1 systemd-machined[195911]: Machine qemu-27-instance-00000024 terminated.
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:35 compute-1 podman[249638]: time="2025-09-30T18:48:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:48:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.664 2 INFO nova.virt.libvirt.driver [-] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Instance destroyed successfully.
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.665 2 DEBUG nova.objects.instance [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lazy-loading 'resources' on Instance uuid 656a0137-3214-4992-a68a-cdbedf0336f6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:48:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:48:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8825 "" "Go-http-client/1.1"
Sep 30 18:48:35 compute-1 podman[303253]: 2025-09-30 18:48:35.70347047 +0000 UTC m=+0.059742444 container kill db8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:48:35 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[302602]: [NOTICE]   (302606) : haproxy version is 3.0.5-8e879a5
Sep 30 18:48:35 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[302602]: [NOTICE]   (302606) : path to executable is /usr/sbin/haproxy
Sep 30 18:48:35 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[302602]: [WARNING]  (302606) : Exiting Master process...
Sep 30 18:48:35 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[302602]: [ALERT]    (302606) : Current worker (302608) exited with code 143 (Terminated)
Sep 30 18:48:35 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[302602]: [WARNING]  (302606) : All workers exited. Exiting... (0)
Sep 30 18:48:35 compute-1 sudo[303248]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.712 2 DEBUG nova.compute.manager [req-6cefb8ef-0873-4f4c-b531-20fdd5ac18e2 req-9fc9e1ed-0fba-4dda-9d2c-2f08e93aa1fc 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Received event network-vif-unplugged-d728eab4-88db-4811-b199-c75155b08c82 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.713 2 DEBUG oslo_concurrency.lockutils [req-6cefb8ef-0873-4f4c-b531-20fdd5ac18e2 req-9fc9e1ed-0fba-4dda-9d2c-2f08e93aa1fc 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "656a0137-3214-4992-a68a-cdbedf0336f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.713 2 DEBUG oslo_concurrency.lockutils [req-6cefb8ef-0873-4f4c-b531-20fdd5ac18e2 req-9fc9e1ed-0fba-4dda-9d2c-2f08e93aa1fc 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "656a0137-3214-4992-a68a-cdbedf0336f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:35 compute-1 systemd[1]: libpod-db8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781.scope: Deactivated successfully.
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.714 2 DEBUG oslo_concurrency.lockutils [req-6cefb8ef-0873-4f4c-b531-20fdd5ac18e2 req-9fc9e1ed-0fba-4dda-9d2c-2f08e93aa1fc 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "656a0137-3214-4992-a68a-cdbedf0336f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.714 2 DEBUG nova.compute.manager [req-6cefb8ef-0873-4f4c-b531-20fdd5ac18e2 req-9fc9e1ed-0fba-4dda-9d2c-2f08e93aa1fc 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] No waiting events found dispatching network-vif-unplugged-d728eab4-88db-4811-b199-c75155b08c82 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.714 2 DEBUG nova.compute.manager [req-6cefb8ef-0873-4f4c-b531-20fdd5ac18e2 req-9fc9e1ed-0fba-4dda-9d2c-2f08e93aa1fc 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Received event network-vif-unplugged-d728eab4-88db-4811-b199-c75155b08c82 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:48:35 compute-1 sudo[303248]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:48:35 compute-1 sudo[303248]: pam_unix(sudo:session): session closed for user root
Sep 30 18:48:35 compute-1 ceph-mon[75484]: pgmap v2107: 353 pgs: 353 active+clean; 121 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 12 KiB/s wr, 30 op/s
Sep 30 18:48:35 compute-1 podman[303294]: 2025-09-30 18:48:35.779754668 +0000 UTC m=+0.044933388 container died db8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 18:48:35 compute-1 sudo[303300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:48:35 compute-1 sudo[303300]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:48:35 compute-1 systemd[1]: var-lib-containers-storage-overlay-92514c849bc5c8e629ab50dcc4d98cf3ff384cabdbe94f09497b48bc5da688ae-merged.mount: Deactivated successfully.
Sep 30 18:48:35 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781-userdata-shm.mount: Deactivated successfully.
Sep 30 18:48:35 compute-1 podman[303294]: 2025-09-30 18:48:35.827835548 +0000 UTC m=+0.093014238 container cleanup db8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:48:35 compute-1 systemd[1]: libpod-conmon-db8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781.scope: Deactivated successfully.
Sep 30 18:48:35 compute-1 podman[303297]: 2025-09-30 18:48:35.855020008 +0000 UTC m=+0.108779671 container remove db8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.864 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[670f24ca-dbf6-462b-9a6a-54963549f4e5]: (4, ("Tue Sep 30 06:48:35 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821 (db8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781)\ndb8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781\nTue Sep 30 06:48:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821 (db8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781)\ndb8b0cd7c78234eb3083fcb5c777aa31ead71cdd601c861d4b79ac56bb7d7781\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.867 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[88dccec0-5109-4f0a-9ed9-f7c4d8a1aacd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.868 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.868 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ca887c17-f5f8-4f01-8c1b-2d9554b700f3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.869 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4658d55-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:35 compute-1 kernel: tapf4658d55-a0: left promiscuous mode
Sep 30 18:48:35 compute-1 nova_compute[238822]: 2025-09-30 18:48:35.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.907 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e9cb654b-0482-4306-a4ba-d4f596f7af07]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.936 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0f4a0e97-b0ec-4c69-9c6b-f4b062dcb996]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.937 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[43cc20da-6907-4678-9b1b-4105ab295237]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.965 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4a07206d-7cff-416f-984b-ea03ebecc1a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1589852, 'reachable_time': 18084, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 303353, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.969 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:48:35 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:35.970 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c4a693-e40b-435b-ada1-d6e3fced92c6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:48:35 compute-1 systemd[1]: run-netns-ovnmeta\x2df4658d55\x2da8f9\x2d48f1\x2d846d\x2d61df3d830821.mount: Deactivated successfully.
Sep 30 18:48:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:35.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.175 2 DEBUG nova.virt.libvirt.vif [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:46:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-752609519',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-752609519',id=36,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:47:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3359c464e0344756a39ce5c7088b9eba',ramdisk_id='',reservation_id='r-9hll55u7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vi
rtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-613400940',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-613400940-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:48:18Z,user_data=None,user_id='f560266d133f4f1ba4a908e3cdcfa59d',uuid=656a0137-3214-4992-a68a-cdbedf0336f6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d728eab4-88db-4811-b199-c75155b08c82", "address": "fa:16:3e:b3:57:7e", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd728eab4-88", "ovs_interfaceid": "d728eab4-88db-4811-b199-c75155b08c82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.176 2 DEBUG nova.network.os_vif_util [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converting VIF {"id": "d728eab4-88db-4811-b199-c75155b08c82", "address": "fa:16:3e:b3:57:7e", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd728eab4-88", "ovs_interfaceid": "d728eab4-88db-4811-b199-c75155b08c82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.177 2 DEBUG nova.network.os_vif_util [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b3:57:7e,bridge_name='br-int',has_traffic_filtering=True,id=d728eab4-88db-4811-b199-c75155b08c82,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd728eab4-88') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.177 2 DEBUG os_vif [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:57:7e,bridge_name='br-int',has_traffic_filtering=True,id=d728eab4-88db-4811-b199-c75155b08c82,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd728eab4-88') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.180 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd728eab4-88, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.229 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b4818e22-d37c-4de7-b17d-af613c865afe) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.235 2 INFO os_vif [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:57:7e,bridge_name='br-int',has_traffic_filtering=True,id=d728eab4-88db-4811-b199-c75155b08c82,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd728eab4-88')
Sep 30 18:48:36 compute-1 sudo[303300]: pam_unix(sudo:session): session closed for user root
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.693 2 INFO nova.virt.libvirt.driver [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Deleting instance files /var/lib/nova/instances/656a0137-3214-4992-a68a-cdbedf0336f6_del
Sep 30 18:48:36 compute-1 nova_compute[238822]: 2025-09-30 18:48:36.694 2 INFO nova.virt.libvirt.driver [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Deletion of /var/lib/nova/instances/656a0137-3214-4992-a68a-cdbedf0336f6_del complete
Sep 30 18:48:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2151263999' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:48:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2151263999' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:48:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:48:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:48:36 compute-1 ceph-mon[75484]: pgmap v2108: 353 pgs: 353 active+clean; 121 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 9.3 KiB/s wr, 30 op/s
Sep 30 18:48:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:48:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:48:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:48:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:48:36 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:48:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:37 compute-1 nova_compute[238822]: 2025-09-30 18:48:37.213 2 INFO nova.compute.manager [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Took 1.80 seconds to destroy the instance on the hypervisor.
Sep 30 18:48:37 compute-1 nova_compute[238822]: 2025-09-30 18:48:37.213 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:48:37 compute-1 nova_compute[238822]: 2025-09-30 18:48:37.214 2 DEBUG nova.compute.manager [-] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:48:37 compute-1 nova_compute[238822]: 2025-09-30 18:48:37.214 2 DEBUG nova.network.neutron [-] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:48:37 compute-1 nova_compute[238822]: 2025-09-30 18:48:37.215 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:37.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:37 compute-1 nova_compute[238822]: 2025-09-30 18:48:37.532 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:48:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:48:37 compute-1 nova_compute[238822]: 2025-09-30 18:48:37.803 2 DEBUG nova.compute.manager [req-9e08b339-41b5-441d-96f2-168dbe985da1 req-84e1e84b-1875-4d10-960b-1183754c9ee5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Received event network-vif-unplugged-d728eab4-88db-4811-b199-c75155b08c82 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:48:37 compute-1 nova_compute[238822]: 2025-09-30 18:48:37.804 2 DEBUG oslo_concurrency.lockutils [req-9e08b339-41b5-441d-96f2-168dbe985da1 req-84e1e84b-1875-4d10-960b-1183754c9ee5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "656a0137-3214-4992-a68a-cdbedf0336f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:37 compute-1 nova_compute[238822]: 2025-09-30 18:48:37.804 2 DEBUG oslo_concurrency.lockutils [req-9e08b339-41b5-441d-96f2-168dbe985da1 req-84e1e84b-1875-4d10-960b-1183754c9ee5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "656a0137-3214-4992-a68a-cdbedf0336f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:37 compute-1 nova_compute[238822]: 2025-09-30 18:48:37.804 2 DEBUG oslo_concurrency.lockutils [req-9e08b339-41b5-441d-96f2-168dbe985da1 req-84e1e84b-1875-4d10-960b-1183754c9ee5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "656a0137-3214-4992-a68a-cdbedf0336f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:37 compute-1 nova_compute[238822]: 2025-09-30 18:48:37.805 2 DEBUG nova.compute.manager [req-9e08b339-41b5-441d-96f2-168dbe985da1 req-84e1e84b-1875-4d10-960b-1183754c9ee5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] No waiting events found dispatching network-vif-unplugged-d728eab4-88db-4811-b199-c75155b08c82 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:48:37 compute-1 nova_compute[238822]: 2025-09-30 18:48:37.805 2 DEBUG nova.compute.manager [req-9e08b339-41b5-441d-96f2-168dbe985da1 req-84e1e84b-1875-4d10-960b-1183754c9ee5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Received event network-vif-unplugged-d728eab4-88db-4811-b199-c75155b08c82 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:48:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:37.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:38 compute-1 nova_compute[238822]: 2025-09-30 18:48:38.416 2 DEBUG nova.network.neutron [-] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:48:38 compute-1 ceph-mon[75484]: pgmap v2109: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 39 KiB/s rd, 10 KiB/s wr, 58 op/s
Sep 30 18:48:38 compute-1 nova_compute[238822]: 2025-09-30 18:48:38.932 2 INFO nova.compute.manager [-] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Took 1.72 seconds to deallocate network for instance.
Sep 30 18:48:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:48:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:39.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:39 compute-1 nova_compute[238822]: 2025-09-30 18:48:39.460 2 DEBUG oslo_concurrency.lockutils [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:39 compute-1 nova_compute[238822]: 2025-09-30 18:48:39.461 2 DEBUG oslo_concurrency.lockutils [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:39 compute-1 nova_compute[238822]: 2025-09-30 18:48:39.467 2 DEBUG oslo_concurrency.lockutils [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:39 compute-1 nova_compute[238822]: 2025-09-30 18:48:39.517 2 INFO nova.scheduler.client.report [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Deleted allocations for instance 656a0137-3214-4992-a68a-cdbedf0336f6
Sep 30 18:48:39 compute-1 nova_compute[238822]: 2025-09-30 18:48:39.902 2 DEBUG nova.compute.manager [req-d1f6dbc2-1625-4bd8-9287-49740454676e req-c31cc8fd-89f7-4bcc-99e5-d90e56b76caf 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 656a0137-3214-4992-a68a-cdbedf0336f6] Received event network-vif-deleted-d728eab4-88db-4811-b199-c75155b08c82 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:48:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:39.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:40 compute-1 nova_compute[238822]: 2025-09-30 18:48:40.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:40 compute-1 nova_compute[238822]: 2025-09-30 18:48:40.554 2 DEBUG oslo_concurrency.lockutils [None req-ccb29e4a-fd8d-47ae-b1e9-4d46ad220970 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "656a0137-3214-4992-a68a-cdbedf0336f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.688s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:41 compute-1 nova_compute[238822]: 2025-09-30 18:48:41.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:41.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:41 compute-1 ceph-mon[75484]: pgmap v2110: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:48:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:48:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:41.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:48:42 compute-1 sudo[303410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:48:42 compute-1 sudo[303410]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:48:42 compute-1 sudo[303410]: pam_unix(sudo:session): session closed for user root
Sep 30 18:48:42 compute-1 sudo[303433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:48:42 compute-1 sudo[303433]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:48:42 compute-1 sudo[303433]: pam_unix(sudo:session): session closed for user root
Sep 30 18:48:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:48:42 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:48:42 compute-1 ceph-mon[75484]: pgmap v2111: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:48:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:43.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:43.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:48:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:45.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:45 compute-1 nova_compute[238822]: 2025-09-30 18:48:45.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:45 compute-1 ceph-mon[75484]: pgmap v2112: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:48:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:45.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:46 compute-1 nova_compute[238822]: 2025-09-30 18:48:46.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:46 compute-1 podman[303466]: 2025-09-30 18:48:46.545381538 +0000 UTC m=+0.076889565 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:48:46 compute-1 podman[303465]: 2025-09-30 18:48:46.611522173 +0000 UTC m=+0.153351157 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 18:48:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:48:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:47.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:48:47 compute-1 ceph-mon[75484]: pgmap v2113: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:48:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:47.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:48 compute-1 unix_chkpwd[303521]: password check failed for user (root)
Sep 30 18:48:48 compute-1 sshd-session[303517]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:48:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:48:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:49.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:49 compute-1 openstack_network_exporter[251957]: ERROR   18:48:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:48:49 compute-1 openstack_network_exporter[251957]: ERROR   18:48:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:48:49 compute-1 openstack_network_exporter[251957]: ERROR   18:48:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:48:49 compute-1 openstack_network_exporter[251957]: ERROR   18:48:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:48:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:48:49 compute-1 openstack_network_exporter[251957]: ERROR   18:48:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:48:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:48:49 compute-1 ceph-mon[75484]: pgmap v2114: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:48:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:49.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:50 compute-1 nova_compute[238822]: 2025-09-30 18:48:50.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:50 compute-1 podman[303523]: 2025-09-30 18:48:50.534945078 +0000 UTC m=+0.079725301 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 18:48:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:51 compute-1 sshd-session[303517]: Failed password for root from 192.210.160.141 port 50614 ssh2
Sep 30 18:48:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:51.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:51 compute-1 nova_compute[238822]: 2025-09-30 18:48:51.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:51 compute-1 ceph-mon[75484]: pgmap v2115: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:48:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:48:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:51.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:48:52 compute-1 sshd-session[303517]: Connection closed by authenticating user root 192.210.160.141 port 50614 [preauth]
Sep 30 18:48:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:48:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:53.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:53 compute-1 ceph-mon[75484]: pgmap v2116: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:48:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:48:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:53.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:48:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:48:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:54.431 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:54.432 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:48:54.432 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:55.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:55 compute-1 nova_compute[238822]: 2025-09-30 18:48:55.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:55 compute-1 ceph-mon[75484]: pgmap v2117: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:48:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:55.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:56 compute-1 nova_compute[238822]: 2025-09-30 18:48:56.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:48:56 compute-1 podman[303551]: 2025-09-30 18:48:56.552111169 +0000 UTC m=+0.091357513 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20250930, managed_by=edpm_ansible)
Sep 30 18:48:56 compute-1 podman[303552]: 2025-09-30 18:48:56.577380268 +0000 UTC m=+0.109349546 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Sep 30 18:48:56 compute-1 podman[303553]: 2025-09-30 18:48:56.589190055 +0000 UTC m=+0.120000072 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 18:48:57 compute-1 nova_compute[238822]: 2025-09-30 18:48:57.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:48:57 compute-1 nova_compute[238822]: 2025-09-30 18:48:57.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:48:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:57.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:57 compute-1 nova_compute[238822]: 2025-09-30 18:48:57.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:57 compute-1 nova_compute[238822]: 2025-09-30 18:48:57.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:57 compute-1 nova_compute[238822]: 2025-09-30 18:48:57.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:48:57 compute-1 nova_compute[238822]: 2025-09-30 18:48:57.574 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:48:57 compute-1 nova_compute[238822]: 2025-09-30 18:48:57.574 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:48:57 compute-1 ceph-mon[75484]: pgmap v2118: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:48:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3074177764' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:48:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3074177764' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:48:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:48:58.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:48:58 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4288130120' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:48:58 compute-1 nova_compute[238822]: 2025-09-30 18:48:58.080 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:48:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:58 compute-1 nova_compute[238822]: 2025-09-30 18:48:58.334 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:48:58 compute-1 nova_compute[238822]: 2025-09-30 18:48:58.335 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:48:58 compute-1 nova_compute[238822]: 2025-09-30 18:48:58.367 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:48:58 compute-1 nova_compute[238822]: 2025-09-30 18:48:58.368 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4672MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:48:58 compute-1 nova_compute[238822]: 2025-09-30 18:48:58.368 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:48:58 compute-1 nova_compute[238822]: 2025-09-30 18:48:58.369 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:48:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4288130120' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:48:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:48:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:48:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:48:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:48:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:48:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:48:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:48:59.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:48:59 compute-1 nova_compute[238822]: 2025-09-30 18:48:59.421 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:48:59 compute-1 nova_compute[238822]: 2025-09-30 18:48:59.422 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:48:58 up  4:26,  0 user,  load average: 0.50, 0.42, 0.49\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:48:59 compute-1 nova_compute[238822]: 2025-09-30 18:48:59.437 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:48:59 compute-1 ceph-mon[75484]: pgmap v2119: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:48:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:48:59 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2510765074' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:48:59 compute-1 nova_compute[238822]: 2025-09-30 18:48:59.914 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:48:59 compute-1 nova_compute[238822]: 2025-09-30 18:48:59.922 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:49:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:00.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:00 compute-1 nova_compute[238822]: 2025-09-30 18:49:00.431 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:49:00 compute-1 nova_compute[238822]: 2025-09-30 18:49:00.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2510765074' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:49:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1850080561' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:49:00 compute-1 ceph-mon[75484]: pgmap v2120: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:49:00 compute-1 nova_compute[238822]: 2025-09-30 18:49:00.944 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:49:00 compute-1 nova_compute[238822]: 2025-09-30 18:49:00.945 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.576s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:49:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:01.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:01 compute-1 nova_compute[238822]: 2025-09-30 18:49:01.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1207412588' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:49:01 compute-1 nova_compute[238822]: 2025-09-30 18:49:01.945 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:49:01 compute-1 nova_compute[238822]: 2025-09-30 18:49:01.946 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:49:01 compute-1 nova_compute[238822]: 2025-09-30 18:49:01.946 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:49:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:02.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:02 compute-1 nova_compute[238822]: 2025-09-30 18:49:02.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:49:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:02 compute-1 sudo[303665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:49:02 compute-1 sudo[303665]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:49:02 compute-1 sudo[303665]: pam_unix(sudo:session): session closed for user root
Sep 30 18:49:02 compute-1 ceph-mon[75484]: pgmap v2121: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:49:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:03.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:04.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:49:04 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:04.351 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:49:04 compute-1 nova_compute[238822]: 2025-09-30 18:49:04.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:04 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:04.352 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:49:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1045675100' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:49:04 compute-1 sshd-session[303692]: Invalid user dci from 161.132.50.17 port 32974
Sep 30 18:49:04 compute-1 sshd-session[303692]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:49:04 compute-1 sshd-session[303692]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 18:49:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:05.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:05 compute-1 ceph-mon[75484]: pgmap v2122: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:49:05 compute-1 nova_compute[238822]: 2025-09-30 18:49:05.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:05 compute-1 podman[249638]: time="2025-09-30T18:49:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:49:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:49:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:49:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:49:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8358 "" "Go-http-client/1.1"
Sep 30 18:49:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:06.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:06 compute-1 nova_compute[238822]: 2025-09-30 18:49:06.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:07 compute-1 sshd-session[303692]: Failed password for invalid user dci from 161.132.50.17 port 32974 ssh2
Sep 30 18:49:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:07.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:07 compute-1 ceph-mon[75484]: pgmap v2123: 353 pgs: 353 active+clean; 41 MiB data, 399 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:49:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:49:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:08.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:08 compute-1 nova_compute[238822]: 2025-09-30 18:49:08.054 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:49:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:08 compute-1 sshd-session[303692]: Received disconnect from 161.132.50.17 port 32974:11: Bye Bye [preauth]
Sep 30 18:49:08 compute-1 sshd-session[303692]: Disconnected from invalid user dci 161.132.50.17 port 32974 [preauth]
Sep 30 18:49:09 compute-1 sshd-session[303700]: Invalid user chris from 8.243.64.201 port 34104
Sep 30 18:49:09 compute-1 sshd-session[303700]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:49:09 compute-1 sshd-session[303700]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:49:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:49:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:09.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:09 compute-1 ceph-mon[75484]: pgmap v2124: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:49:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:10.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:10 compute-1 nova_compute[238822]: 2025-09-30 18:49:10.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:49:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:10 compute-1 nova_compute[238822]: 2025-09-30 18:49:10.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:10 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4196521684' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:49:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:11 compute-1 sshd-session[303700]: Failed password for invalid user chris from 8.243.64.201 port 34104 ssh2
Sep 30 18:49:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:11.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:11 compute-1 nova_compute[238822]: 2025-09-30 18:49:11.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:11 compute-1 ceph-mon[75484]: pgmap v2125: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:49:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1626535892' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:49:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:12.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:12 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:12.355 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:49:13 compute-1 sshd-session[303700]: Received disconnect from 8.243.64.201 port 34104:11: Bye Bye [preauth]
Sep 30 18:49:13 compute-1 sshd-session[303700]: Disconnected from invalid user chris 8.243.64.201 port 34104 [preauth]
Sep 30 18:49:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:13.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:13 compute-1 ceph-mon[75484]: pgmap v2126: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:49:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:14.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:49:15 compute-1 nova_compute[238822]: 2025-09-30 18:49:15.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:49:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:15.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:15 compute-1 nova_compute[238822]: 2025-09-30 18:49:15.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:15 compute-1 ceph-mon[75484]: pgmap v2127: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:49:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:49:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:16.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:49:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:16 compute-1 nova_compute[238822]: 2025-09-30 18:49:16.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:16 compute-1 unix_chkpwd[303714]: password check failed for user (root)
Sep 30 18:49:16 compute-1 sshd-session[303708]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:49:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:17.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:17 compute-1 sshd-session[303711]: Invalid user alex from 49.49.32.245 port 51558
Sep 30 18:49:17 compute-1 sshd-session[303711]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:49:17 compute-1 sshd-session[303711]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245
Sep 30 18:49:17 compute-1 podman[303717]: 2025-09-30 18:49:17.482977134 +0000 UTC m=+0.095580525 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:49:17 compute-1 podman[303716]: 2025-09-30 18:49:17.575766535 +0000 UTC m=+0.192924589 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Sep 30 18:49:17 compute-1 ceph-mon[75484]: pgmap v2128: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:49:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:49:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:18.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:49:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:18 compute-1 sshd-session[303711]: Failed password for invalid user alex from 49.49.32.245 port 51558 ssh2
Sep 30 18:49:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:49:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:19.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:19 compute-1 sshd-session[303708]: Failed password for root from 192.210.160.141 port 40890 ssh2
Sep 30 18:49:19 compute-1 sshd-session[303711]: Received disconnect from 49.49.32.245 port 51558:11: Bye Bye [preauth]
Sep 30 18:49:19 compute-1 sshd-session[303711]: Disconnected from invalid user alex 49.49.32.245 port 51558 [preauth]
Sep 30 18:49:19 compute-1 openstack_network_exporter[251957]: ERROR   18:49:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:49:19 compute-1 openstack_network_exporter[251957]: ERROR   18:49:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:49:19 compute-1 openstack_network_exporter[251957]: ERROR   18:49:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:49:19 compute-1 openstack_network_exporter[251957]: ERROR   18:49:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:49:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:49:19 compute-1 openstack_network_exporter[251957]: ERROR   18:49:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:49:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:49:19 compute-1 ceph-mon[75484]: pgmap v2129: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Sep 30 18:49:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:20.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:20 compute-1 nova_compute[238822]: 2025-09-30 18:49:20.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:20 compute-1 ceph-mon[75484]: pgmap v2130: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:49:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:21.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:21 compute-1 nova_compute[238822]: 2025-09-30 18:49:21.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:21 compute-1 podman[303770]: 2025-09-30 18:49:21.599779099 +0000 UTC m=+0.096573174 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 18:49:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:22.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:22 compute-1 sudo[303790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:49:22 compute-1 sudo[303790]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:49:22 compute-1 sudo[303790]: pam_unix(sudo:session): session closed for user root
Sep 30 18:49:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:49:22 compute-1 sshd-session[303708]: Connection closed by authenticating user root 192.210.160.141 port 40890 [preauth]
Sep 30 18:49:22 compute-1 nova_compute[238822]: 2025-09-30 18:49:22.611 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "838798ef-0563-40a9-af50-22403624c69e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:49:22 compute-1 nova_compute[238822]: 2025-09-30 18:49:22.612 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:49:23 compute-1 nova_compute[238822]: 2025-09-30 18:49:23.122 2 DEBUG nova.compute.manager [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:49:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:23.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:23 compute-1 ceph-mon[75484]: pgmap v2131: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:49:23 compute-1 nova_compute[238822]: 2025-09-30 18:49:23.688 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:49:23 compute-1 nova_compute[238822]: 2025-09-30 18:49:23.689 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:49:23 compute-1 nova_compute[238822]: 2025-09-30 18:49:23.700 2 DEBUG nova.virt.hardware [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:49:23 compute-1 nova_compute[238822]: 2025-09-30 18:49:23.701 2 INFO nova.compute.claims [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:49:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:24.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:49:24 compute-1 nova_compute[238822]: 2025-09-30 18:49:24.781 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:49:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:25 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:49:25 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/758503153' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:49:25 compute-1 nova_compute[238822]: 2025-09-30 18:49:25.256 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:49:25 compute-1 nova_compute[238822]: 2025-09-30 18:49:25.267 2 DEBUG nova.compute.provider_tree [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:49:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:25.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:25 compute-1 nova_compute[238822]: 2025-09-30 18:49:25.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:25 compute-1 ceph-mon[75484]: pgmap v2132: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:49:25 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/758503153' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:49:25 compute-1 nova_compute[238822]: 2025-09-30 18:49:25.780 2 DEBUG nova.scheduler.client.report [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:49:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:26.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:26 compute-1 nova_compute[238822]: 2025-09-30 18:49:26.292 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.603s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:49:26 compute-1 nova_compute[238822]: 2025-09-30 18:49:26.293 2 DEBUG nova.compute.manager [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:49:26 compute-1 nova_compute[238822]: 2025-09-30 18:49:26.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:26 compute-1 nova_compute[238822]: 2025-09-30 18:49:26.809 2 DEBUG nova.compute.manager [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:49:26 compute-1 nova_compute[238822]: 2025-09-30 18:49:26.809 2 DEBUG nova.network.neutron [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:49:26 compute-1 nova_compute[238822]: 2025-09-30 18:49:26.810 2 WARNING neutronclient.v2_0.client [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:49:26 compute-1 nova_compute[238822]: 2025-09-30 18:49:26.811 2 WARNING neutronclient.v2_0.client [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:49:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:27 compute-1 nova_compute[238822]: 2025-09-30 18:49:27.320 2 INFO nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:49:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:27.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:27 compute-1 podman[303842]: 2025-09-30 18:49:27.572439056 +0000 UTC m=+0.105150633 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid)
Sep 30 18:49:27 compute-1 podman[303843]: 2025-09-30 18:49:27.591689013 +0000 UTC m=+0.118192653 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, config_id=edpm, version=9.6)
Sep 30 18:49:27 compute-1 podman[303844]: 2025-09-30 18:49:27.608088843 +0000 UTC m=+0.129573299 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:49:27 compute-1 ceph-mon[75484]: pgmap v2133: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Sep 30 18:49:27 compute-1 nova_compute[238822]: 2025-09-30 18:49:27.784 2 DEBUG nova.network.neutron [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Successfully created port: 7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:49:27 compute-1 nova_compute[238822]: 2025-09-30 18:49:27.829 2 DEBUG nova.compute.manager [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:49:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:28.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:28 compute-1 nova_compute[238822]: 2025-09-30 18:49:28.778 2 DEBUG nova.network.neutron [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Successfully updated port: 7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:49:28 compute-1 nova_compute[238822]: 2025-09-30 18:49:28.851 2 DEBUG nova.compute.manager [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:49:28 compute-1 nova_compute[238822]: 2025-09-30 18:49:28.853 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:49:28 compute-1 nova_compute[238822]: 2025-09-30 18:49:28.853 2 INFO nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Creating image(s)
Sep 30 18:49:28 compute-1 nova_compute[238822]: 2025-09-30 18:49:28.894 2 DEBUG nova.storage.rbd_utils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image 838798ef-0563-40a9-af50-22403624c69e_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:49:28 compute-1 nova_compute[238822]: 2025-09-30 18:49:28.939 2 DEBUG nova.storage.rbd_utils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image 838798ef-0563-40a9-af50-22403624c69e_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:49:28 compute-1 nova_compute[238822]: 2025-09-30 18:49:28.982 2 DEBUG nova.storage.rbd_utils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image 838798ef-0563-40a9-af50-22403624c69e_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:49:28 compute-1 nova_compute[238822]: 2025-09-30 18:49:28.987 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.006 2 DEBUG nova.compute.manager [req-6e20a948-bba6-44e1-866d-e5c08fbd017f req-dbeca8d8-ba20-4e46-b68c-98fc23afc867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Received event network-changed-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.007 2 DEBUG nova.compute.manager [req-6e20a948-bba6-44e1-866d-e5c08fbd017f req-dbeca8d8-ba20-4e46-b68c-98fc23afc867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Refreshing instance network info cache due to event network-changed-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.008 2 DEBUG oslo_concurrency.lockutils [req-6e20a948-bba6-44e1-866d-e5c08fbd017f req-dbeca8d8-ba20-4e46-b68c-98fc23afc867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-838798ef-0563-40a9-af50-22403624c69e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.008 2 DEBUG oslo_concurrency.lockutils [req-6e20a948-bba6-44e1-866d-e5c08fbd017f req-dbeca8d8-ba20-4e46-b68c-98fc23afc867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-838798ef-0563-40a9-af50-22403624c69e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.008 2 DEBUG nova.network.neutron [req-6e20a948-bba6-44e1-866d-e5c08fbd017f req-dbeca8d8-ba20-4e46-b68c-98fc23afc867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Refreshing network info cache for port 7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.084 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.085 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.086 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.086 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.128 2 DEBUG nova.storage.rbd_utils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image 838798ef-0563-40a9-af50-22403624c69e_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.134 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 838798ef-0563-40a9-af50-22403624c69e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:49:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.290 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "refresh_cache-838798ef-0563-40a9-af50-22403624c69e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:49:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:29.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.464 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 838798ef-0563-40a9-af50-22403624c69e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.331s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.563 2 WARNING neutronclient.v2_0.client [req-6e20a948-bba6-44e1-866d-e5c08fbd017f req-dbeca8d8-ba20-4e46-b68c-98fc23afc867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.576 2 DEBUG nova.storage.rbd_utils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] resizing rbd image 838798ef-0563-40a9-af50-22403624c69e_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:49:29 compute-1 ceph-mon[75484]: pgmap v2134: 353 pgs: 353 active+clean; 121 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 138 op/s
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.728 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.729 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Ensure instance console log exists: /var/lib/nova/instances/838798ef-0563-40a9-af50-22403624c69e/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.729 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.730 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:49:29 compute-1 nova_compute[238822]: 2025-09-30 18:49:29.730 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:49:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:30.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:30 compute-1 nova_compute[238822]: 2025-09-30 18:49:30.107 2 DEBUG nova.network.neutron [req-6e20a948-bba6-44e1-866d-e5c08fbd017f req-dbeca8d8-ba20-4e46-b68c-98fc23afc867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:49:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:30 compute-1 nova_compute[238822]: 2025-09-30 18:49:30.274 2 DEBUG nova.network.neutron [req-6e20a948-bba6-44e1-866d-e5c08fbd017f req-dbeca8d8-ba20-4e46-b68c-98fc23afc867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:49:30 compute-1 nova_compute[238822]: 2025-09-30 18:49:30.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:30 compute-1 nova_compute[238822]: 2025-09-30 18:49:30.783 2 DEBUG oslo_concurrency.lockutils [req-6e20a948-bba6-44e1-866d-e5c08fbd017f req-dbeca8d8-ba20-4e46-b68c-98fc23afc867 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-838798ef-0563-40a9-af50-22403624c69e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:49:30 compute-1 nova_compute[238822]: 2025-09-30 18:49:30.784 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquired lock "refresh_cache-838798ef-0563-40a9-af50-22403624c69e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:49:30 compute-1 nova_compute[238822]: 2025-09-30 18:49:30.785 2 DEBUG nova.network.neutron [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:49:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:31.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:31 compute-1 nova_compute[238822]: 2025-09-30 18:49:31.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:31 compute-1 nova_compute[238822]: 2025-09-30 18:49:31.595 2 DEBUG nova.network.neutron [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:49:31 compute-1 ceph-mon[75484]: pgmap v2135: 353 pgs: 353 active+clean; 121 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:49:31 compute-1 nova_compute[238822]: 2025-09-30 18:49:31.835 2 WARNING neutronclient.v2_0.client [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:49:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:32.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.063 2 DEBUG nova.network.neutron [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Updating instance_info_cache with network_info: [{"id": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "address": "fa:16:3e:de:4f:65", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f72ad94-f1", "ovs_interfaceid": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:49:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.576 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Releasing lock "refresh_cache-838798ef-0563-40a9-af50-22403624c69e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.577 2 DEBUG nova.compute.manager [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Instance network_info: |[{"id": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "address": "fa:16:3e:de:4f:65", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f72ad94-f1", "ovs_interfaceid": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.581 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Start _get_guest_xml network_info=[{"id": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "address": "fa:16:3e:de:4f:65", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f72ad94-f1", "ovs_interfaceid": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.588 2 WARNING nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.590 2 DEBUG nova.virt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-1073933630', uuid='838798ef-0563-40a9-af50-22403624c69e'), owner=OwnerMeta(userid='f560266d133f4f1ba4a908e3cdcfa59d', username='tempest-TestExecuteZoneMigrationStrategy-613400940-project-admin', projectid='3359c464e0344756a39ce5c7088b9eba', projectname='tempest-TestExecuteZoneMigrationStrategy-613400940'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "address": "fa:16:3e:de:4f:65", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f72ad94-f1", "ovs_interfaceid": 
"7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759258172.5906136) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.597 2 DEBUG nova.virt.libvirt.host [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.598 2 DEBUG nova.virt.libvirt.host [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.602 2 DEBUG nova.virt.libvirt.host [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.603 2 DEBUG nova.virt.libvirt.host [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.603 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.604 2 DEBUG nova.virt.hardware [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.604 2 DEBUG nova.virt.hardware [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.605 2 DEBUG nova.virt.hardware [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.605 2 DEBUG nova.virt.hardware [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.606 2 DEBUG nova.virt.hardware [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.606 2 DEBUG nova.virt.hardware [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.607 2 DEBUG nova.virt.hardware [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.607 2 DEBUG nova.virt.hardware [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.608 2 DEBUG nova.virt.hardware [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.608 2 DEBUG nova.virt.hardware [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.608 2 DEBUG nova.virt.hardware [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:49:32 compute-1 nova_compute[238822]: 2025-09-30 18:49:32.614 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:49:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:49:33 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3376486055' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:49:33 compute-1 nova_compute[238822]: 2025-09-30 18:49:33.120 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:49:33 compute-1 nova_compute[238822]: 2025-09-30 18:49:33.162 2 DEBUG nova.storage.rbd_utils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image 838798ef-0563-40a9-af50-22403624c69e_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:49:33 compute-1 nova_compute[238822]: 2025-09-30 18:49:33.169 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:49:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:33.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:49:33 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2880843357' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:49:33 compute-1 nova_compute[238822]: 2025-09-30 18:49:33.655 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:49:33 compute-1 nova_compute[238822]: 2025-09-30 18:49:33.658 2 DEBUG nova.virt.libvirt.vif [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:49:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1073933630',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1073933630',id=39,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3359c464e0344756a39ce5c7088b9eba',ramdisk_id='',reservation_id='r-a039zebv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-613400940',owner_user_name='tempest-TestExecuteZo
neMigrationStrategy-613400940-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:49:27Z,user_data=None,user_id='f560266d133f4f1ba4a908e3cdcfa59d',uuid=838798ef-0563-40a9-af50-22403624c69e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "address": "fa:16:3e:de:4f:65", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f72ad94-f1", "ovs_interfaceid": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:49:33 compute-1 nova_compute[238822]: 2025-09-30 18:49:33.658 2 DEBUG nova.network.os_vif_util [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converting VIF {"id": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "address": "fa:16:3e:de:4f:65", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f72ad94-f1", "ovs_interfaceid": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:49:33 compute-1 nova_compute[238822]: 2025-09-30 18:49:33.659 2 DEBUG nova.network.os_vif_util [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:4f:65,bridge_name='br-int',has_traffic_filtering=True,id=7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f72ad94-f1') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:49:33 compute-1 nova_compute[238822]: 2025-09-30 18:49:33.660 2 DEBUG nova.objects.instance [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lazy-loading 'pci_devices' on Instance uuid 838798ef-0563-40a9-af50-22403624c69e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:49:33 compute-1 ceph-mon[75484]: pgmap v2136: 353 pgs: 353 active+clean; 121 MiB data, 442 MiB used, 40 GiB / 40 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:49:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3376486055' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:49:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2880843357' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:49:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:34.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.171 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:49:34 compute-1 nova_compute[238822]:   <uuid>838798ef-0563-40a9-af50-22403624c69e</uuid>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   <name>instance-00000027</name>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1073933630</nova:name>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:49:32</nova:creationTime>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:49:34 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:49:34 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:user uuid="f560266d133f4f1ba4a908e3cdcfa59d">tempest-TestExecuteZoneMigrationStrategy-613400940-project-admin</nova:user>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:project uuid="3359c464e0344756a39ce5c7088b9eba">tempest-TestExecuteZoneMigrationStrategy-613400940</nova:project>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <nova:port uuid="7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12">
Sep 30 18:49:34 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <system>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <entry name="serial">838798ef-0563-40a9-af50-22403624c69e</entry>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <entry name="uuid">838798ef-0563-40a9-af50-22403624c69e</entry>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     </system>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   <os>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   </os>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   <features>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   </features>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/838798ef-0563-40a9-af50-22403624c69e_disk">
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       </source>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/838798ef-0563-40a9-af50-22403624c69e_disk.config">
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       </source>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:49:34 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:de:4f:65"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <target dev="tap7f72ad94-f1"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/838798ef-0563-40a9-af50-22403624c69e/console.log" append="off"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <video>
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     </video>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:49:34 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:49:34 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:49:34 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:49:34 compute-1 nova_compute[238822]: </domain>
Sep 30 18:49:34 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.171 2 DEBUG nova.compute.manager [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Preparing to wait for external event network-vif-plugged-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.171 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "838798ef-0563-40a9-af50-22403624c69e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.172 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.172 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.172 2 DEBUG nova.virt.libvirt.vif [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:49:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1073933630',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1073933630',id=39,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3359c464e0344756a39ce5c7088b9eba',ramdisk_id='',reservation_id='r-a039zebv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-613400940',owner_user_name='tempest-Tes
tExecuteZoneMigrationStrategy-613400940-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:49:27Z,user_data=None,user_id='f560266d133f4f1ba4a908e3cdcfa59d',uuid=838798ef-0563-40a9-af50-22403624c69e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "address": "fa:16:3e:de:4f:65", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f72ad94-f1", "ovs_interfaceid": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.173 2 DEBUG nova.network.os_vif_util [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converting VIF {"id": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "address": "fa:16:3e:de:4f:65", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f72ad94-f1", "ovs_interfaceid": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.173 2 DEBUG nova.network.os_vif_util [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:4f:65,bridge_name='br-int',has_traffic_filtering=True,id=7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f72ad94-f1') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.173 2 DEBUG os_vif [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:4f:65,bridge_name='br-int',has_traffic_filtering=True,id=7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f72ad94-f1') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.174 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.175 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.175 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '0a5f2f67-71d9-5821-bbe2-3cb57476b6f1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.184 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f72ad94-f1, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.184 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap7f72ad94-f1, col_values=(('qos', UUID('001c7758-181e-45ae-ada6-cf83d1acd8a0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.185 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap7f72ad94-f1, col_values=(('external_ids', {'iface-id': '7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:4f:65', 'vm-uuid': '838798ef-0563-40a9-af50-22403624c69e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:49:34 compute-1 NetworkManager[45549]: <info>  [1759258174.1890] manager: (tap7f72ad94-f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:34 compute-1 nova_compute[238822]: 2025-09-30 18:49:34.196 2 INFO os_vif [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:4f:65,bridge_name='br-int',has_traffic_filtering=True,id=7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f72ad94-f1')
Sep 30 18:49:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:49:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:35.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:35 compute-1 nova_compute[238822]: 2025-09-30 18:49:35.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:35 compute-1 podman[249638]: time="2025-09-30T18:49:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:49:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:49:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:49:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:49:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8366 "" "Go-http-client/1.1"
Sep 30 18:49:35 compute-1 ceph-mon[75484]: pgmap v2137: 353 pgs: 353 active+clean; 167 MiB data, 463 MiB used, 40 GiB / 40 GiB avail; 344 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:49:35 compute-1 nova_compute[238822]: 2025-09-30 18:49:35.750 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:49:35 compute-1 nova_compute[238822]: 2025-09-30 18:49:35.750 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:49:35 compute-1 nova_compute[238822]: 2025-09-30 18:49:35.751 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] No VIF found with MAC fa:16:3e:de:4f:65, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:49:35 compute-1 nova_compute[238822]: 2025-09-30 18:49:35.751 2 INFO nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Using config drive
Sep 30 18:49:35 compute-1 nova_compute[238822]: 2025-09-30 18:49:35.794 2 DEBUG nova.storage.rbd_utils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image 838798ef-0563-40a9-af50-22403624c69e_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:49:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:36.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:36 compute-1 nova_compute[238822]: 2025-09-30 18:49:36.320 2 WARNING neutronclient.v2_0.client [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:49:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3251208783' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:49:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3251208783' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:49:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:37 compute-1 nova_compute[238822]: 2025-09-30 18:49:37.223 2 INFO nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Creating config drive at /var/lib/nova/instances/838798ef-0563-40a9-af50-22403624c69e/disk.config
Sep 30 18:49:37 compute-1 nova_compute[238822]: 2025-09-30 18:49:37.234 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/838798ef-0563-40a9-af50-22403624c69e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpz_4t5tds execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:49:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:37.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:37 compute-1 nova_compute[238822]: 2025-09-30 18:49:37.388 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/838798ef-0563-40a9-af50-22403624c69e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpz_4t5tds" returned: 0 in 0.154s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:49:37 compute-1 nova_compute[238822]: 2025-09-30 18:49:37.434 2 DEBUG nova.storage.rbd_utils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] rbd image 838798ef-0563-40a9-af50-22403624c69e_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:49:37 compute-1 nova_compute[238822]: 2025-09-30 18:49:37.443 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/838798ef-0563-40a9-af50-22403624c69e/disk.config 838798ef-0563-40a9-af50-22403624c69e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:49:37 compute-1 nova_compute[238822]: 2025-09-30 18:49:37.663 2 DEBUG oslo_concurrency.processutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/838798ef-0563-40a9-af50-22403624c69e/disk.config 838798ef-0563-40a9-af50-22403624c69e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.220s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:49:37 compute-1 nova_compute[238822]: 2025-09-30 18:49:37.668 2 INFO nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Deleting local config drive /var/lib/nova/instances/838798ef-0563-40a9-af50-22403624c69e/disk.config because it was imported into RBD.
Sep 30 18:49:37 compute-1 ceph-mon[75484]: pgmap v2138: 353 pgs: 353 active+clean; 167 MiB data, 463 MiB used, 40 GiB / 40 GiB avail; 344 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Sep 30 18:49:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:49:37 compute-1 kernel: tap7f72ad94-f1: entered promiscuous mode
Sep 30 18:49:37 compute-1 NetworkManager[45549]: <info>  [1759258177.7720] manager: (tap7f72ad94-f1): new Tun device (/org/freedesktop/NetworkManager/Devices/110)
Sep 30 18:49:37 compute-1 nova_compute[238822]: 2025-09-30 18:49:37.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:37 compute-1 ovn_controller[135204]: 2025-09-30T18:49:37Z|00298|binding|INFO|Claiming lport 7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 for this chassis.
Sep 30 18:49:37 compute-1 ovn_controller[135204]: 2025-09-30T18:49:37Z|00299|binding|INFO|7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12: Claiming fa:16:3e:de:4f:65 10.100.0.5
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.797 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:4f:65 10.100.0.5'], port_security=['fa:16:3e:de:4f:65 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '838798ef-0563-40a9-af50-22403624c69e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4658d55-a8f9-48f1-846d-61df3d830821', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3359c464e0344756a39ce5c7088b9eba', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3a57c776-d79c-4096-859e-411dcf78cfa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0884332-fe68-47c8-9c8c-5c6a7c53f7f5, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.799 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 in datapath f4658d55-a8f9-48f1-846d-61df3d830821 bound to our chassis
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.801 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4658d55-a8f9-48f1-846d-61df3d830821
Sep 30 18:49:37 compute-1 ovn_controller[135204]: 2025-09-30T18:49:37Z|00300|binding|INFO|Setting lport 7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 ovn-installed in OVS
Sep 30 18:49:37 compute-1 ovn_controller[135204]: 2025-09-30T18:49:37Z|00301|binding|INFO|Setting lport 7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 up in Southbound
Sep 30 18:49:37 compute-1 nova_compute[238822]: 2025-09-30 18:49:37.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:37 compute-1 nova_compute[238822]: 2025-09-30 18:49:37.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.827 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5ac39455-de7c-4c78-8a9e-fa4bd22ec495]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.828 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf4658d55-a1 in ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.832 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf4658d55-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.832 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8158d0c3-d62e-498a-bd4f-68cd8e8c2838]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.834 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a1089f6e-f750-4920-83ee-0fde7ce1cc10]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:37 compute-1 systemd-machined[195911]: New machine qemu-28-instance-00000027.
Sep 30 18:49:37 compute-1 systemd[1]: Started Virtual Machine qemu-28-instance-00000027.
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.857 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b51331-3a1d-49f7-a2d4-f298d751cbc9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:37 compute-1 systemd-udevd[304218]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.878 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[448dadcc-c16b-44af-8697-5e7b40b670e7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:37 compute-1 NetworkManager[45549]: <info>  [1759258177.8975] device (tap7f72ad94-f1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:49:37 compute-1 NetworkManager[45549]: <info>  [1759258177.8996] device (tap7f72ad94-f1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.930 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[75fc00f9-55ce-413a-8f72-d4b68558349f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.937 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d4d2ee85-b8e8-458c-b588-dea89d807664]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:37 compute-1 NetworkManager[45549]: <info>  [1759258177.9401] manager: (tapf4658d55-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/111)
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.995 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[60a263ee-a241-442c-89ab-25ae77003071]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:37.999 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[eea5d367-5797-4b86-8dd2-632fee9ddb38]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:38 compute-1 NetworkManager[45549]: <info>  [1759258178.0273] device (tapf4658d55-a0): carrier: link connected
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.042 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[4b1d754b-7803-4cc9-b020-94523f1a0651]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:38.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.067 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c2416984-3353-4717-b9c9-57f14a876a9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4658d55-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:a8:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1601960, 'reachable_time': 16822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304248, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.089 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[269da78d-6d0f-4f42-b6ee-0aa171392500]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:a899'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1601960, 'tstamp': 1601960}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304249, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.119 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8fd0482d-733f-41d0-8fe3-cb89ed87dd0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4658d55-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:a8:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1601960, 'reachable_time': 16822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 304250, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.158 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c98b8932-de4b-40a2-9601-dad52a090ae1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.247 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b0fd3d5a-0f39-46be-ab21-f2d0792ba975]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.249 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4658d55-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.249 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.250 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4658d55-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:49:38 compute-1 nova_compute[238822]: 2025-09-30 18:49:38.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:38 compute-1 kernel: tapf4658d55-a0: entered promiscuous mode
Sep 30 18:49:38 compute-1 NetworkManager[45549]: <info>  [1759258178.2536] manager: (tapf4658d55-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Sep 30 18:49:38 compute-1 nova_compute[238822]: 2025-09-30 18:49:38.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.256 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4658d55-a0, col_values=(('external_ids', {'iface-id': '862fbe9e-132a-4b8a-83f6-7b020c6192ad'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:49:38 compute-1 nova_compute[238822]: 2025-09-30 18:49:38.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:38 compute-1 ovn_controller[135204]: 2025-09-30T18:49:38Z|00302|binding|INFO|Releasing lport 862fbe9e-132a-4b8a-83f6-7b020c6192ad from this chassis (sb_readonly=0)
Sep 30 18:49:38 compute-1 nova_compute[238822]: 2025-09-30 18:49:38.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.284 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3114d922-66d9-427b-af91-c7978d70676a]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.286 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.286 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.286 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f4658d55-a8f9-48f1-846d-61df3d830821 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.286 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.287 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[09b26728-d4df-45e2-a185-f9acd6532295]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.288 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.289 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[77bad720-71dc-4f9e-8344-05cb389a194d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.291 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-f4658d55-a8f9-48f1-846d-61df3d830821
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID f4658d55-a8f9-48f1-846d-61df3d830821
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:49:38 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:38.292 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'env', 'PROCESS_TAG=haproxy-f4658d55-a8f9-48f1-846d-61df3d830821', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f4658d55-a8f9-48f1-846d-61df3d830821.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:49:38 compute-1 nova_compute[238822]: 2025-09-30 18:49:38.293 2 DEBUG nova.compute.manager [req-3bfb5a54-a5e8-458e-8ff2-dcf9a4f84352 req-75101b8c-34fc-4681-b122-09171146df2b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Received event network-vif-plugged-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:49:38 compute-1 nova_compute[238822]: 2025-09-30 18:49:38.293 2 DEBUG oslo_concurrency.lockutils [req-3bfb5a54-a5e8-458e-8ff2-dcf9a4f84352 req-75101b8c-34fc-4681-b122-09171146df2b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "838798ef-0563-40a9-af50-22403624c69e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:49:38 compute-1 nova_compute[238822]: 2025-09-30 18:49:38.293 2 DEBUG oslo_concurrency.lockutils [req-3bfb5a54-a5e8-458e-8ff2-dcf9a4f84352 req-75101b8c-34fc-4681-b122-09171146df2b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:49:38 compute-1 nova_compute[238822]: 2025-09-30 18:49:38.293 2 DEBUG oslo_concurrency.lockutils [req-3bfb5a54-a5e8-458e-8ff2-dcf9a4f84352 req-75101b8c-34fc-4681-b122-09171146df2b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:49:38 compute-1 nova_compute[238822]: 2025-09-30 18:49:38.294 2 DEBUG nova.compute.manager [req-3bfb5a54-a5e8-458e-8ff2-dcf9a4f84352 req-75101b8c-34fc-4681-b122-09171146df2b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Processing event network-vif-plugged-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:49:38 compute-1 podman[304283]: 2025-09-30 18:49:38.760549526 +0000 UTC m=+0.071318795 container create 21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 18:49:38 compute-1 systemd[1]: Started libpod-conmon-21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53.scope.
Sep 30 18:49:38 compute-1 podman[304283]: 2025-09-30 18:49:38.713663958 +0000 UTC m=+0.024433217 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:49:38 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:49:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/826e994b845d6dcf2d190a19fd3161ef15c374b8ba9a892cfaf7ef4004bfe95d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:49:38 compute-1 podman[304283]: 2025-09-30 18:49:38.851867667 +0000 UTC m=+0.162636956 container init 21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 18:49:38 compute-1 podman[304283]: 2025-09-30 18:49:38.859769449 +0000 UTC m=+0.170538718 container start 21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930)
Sep 30 18:49:38 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[304331]: [NOTICE]   (304343) : New worker (304345) forked
Sep 30 18:49:38 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[304331]: [NOTICE]   (304343) : Loading success.
Sep 30 18:49:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:39 compute-1 nova_compute[238822]: 2025-09-30 18:49:39.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:49:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:39.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:39 compute-1 nova_compute[238822]: 2025-09-30 18:49:39.528 2 DEBUG nova.compute.manager [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:49:39 compute-1 nova_compute[238822]: 2025-09-30 18:49:39.534 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:49:39 compute-1 nova_compute[238822]: 2025-09-30 18:49:39.539 2 INFO nova.virt.libvirt.driver [-] [instance: 838798ef-0563-40a9-af50-22403624c69e] Instance spawned successfully.
Sep 30 18:49:39 compute-1 nova_compute[238822]: 2025-09-30 18:49:39.541 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:49:39 compute-1 ceph-mon[75484]: pgmap v2139: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 347 KiB/s rd, 3.9 MiB/s wr, 97 op/s
Sep 30 18:49:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:40.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.065 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.066 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.067 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.068 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.068 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.069 2 DEBUG nova.virt.libvirt.driver [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:49:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.389 2 DEBUG nova.compute.manager [req-2e63d108-764a-42a2-8d77-7433c71de42f req-3c5ddc05-755b-4392-9a42-351d6c17c1ed 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Received event network-vif-plugged-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.391 2 DEBUG oslo_concurrency.lockutils [req-2e63d108-764a-42a2-8d77-7433c71de42f req-3c5ddc05-755b-4392-9a42-351d6c17c1ed 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "838798ef-0563-40a9-af50-22403624c69e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.394 2 DEBUG oslo_concurrency.lockutils [req-2e63d108-764a-42a2-8d77-7433c71de42f req-3c5ddc05-755b-4392-9a42-351d6c17c1ed 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.395 2 DEBUG oslo_concurrency.lockutils [req-2e63d108-764a-42a2-8d77-7433c71de42f req-3c5ddc05-755b-4392-9a42-351d6c17c1ed 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.396 2 DEBUG nova.compute.manager [req-2e63d108-764a-42a2-8d77-7433c71de42f req-3c5ddc05-755b-4392-9a42-351d6c17c1ed 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] No waiting events found dispatching network-vif-plugged-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.396 2 WARNING nova.compute.manager [req-2e63d108-764a-42a2-8d77-7433c71de42f req-3c5ddc05-755b-4392-9a42-351d6c17c1ed 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Received unexpected event network-vif-plugged-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 for instance with vm_state building and task_state spawning.
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.583 2 INFO nova.compute.manager [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Took 11.73 seconds to spawn the instance on the hypervisor.
Sep 30 18:49:40 compute-1 nova_compute[238822]: 2025-09-30 18:49:40.584 2 DEBUG nova.compute.manager [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:49:40 compute-1 ceph-mon[75484]: pgmap v2140: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Sep 30 18:49:41 compute-1 nova_compute[238822]: 2025-09-30 18:49:41.133 2 INFO nova.compute.manager [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Took 17.50 seconds to build instance.
Sep 30 18:49:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:41.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:41 compute-1 nova_compute[238822]: 2025-09-30 18:49:41.641 2 DEBUG oslo_concurrency.lockutils [None req-8a75c4a0-2729-4154-ad17-b8edf180c4b8 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.029s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:49:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:42.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:42 compute-1 sudo[304358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:49:42 compute-1 sudo[304358]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:49:42 compute-1 sudo[304358]: pam_unix(sudo:session): session closed for user root
Sep 30 18:49:42 compute-1 sudo[304384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Sep 30 18:49:42 compute-1 sudo[304384]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:49:42 compute-1 sudo[304409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:49:42 compute-1 sudo[304409]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:49:42 compute-1 sudo[304409]: pam_unix(sudo:session): session closed for user root
Sep 30 18:49:42 compute-1 sudo[304384]: pam_unix(sudo:session): session closed for user root
Sep 30 18:49:42 compute-1 sudo[304453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:49:42 compute-1 sudo[304453]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:49:42 compute-1 sudo[304453]: pam_unix(sudo:session): session closed for user root
Sep 30 18:49:43 compute-1 sudo[304479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:49:43 compute-1 sudo[304479]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:49:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:43.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:43 compute-1 ceph-mon[75484]: pgmap v2141: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Sep 30 18:49:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:49:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:49:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:49:43 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:49:43 compute-1 sudo[304479]: pam_unix(sudo:session): session closed for user root
Sep 30 18:49:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:49:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:44.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:49:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:44 compute-1 nova_compute[238822]: 2025-09-30 18:49:44.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:49:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:49:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:49:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:49:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:49:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:49:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:49:44 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:49:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:45.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:45 compute-1 nova_compute[238822]: 2025-09-30 18:49:45.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:45 compute-1 ceph-mon[75484]: pgmap v2142: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 1.9 MiB/s wr, 109 op/s
Sep 30 18:49:45 compute-1 sshd-session[304530]: Invalid user leo from 192.210.160.141 port 56838
Sep 30 18:49:45 compute-1 sshd-session[304530]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:49:45 compute-1 sshd-session[304530]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:49:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:46.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:47.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:47 compute-1 sshd-session[304530]: Failed password for invalid user leo from 192.210.160.141 port 56838 ssh2
Sep 30 18:49:47 compute-1 ceph-mon[75484]: pgmap v2143: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 28 KiB/s wr, 80 op/s
Sep 30 18:49:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:48.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:48 compute-1 podman[304544]: 2025-09-30 18:49:48.563101976 +0000 UTC m=+0.094277621 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:49:48 compute-1 podman[304543]: 2025-09-30 18:49:48.632412727 +0000 UTC m=+0.172695467 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:49:48 compute-1 sudo[304591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:49:48 compute-1 sudo[304591]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:49:48 compute-1 sudo[304591]: pam_unix(sudo:session): session closed for user root
Sep 30 18:49:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:49 compute-1 nova_compute[238822]: 2025-09-30 18:49:49.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:49:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:49.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:49 compute-1 sshd-session[304530]: Connection closed by invalid user leo 192.210.160.141 port 56838 [preauth]
Sep 30 18:49:49 compute-1 openstack_network_exporter[251957]: ERROR   18:49:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:49:49 compute-1 openstack_network_exporter[251957]: ERROR   18:49:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:49:49 compute-1 openstack_network_exporter[251957]: ERROR   18:49:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:49:49 compute-1 openstack_network_exporter[251957]: ERROR   18:49:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:49:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:49:49 compute-1 openstack_network_exporter[251957]: ERROR   18:49:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:49:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:49:49 compute-1 ceph-mon[75484]: pgmap v2144: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 2.1 MiB/s rd, 28 KiB/s wr, 80 op/s
Sep 30 18:49:49 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:49:49 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:49:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:50.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:50 compute-1 nova_compute[238822]: 2025-09-30 18:49:50.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:51.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:51 compute-1 ceph-mon[75484]: pgmap v2145: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 2.0 MiB/s rd, 1.1 KiB/s wr, 74 op/s
Sep 30 18:49:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:52.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:52 compute-1 nova_compute[238822]: 2025-09-30 18:49:52.357 2 DEBUG nova.virt.libvirt.driver [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Creating tmpfile /var/lib/nova/instances/tmp_10ptqj0 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 18:49:52 compute-1 nova_compute[238822]: 2025-09-30 18:49:52.358 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:49:52 compute-1 nova_compute[238822]: 2025-09-30 18:49:52.436 2 DEBUG nova.compute.manager [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_10ptqj0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 18:49:52 compute-1 podman[304621]: 2025-09-30 18:49:52.548278168 +0000 UTC m=+0.084896339 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 18:49:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:49:52 compute-1 ovn_controller[135204]: 2025-09-30T18:49:52Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:de:4f:65 10.100.0.5
Sep 30 18:49:52 compute-1 ovn_controller[135204]: 2025-09-30T18:49:52Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:de:4f:65 10.100.0.5
Sep 30 18:49:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:53.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:53 compute-1 ceph-mon[75484]: pgmap v2146: 353 pgs: 353 active+clean; 167 MiB data, 464 MiB used, 40 GiB / 40 GiB avail; 2.0 MiB/s rd, 1.1 KiB/s wr, 74 op/s
Sep 30 18:49:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:49:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:54.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:49:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:54 compute-1 nova_compute[238822]: 2025-09-30 18:49:54.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:54.433 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:54.434 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:49:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:49:54.434 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:49:54 compute-1 nova_compute[238822]: 2025-09-30 18:49:54.496 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:49:54 compute-1 ceph-mon[75484]: pgmap v2147: 353 pgs: 353 active+clean; 200 MiB data, 488 MiB used, 40 GiB / 40 GiB avail; 2.3 MiB/s rd, 2.3 MiB/s wr, 137 op/s
Sep 30 18:49:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:55.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:55 compute-1 nova_compute[238822]: 2025-09-30 18:49:55.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:49:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:56.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:49:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:56 compute-1 ceph-mon[75484]: pgmap v2148: 353 pgs: 353 active+clean; 200 MiB data, 488 MiB used, 40 GiB / 40 GiB avail; 243 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Sep 30 18:49:57 compute-1 nova_compute[238822]: 2025-09-30 18:49:57.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:49:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:49:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:57.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:49:57 compute-1 nova_compute[238822]: 2025-09-30 18:49:57.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:49:57 compute-1 nova_compute[238822]: 2025-09-30 18:49:57.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:49:57 compute-1 nova_compute[238822]: 2025-09-30 18:49:57.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:49:57 compute-1 nova_compute[238822]: 2025-09-30 18:49:57.574 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:49:57 compute-1 nova_compute[238822]: 2025-09-30 18:49:57.575 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:49:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3530658459' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:49:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3530658459' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:49:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:49:58 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4283978356' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:49:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:49:58.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:58 compute-1 nova_compute[238822]: 2025-09-30 18:49:58.087 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:49:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:58 compute-1 podman[304669]: 2025-09-30 18:49:58.566258409 +0000 UTC m=+0.100799056 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=iscsid, container_name=iscsid)
Sep 30 18:49:58 compute-1 podman[304670]: 2025-09-30 18:49:58.571768897 +0000 UTC m=+0.101307980 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 18:49:58 compute-1 podman[304671]: 2025-09-30 18:49:58.600045376 +0000 UTC m=+0.124499163 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 18:49:58 compute-1 nova_compute[238822]: 2025-09-30 18:49:58.835 2 DEBUG nova.compute.manager [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_10ptqj0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='67db1f19-3436-4e1e-bf63-266846e1380d',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 18:49:58 compute-1 ceph-mon[75484]: pgmap v2149: 353 pgs: 353 active+clean; 200 MiB data, 488 MiB used, 40 GiB / 40 GiB avail; 243 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Sep 30 18:49:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4283978356' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:49:59 compute-1 nova_compute[238822]: 2025-09-30 18:49:59.137 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000027 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:49:59 compute-1 nova_compute[238822]: 2025-09-30 18:49:59.138 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000027 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:49:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:49:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:49:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:49:59 compute-1 nova_compute[238822]: 2025-09-30 18:49:59.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:49:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:49:59 compute-1 nova_compute[238822]: 2025-09-30 18:49:59.390 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:49:59 compute-1 nova_compute[238822]: 2025-09-30 18:49:59.392 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:49:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:49:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:49:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:49:59.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:49:59 compute-1 nova_compute[238822]: 2025-09-30 18:49:59.430 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:49:59 compute-1 nova_compute[238822]: 2025-09-30 18:49:59.431 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4430MB free_disk=39.90130615234375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:49:59 compute-1 nova_compute[238822]: 2025-09-30 18:49:59.431 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:49:59 compute-1 nova_compute[238822]: 2025-09-30 18:49:59.432 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:49:59 compute-1 nova_compute[238822]: 2025-09-30 18:49:59.853 2 DEBUG oslo_concurrency.lockutils [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-67db1f19-3436-4e1e-bf63-266846e1380d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:49:59 compute-1 nova_compute[238822]: 2025-09-30 18:49:59.854 2 DEBUG oslo_concurrency.lockutils [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-67db1f19-3436-4e1e-bf63-266846e1380d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:49:59 compute-1 nova_compute[238822]: 2025-09-30 18:49:59.855 2 DEBUG nova.network.neutron [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:49:59 compute-1 ceph-mon[75484]: pgmap v2150: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 244 KiB/s rd, 2.1 MiB/s wr, 60 op/s
Sep 30 18:50:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:00.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:00 compute-1 nova_compute[238822]: 2025-09-30 18:50:00.363 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:00 compute-1 nova_compute[238822]: 2025-09-30 18:50:00.454 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Migration for instance 67db1f19-3436-4e1e-bf63-266846e1380d refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 18:50:00 compute-1 nova_compute[238822]: 2025-09-30 18:50:00.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:00 compute-1 nova_compute[238822]: 2025-09-30 18:50:00.687 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:00 compute-1 nova_compute[238822]: 2025-09-30 18:50:00.850 2 DEBUG nova.network.neutron [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Updating instance_info_cache with network_info: [{"id": "2dab1b31-affc-4fc3-9d5e-698d2cd44d6e", "address": "fa:16:3e:2d:d1:75", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dab1b31-af", "ovs_interfaceid": "2dab1b31-affc-4fc3-9d5e-698d2cd44d6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:50:00 compute-1 ceph-mon[75484]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 1 failed cephadm daemon(s)
Sep 30 18:50:00 compute-1 nova_compute[238822]: 2025-09-30 18:50:00.961 2 INFO nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Updating resource usage from migration fea3269e-5d28-46ac-b65f-701e0a6ebefa
Sep 30 18:50:00 compute-1 nova_compute[238822]: 2025-09-30 18:50:00.962 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Starting to track incoming migration fea3269e-5d28-46ac-b65f-701e0a6ebefa with flavor c83dc7f1-0795-47db-adcb-fb90be11684a _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 18:50:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.357 2 DEBUG oslo_concurrency.lockutils [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-67db1f19-3436-4e1e-bf63-266846e1380d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.387 2 DEBUG nova.virt.libvirt.driver [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_10ptqj0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='67db1f19-3436-4e1e-bf63-266846e1380d',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.388 2 DEBUG nova.virt.libvirt.driver [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Creating instance directory: /var/lib/nova/instances/67db1f19-3436-4e1e-bf63-266846e1380d pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.388 2 DEBUG nova.virt.libvirt.driver [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Ensure instance console log exists: /var/lib/nova/instances/67db1f19-3436-4e1e-bf63-266846e1380d/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.389 2 DEBUG nova.virt.libvirt.driver [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.391 2 DEBUG nova.virt.libvirt.vif [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T18:48:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-983808530',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-983808530',id=38,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:49:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3359c464e0344756a39ce5c7088b9eba',ramdisk_id='',reservation_id='r-wltr50iq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virt
io',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-613400940',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-613400940-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:49:16Z,user_data=None,user_id='f560266d133f4f1ba4a908e3cdcfa59d',uuid=67db1f19-3436-4e1e-bf63-266846e1380d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2dab1b31-affc-4fc3-9d5e-698d2cd44d6e", "address": "fa:16:3e:2d:d1:75", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2dab1b31-af", "ovs_interfaceid": "2dab1b31-affc-4fc3-9d5e-698d2cd44d6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.391 2 DEBUG nova.network.os_vif_util [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converting VIF {"id": "2dab1b31-affc-4fc3-9d5e-698d2cd44d6e", "address": "fa:16:3e:2d:d1:75", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2dab1b31-af", "ovs_interfaceid": "2dab1b31-affc-4fc3-9d5e-698d2cd44d6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.392 2 DEBUG nova.network.os_vif_util [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:d1:75,bridge_name='br-int',has_traffic_filtering=True,id=2dab1b31-affc-4fc3-9d5e-698d2cd44d6e,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dab1b31-af') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.393 2 DEBUG os_vif [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:d1:75,bridge_name='br-int',has_traffic_filtering=True,id=2dab1b31-affc-4fc3-9d5e-698d2cd44d6e,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dab1b31-af') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.394 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.395 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.397 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '94ffd401-e8a1-59ae-b94d-40c8e86307d3', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.412 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2dab1b31-af, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.412 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2dab1b31-af, col_values=(('qos', UUID('a3bf9ec9-4002-445f-894f-1ff0421b5e43')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.413 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2dab1b31-af, col_values=(('external_ids', {'iface-id': '2dab1b31-affc-4fc3-9d5e-698d2cd44d6e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:d1:75', 'vm-uuid': '67db1f19-3436-4e1e-bf63-266846e1380d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:01 compute-1 NetworkManager[45549]: <info>  [1759258201.4171] manager: (tap2dab1b31-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:50:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:01.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.427 2 INFO os_vif [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:d1:75,bridge_name='br-int',has_traffic_filtering=True,id=2dab1b31-affc-4fc3-9d5e-698d2cd44d6e,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dab1b31-af')
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.428 2 DEBUG nova.virt.libvirt.driver [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.428 2 DEBUG nova.compute.manager [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_10ptqj0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='67db1f19-3436-4e1e-bf63-266846e1380d',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.429 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.495 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 838798ef-0563-40a9-af50-22403624c69e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:50:01 compute-1 nova_compute[238822]: 2025-09-30 18:50:01.620 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1476484408' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:50:01 compute-1 ceph-mon[75484]: pgmap v2151: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 243 KiB/s rd, 2.1 MiB/s wr, 60 op/s
Sep 30 18:50:02 compute-1 nova_compute[238822]: 2025-09-30 18:50:02.044 2 WARNING nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 67db1f19-3436-4e1e-bf63-266846e1380d has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 18:50:02 compute-1 nova_compute[238822]: 2025-09-30 18:50:02.045 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:50:02 compute-1 nova_compute[238822]: 2025-09-30 18:50:02.045 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=39GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:49:59 up  4:27,  0 user,  load average: 0.39, 0.39, 0.48\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_3359c464e0344756a39ce5c7088b9eba': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:50:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:02.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:02 compute-1 nova_compute[238822]: 2025-09-30 18:50:02.106 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:50:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:02 compute-1 nova_compute[238822]: 2025-09-30 18:50:02.270 2 DEBUG nova.network.neutron [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Port 2dab1b31-affc-4fc3-9d5e-698d2cd44d6e updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 18:50:02 compute-1 nova_compute[238822]: 2025-09-30 18:50:02.289 2 DEBUG nova.compute.manager [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=37888,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_10ptqj0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='67db1f19-3436-4e1e-bf63-266846e1380d',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 18:50:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:50:02 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2857037570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:50:02 compute-1 sudo[304755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:50:02 compute-1 sudo[304755]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:50:02 compute-1 nova_compute[238822]: 2025-09-30 18:50:02.600 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:50:02 compute-1 sudo[304755]: pam_unix(sudo:session): session closed for user root
Sep 30 18:50:02 compute-1 nova_compute[238822]: 2025-09-30 18:50:02.609 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:50:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2857037570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:50:03 compute-1 nova_compute[238822]: 2025-09-30 18:50:03.119 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:50:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:03.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:03 compute-1 nova_compute[238822]: 2025-09-30 18:50:03.632 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:50:03 compute-1 nova_compute[238822]: 2025-09-30 18:50:03.632 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.200s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:04.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:50:04 compute-1 nova_compute[238822]: 2025-09-30 18:50:04.633 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:05 compute-1 ceph-mon[75484]: pgmap v2152: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 244 KiB/s rd, 2.1 MiB/s wr, 60 op/s
Sep 30 18:50:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3710904971' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:50:05 compute-1 nova_compute[238822]: 2025-09-30 18:50:05.145 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:05 compute-1 nova_compute[238822]: 2025-09-30 18:50:05.146 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:05 compute-1 nova_compute[238822]: 2025-09-30 18:50:05.147 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:05 compute-1 nova_compute[238822]: 2025-09-30 18:50:05.147 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:05 compute-1 nova_compute[238822]: 2025-09-30 18:50:05.147 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:50:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:05 compute-1 kernel: tap2dab1b31-af: entered promiscuous mode
Sep 30 18:50:05 compute-1 NetworkManager[45549]: <info>  [1759258205.2290] manager: (tap2dab1b31-af): new Tun device (/org/freedesktop/NetworkManager/Devices/114)
Sep 30 18:50:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:05 compute-1 ovn_controller[135204]: 2025-09-30T18:50:05Z|00303|binding|INFO|Claiming lport 2dab1b31-affc-4fc3-9d5e-698d2cd44d6e for this additional chassis.
Sep 30 18:50:05 compute-1 ovn_controller[135204]: 2025-09-30T18:50:05Z|00304|binding|INFO|2dab1b31-affc-4fc3-9d5e-698d2cd44d6e: Claiming fa:16:3e:2d:d1:75 10.100.0.6
Sep 30 18:50:05 compute-1 nova_compute[238822]: 2025-09-30 18:50:05.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.282 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:d1:75 10.100.0.6'], port_security=['fa:16:3e:2d:d1:75 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '67db1f19-3436-4e1e-bf63-266846e1380d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4658d55-a8f9-48f1-846d-61df3d830821', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3359c464e0344756a39ce5c7088b9eba', 'neutron:revision_number': '10', 'neutron:security_group_ids': '3a57c776-d79c-4096-859e-411dcf78cfa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0884332-fe68-47c8-9c8c-5c6a7c53f7f5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=2dab1b31-affc-4fc3-9d5e-698d2cd44d6e) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.283 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 2dab1b31-affc-4fc3-9d5e-698d2cd44d6e in datapath f4658d55-a8f9-48f1-846d-61df3d830821 unbound from our chassis
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.285 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4658d55-a8f9-48f1-846d-61df3d830821
Sep 30 18:50:05 compute-1 ovn_controller[135204]: 2025-09-30T18:50:05Z|00305|binding|INFO|Setting lport 2dab1b31-affc-4fc3-9d5e-698d2cd44d6e ovn-installed in OVS
Sep 30 18:50:05 compute-1 nova_compute[238822]: 2025-09-30 18:50:05.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:05 compute-1 systemd-machined[195911]: New machine qemu-29-instance-00000026.
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.309 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[768f5251-9930-44a4-8523-589f7102ce36]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:05 compute-1 systemd[1]: Started Virtual Machine qemu-29-instance-00000026.
Sep 30 18:50:05 compute-1 systemd-udevd[304801]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:50:05 compute-1 NetworkManager[45549]: <info>  [1759258205.3494] device (tap2dab1b31-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:50:05 compute-1 NetworkManager[45549]: <info>  [1759258205.3514] device (tap2dab1b31-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.354 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[a388d2a4-930b-4dba-ba42-b1cd66403c53]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.357 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[c3437144-687f-4d15-a5e8-44e36f545c5e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.395 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[0a5a6538-d4d0-465f-9c90-e1a871774ad6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.423 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0a7bf8e7-4587-40c3-a146-51a6bc6c5289]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4658d55-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:a8:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1601960, 'reachable_time': 16822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304811, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:05.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.448 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[975dfc35-6657-4529-9495-6ba212e7da84]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf4658d55-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1601977, 'tstamp': 1601977}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304812, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf4658d55-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1601982, 'tstamp': 1601982}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304812, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.450 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4658d55-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:05 compute-1 nova_compute[238822]: 2025-09-30 18:50:05.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:05 compute-1 nova_compute[238822]: 2025-09-30 18:50:05.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.455 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4658d55-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.455 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.456 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4658d55-a0, col_values=(('external_ids', {'iface-id': '862fbe9e-132a-4b8a-83f6-7b020c6192ad'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.456 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:50:05 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:05.458 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d2a7e8d3-3e77-48fd-879e-dc45bba8aa53]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f4658d55-a8f9-48f1-846d-61df3d830821\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f4658d55-a8f9-48f1-846d-61df3d830821\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:05 compute-1 nova_compute[238822]: 2025-09-30 18:50:05.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:05 compute-1 podman[249638]: time="2025-09-30T18:50:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:50:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:50:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:50:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:50:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8837 "" "Go-http-client/1.1"
Sep 30 18:50:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:06.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:06 compute-1 nova_compute[238822]: 2025-09-30 18:50:06.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:07 compute-1 ceph-mon[75484]: pgmap v2153: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:50:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:07.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:50:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:08.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:08.354 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:50:08 compute-1 nova_compute[238822]: 2025-09-30 18:50:08.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:08 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:08.356 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:50:08 compute-1 ovn_controller[135204]: 2025-09-30T18:50:08Z|00306|binding|INFO|Claiming lport 2dab1b31-affc-4fc3-9d5e-698d2cd44d6e for this chassis.
Sep 30 18:50:08 compute-1 ovn_controller[135204]: 2025-09-30T18:50:08Z|00307|binding|INFO|2dab1b31-affc-4fc3-9d5e-698d2cd44d6e: Claiming fa:16:3e:2d:d1:75 10.100.0.6
Sep 30 18:50:08 compute-1 ovn_controller[135204]: 2025-09-30T18:50:08Z|00308|binding|INFO|Setting lport 2dab1b31-affc-4fc3-9d5e-698d2cd44d6e up in Southbound
Sep 30 18:50:08 compute-1 nova_compute[238822]: 2025-09-30 18:50:08.567 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:08 compute-1 sshd-session[304858]: Invalid user 24online from 103.153.190.105 port 47906
Sep 30 18:50:08 compute-1 sshd-session[304858]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:50:08 compute-1 sshd-session[304858]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:50:09 compute-1 ceph-mon[75484]: pgmap v2154: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 1.2 KiB/s rd, 15 KiB/s wr, 1 op/s
Sep 30 18:50:09 compute-1 unix_chkpwd[304865]: password check failed for user (root)
Sep 30 18:50:09 compute-1 sshd-session[304862]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17  user=root
Sep 30 18:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:50:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:09.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:09 compute-1 nova_compute[238822]: 2025-09-30 18:50:09.653 2 INFO nova.compute.manager [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Post operation of migration started
Sep 30 18:50:09 compute-1 nova_compute[238822]: 2025-09-30 18:50:09.655 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:09 compute-1 nova_compute[238822]: 2025-09-30 18:50:09.790 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:09 compute-1 nova_compute[238822]: 2025-09-30 18:50:09.791 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:09 compute-1 nova_compute[238822]: 2025-09-30 18:50:09.874 2 DEBUG oslo_concurrency.lockutils [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-67db1f19-3436-4e1e-bf63-266846e1380d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:50:09 compute-1 nova_compute[238822]: 2025-09-30 18:50:09.874 2 DEBUG oslo_concurrency.lockutils [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-67db1f19-3436-4e1e-bf63-266846e1380d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:50:09 compute-1 nova_compute[238822]: 2025-09-30 18:50:09.875 2 DEBUG nova.network.neutron [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:50:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:10.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:10 compute-1 sshd-session[304858]: Failed password for invalid user 24online from 103.153.190.105 port 47906 ssh2
Sep 30 18:50:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:10 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:10.358 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:10 compute-1 nova_compute[238822]: 2025-09-30 18:50:10.382 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:10 compute-1 nova_compute[238822]: 2025-09-30 18:50:10.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:10 compute-1 sshd-session[304862]: Failed password for root from 161.132.50.17 port 53886 ssh2
Sep 30 18:50:11 compute-1 nova_compute[238822]: 2025-09-30 18:50:11.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:11 compute-1 ceph-mon[75484]: pgmap v2155: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 5.8 KiB/s rd, 15 KiB/s wr, 7 op/s
Sep 30 18:50:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:11 compute-1 sshd-session[304869]: Invalid user opus from 8.243.64.201 port 40714
Sep 30 18:50:11 compute-1 sshd-session[304869]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:50:11 compute-1 sshd-session[304869]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:50:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:11.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:11 compute-1 nova_compute[238822]: 2025-09-30 18:50:11.447 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:11 compute-1 sshd-session[304858]: Received disconnect from 103.153.190.105 port 47906:11: Bye Bye [preauth]
Sep 30 18:50:11 compute-1 sshd-session[304858]: Disconnected from invalid user 24online 103.153.190.105 port 47906 [preauth]
Sep 30 18:50:11 compute-1 nova_compute[238822]: 2025-09-30 18:50:11.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:11 compute-1 nova_compute[238822]: 2025-09-30 18:50:11.677 2 DEBUG nova.network.neutron [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Updating instance_info_cache with network_info: [{"id": "2dab1b31-affc-4fc3-9d5e-698d2cd44d6e", "address": "fa:16:3e:2d:d1:75", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dab1b31-af", "ovs_interfaceid": "2dab1b31-affc-4fc3-9d5e-698d2cd44d6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:50:11 compute-1 sshd-session[304862]: Received disconnect from 161.132.50.17 port 53886:11: Bye Bye [preauth]
Sep 30 18:50:11 compute-1 sshd-session[304862]: Disconnected from authenticating user root 161.132.50.17 port 53886 [preauth]
Sep 30 18:50:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:12.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:12 compute-1 unix_chkpwd[304872]: password check failed for user (root)
Sep 30 18:50:12 compute-1 sshd-session[304867]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:50:12 compute-1 nova_compute[238822]: 2025-09-30 18:50:12.188 2 DEBUG oslo_concurrency.lockutils [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-67db1f19-3436-4e1e-bf63-266846e1380d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:50:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:12 compute-1 sshd-session[304869]: Failed password for invalid user opus from 8.243.64.201 port 40714 ssh2
Sep 30 18:50:12 compute-1 nova_compute[238822]: 2025-09-30 18:50:12.718 2 DEBUG oslo_concurrency.lockutils [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:50:12 compute-1 nova_compute[238822]: 2025-09-30 18:50:12.719 2 DEBUG oslo_concurrency.lockutils [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:50:12 compute-1 nova_compute[238822]: 2025-09-30 18:50:12.719 2 DEBUG oslo_concurrency.lockutils [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:12 compute-1 nova_compute[238822]: 2025-09-30 18:50:12.728 2 INFO nova.virt.libvirt.driver [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 18:50:12 compute-1 virtqemud[239124]: Domain id=29 name='instance-00000026' uuid=67db1f19-3436-4e1e-bf63-266846e1380d is tainted: custom-monitor
Sep 30 18:50:13 compute-1 ceph-mon[75484]: pgmap v2156: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 4.9 KiB/s rd, 0 B/s wr, 5 op/s
Sep 30 18:50:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:13 compute-1 sshd-session[304869]: Received disconnect from 8.243.64.201 port 40714:11: Bye Bye [preauth]
Sep 30 18:50:13 compute-1 sshd-session[304869]: Disconnected from invalid user opus 8.243.64.201 port 40714 [preauth]
Sep 30 18:50:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:13.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:13 compute-1 nova_compute[238822]: 2025-09-30 18:50:13.741 2 INFO nova.virt.libvirt.driver [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 18:50:13 compute-1 sshd-session[304867]: Failed password for root from 192.210.160.141 port 39466 ssh2
Sep 30 18:50:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:14.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:50:14 compute-1 nova_compute[238822]: 2025-09-30 18:50:14.751 2 INFO nova.virt.libvirt.driver [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 18:50:14 compute-1 nova_compute[238822]: 2025-09-30 18:50:14.757 2 DEBUG nova.compute.manager [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:50:15 compute-1 ceph-mon[75484]: pgmap v2157: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 13 KiB/s rd, 1023 B/s wr, 19 op/s
Sep 30 18:50:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:15 compute-1 sshd-session[304867]: Connection closed by authenticating user root 192.210.160.141 port 39466 [preauth]
Sep 30 18:50:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:15 compute-1 nova_compute[238822]: 2025-09-30 18:50:15.270 2 DEBUG nova.objects.instance [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 18:50:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:15.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:15 compute-1 nova_compute[238822]: 2025-09-30 18:50:15.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:16 compute-1 nova_compute[238822]: 2025-09-30 18:50:16.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:16.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:16 compute-1 nova_compute[238822]: 2025-09-30 18:50:16.292 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:16 compute-1 nova_compute[238822]: 2025-09-30 18:50:16.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:17 compute-1 ceph-mon[75484]: pgmap v2158: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 13 KiB/s rd, 1023 B/s wr, 19 op/s
Sep 30 18:50:17 compute-1 nova_compute[238822]: 2025-09-30 18:50:17.139 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:17 compute-1 nova_compute[238822]: 2025-09-30 18:50:17.140 2 WARNING neutronclient.v2_0.client [None req-e39aaae2-dba1-4794-ad95-6450eb5961cb 23f48a6ade4f417e868e98b2d7f359bb faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:17.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:18.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:19 compute-1 nova_compute[238822]: 2025-09-30 18:50:19.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:19 compute-1 nova_compute[238822]: 2025-09-30 18:50:19.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 18:50:19 compute-1 ceph-mon[75484]: pgmap v2159: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 13 KiB/s rd, 1023 B/s wr, 19 op/s
Sep 30 18:50:19 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1831620129' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:50:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:50:19 compute-1 openstack_network_exporter[251957]: ERROR   18:50:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:50:19 compute-1 openstack_network_exporter[251957]: ERROR   18:50:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:50:19 compute-1 openstack_network_exporter[251957]: ERROR   18:50:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:50:19 compute-1 openstack_network_exporter[251957]: ERROR   18:50:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:50:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:50:19 compute-1 openstack_network_exporter[251957]: ERROR   18:50:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:50:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:50:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:19.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:19 compute-1 nova_compute[238822]: 2025-09-30 18:50:19.565 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:19 compute-1 podman[304885]: 2025-09-30 18:50:19.574586285 +0000 UTC m=+0.103808047 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:50:19 compute-1 podman[304884]: 2025-09-30 18:50:19.621857384 +0000 UTC m=+0.151036135 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller)
Sep 30 18:50:20 compute-1 sshd-session[304881]: Invalid user h from 49.49.32.245 port 46750
Sep 30 18:50:20 compute-1 sshd-session[304881]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:50:20 compute-1 sshd-session[304881]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245
Sep 30 18:50:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:20.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.138 2 DEBUG oslo_concurrency.lockutils [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "838798ef-0563-40a9-af50-22403624c69e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.140 2 DEBUG oslo_concurrency.lockutils [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.140 2 DEBUG oslo_concurrency.lockutils [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "838798ef-0563-40a9-af50-22403624c69e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.141 2 DEBUG oslo_concurrency.lockutils [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.141 2 DEBUG oslo_concurrency.lockutils [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.158 2 INFO nova.compute.manager [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Terminating instance
Sep 30 18:50:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.681 2 DEBUG nova.compute.manager [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:50:20 compute-1 kernel: tap7f72ad94-f1 (unregistering): left promiscuous mode
Sep 30 18:50:20 compute-1 NetworkManager[45549]: <info>  [1759258220.7467] device (tap7f72ad94-f1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:20 compute-1 ovn_controller[135204]: 2025-09-30T18:50:20Z|00309|binding|INFO|Releasing lport 7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 from this chassis (sb_readonly=0)
Sep 30 18:50:20 compute-1 ovn_controller[135204]: 2025-09-30T18:50:20Z|00310|binding|INFO|Setting lport 7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 down in Southbound
Sep 30 18:50:20 compute-1 ovn_controller[135204]: 2025-09-30T18:50:20Z|00311|binding|INFO|Removing iface tap7f72ad94-f1 ovn-installed in OVS
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.768 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:4f:65 10.100.0.5'], port_security=['fa:16:3e:de:4f:65 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '838798ef-0563-40a9-af50-22403624c69e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4658d55-a8f9-48f1-846d-61df3d830821', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3359c464e0344756a39ce5c7088b9eba', 'neutron:revision_number': '5', 'neutron:security_group_ids': '3a57c776-d79c-4096-859e-411dcf78cfa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0884332-fe68-47c8-9c8c-5c6a7c53f7f5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.769 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 in datapath f4658d55-a8f9-48f1-846d-61df3d830821 unbound from our chassis
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.772 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4658d55-a8f9-48f1-846d-61df3d830821
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.799 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[438e4a39-5e75-4e3b-98d3-27d891cf9c27]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:20 compute-1 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000027.scope: Deactivated successfully.
Sep 30 18:50:20 compute-1 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000027.scope: Consumed 16.402s CPU time.
Sep 30 18:50:20 compute-1 systemd-machined[195911]: Machine qemu-28-instance-00000027 terminated.
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.855 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[706b9c12-c049-4aa3-8c05-30dfdd25d8a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.859 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[c69ba637-5664-4d17-814f-e6a6d7437638]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.907 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[4dcbc53b-0897-44db-b9a8-119c06eaf569]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.936 2 INFO nova.virt.libvirt.driver [-] [instance: 838798ef-0563-40a9-af50-22403624c69e] Instance destroyed successfully.
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.937 2 DEBUG nova.objects.instance [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lazy-loading 'resources' on Instance uuid 838798ef-0563-40a9-af50-22403624c69e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.952 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[07af5de1-20d9-4164-915d-eeb9e4fb859a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4658d55-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:a8:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1601960, 'reachable_time': 16822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304950, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.979 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e9973759-4423-454c-9674-bd9f2314c57a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf4658d55-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1601977, 'tstamp': 1601977}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304958, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf4658d55-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1601982, 'tstamp': 1601982}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304958, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.981 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4658d55-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:20 compute-1 nova_compute[238822]: 2025-09-30 18:50:20.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.990 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4658d55-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.990 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.991 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4658d55-a0, col_values=(('external_ids', {'iface-id': '862fbe9e-132a-4b8a-83f6-7b020c6192ad'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.991 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:50:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:20.993 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8918c6cb-fadf-4e1c-be8a-de4f125ae4bd]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f4658d55-a8f9-48f1-846d-61df3d830821\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f4658d55-a8f9-48f1-846d-61df3d830821\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:21 compute-1 ceph-mon[75484]: pgmap v2160: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 77 KiB/s rd, 9.1 KiB/s wr, 127 op/s
Sep 30 18:50:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.252 2 DEBUG nova.compute.manager [req-5f8708dd-b8b2-46e9-abd3-7cf231e7f4eb req-4c297820-7243-4f1c-b537-7c27b16ab40e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Received event network-vif-unplugged-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.252 2 DEBUG oslo_concurrency.lockutils [req-5f8708dd-b8b2-46e9-abd3-7cf231e7f4eb req-4c297820-7243-4f1c-b537-7c27b16ab40e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "838798ef-0563-40a9-af50-22403624c69e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.253 2 DEBUG oslo_concurrency.lockutils [req-5f8708dd-b8b2-46e9-abd3-7cf231e7f4eb req-4c297820-7243-4f1c-b537-7c27b16ab40e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.253 2 DEBUG oslo_concurrency.lockutils [req-5f8708dd-b8b2-46e9-abd3-7cf231e7f4eb req-4c297820-7243-4f1c-b537-7c27b16ab40e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.253 2 DEBUG nova.compute.manager [req-5f8708dd-b8b2-46e9-abd3-7cf231e7f4eb req-4c297820-7243-4f1c-b537-7c27b16ab40e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] No waiting events found dispatching network-vif-unplugged-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.254 2 DEBUG nova.compute.manager [req-5f8708dd-b8b2-46e9-abd3-7cf231e7f4eb req-4c297820-7243-4f1c-b537-7c27b16ab40e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Received event network-vif-unplugged-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.445 2 DEBUG nova.virt.libvirt.vif [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:49:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1073933630',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1073933630',id=39,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:49:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3359c464e0344756a39ce5c7088b9eba',ramdisk_id='',reservation_id='r-a039zebv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-613400940',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-613400940-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:49:40Z,user_data=None,user_id='f560266d133f4f1ba4a908e3cdcfa59d',uuid=838798ef-0563-40a9-af50-22403624c69e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "address": "fa:16:3e:de:4f:65", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f72ad94-f1", "ovs_interfaceid": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.446 2 DEBUG nova.network.os_vif_util [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converting VIF {"id": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "address": "fa:16:3e:de:4f:65", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f72ad94-f1", "ovs_interfaceid": "7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.447 2 DEBUG nova.network.os_vif_util [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:4f:65,bridge_name='br-int',has_traffic_filtering=True,id=7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f72ad94-f1') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.448 2 DEBUG os_vif [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:4f:65,bridge_name='br-int',has_traffic_filtering=True,id=7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f72ad94-f1') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.451 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f72ad94-f1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:21.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.486 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=001c7758-181e-45ae-ada6-cf83d1acd8a0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.492 2 INFO os_vif [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:4f:65,bridge_name='br-int',has_traffic_filtering=True,id=7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f72ad94-f1')
Sep 30 18:50:21 compute-1 sshd-session[304881]: Failed password for invalid user h from 49.49.32.245 port 46750 ssh2
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.971 2 INFO nova.virt.libvirt.driver [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Deleting instance files /var/lib/nova/instances/838798ef-0563-40a9-af50-22403624c69e_del
Sep 30 18:50:21 compute-1 nova_compute[238822]: 2025-09-30 18:50:21.973 2 INFO nova.virt.libvirt.driver [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Deletion of /var/lib/nova/instances/838798ef-0563-40a9-af50-22403624c69e_del complete
Sep 30 18:50:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:22.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:22 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3968907927' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:50:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:22 compute-1 nova_compute[238822]: 2025-09-30 18:50:22.487 2 INFO nova.compute.manager [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Took 1.81 seconds to destroy the instance on the hypervisor.
Sep 30 18:50:22 compute-1 nova_compute[238822]: 2025-09-30 18:50:22.488 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:50:22 compute-1 nova_compute[238822]: 2025-09-30 18:50:22.489 2 DEBUG nova.compute.manager [-] [instance: 838798ef-0563-40a9-af50-22403624c69e] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:50:22 compute-1 nova_compute[238822]: 2025-09-30 18:50:22.489 2 DEBUG nova.network.neutron [-] [instance: 838798ef-0563-40a9-af50-22403624c69e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:50:22 compute-1 nova_compute[238822]: 2025-09-30 18:50:22.489 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:22 compute-1 sudo[304980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:50:22 compute-1 sudo[304980]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:50:22 compute-1 sudo[304980]: pam_unix(sudo:session): session closed for user root
Sep 30 18:50:22 compute-1 podman[305004]: 2025-09-30 18:50:22.845110675 +0000 UTC m=+0.090540511 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930)
Sep 30 18:50:22 compute-1 sshd-session[304881]: Received disconnect from 49.49.32.245 port 46750:11: Bye Bye [preauth]
Sep 30 18:50:22 compute-1 sshd-session[304881]: Disconnected from invalid user h 49.49.32.245 port 46750 [preauth]
Sep 30 18:50:23 compute-1 nova_compute[238822]: 2025-09-30 18:50:23.120 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:23 compute-1 ceph-mon[75484]: pgmap v2161: 353 pgs: 353 active+clean; 200 MiB data, 489 MiB used, 40 GiB / 40 GiB avail; 72 KiB/s rd, 9.1 KiB/s wr, 121 op/s
Sep 30 18:50:23 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:50:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:23 compute-1 nova_compute[238822]: 2025-09-30 18:50:23.322 2 DEBUG nova.compute.manager [req-33ae7d28-014e-4a60-92f2-7d551c562190 req-fa5b12e9-5b96-4971-aef7-95293df05f29 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Received event network-vif-unplugged-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:50:23 compute-1 nova_compute[238822]: 2025-09-30 18:50:23.322 2 DEBUG oslo_concurrency.lockutils [req-33ae7d28-014e-4a60-92f2-7d551c562190 req-fa5b12e9-5b96-4971-aef7-95293df05f29 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "838798ef-0563-40a9-af50-22403624c69e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:50:23 compute-1 nova_compute[238822]: 2025-09-30 18:50:23.322 2 DEBUG oslo_concurrency.lockutils [req-33ae7d28-014e-4a60-92f2-7d551c562190 req-fa5b12e9-5b96-4971-aef7-95293df05f29 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:50:23 compute-1 nova_compute[238822]: 2025-09-30 18:50:23.323 2 DEBUG oslo_concurrency.lockutils [req-33ae7d28-014e-4a60-92f2-7d551c562190 req-fa5b12e9-5b96-4971-aef7-95293df05f29 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:23 compute-1 nova_compute[238822]: 2025-09-30 18:50:23.323 2 DEBUG nova.compute.manager [req-33ae7d28-014e-4a60-92f2-7d551c562190 req-fa5b12e9-5b96-4971-aef7-95293df05f29 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] No waiting events found dispatching network-vif-unplugged-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:50:23 compute-1 nova_compute[238822]: 2025-09-30 18:50:23.323 2 DEBUG nova.compute.manager [req-33ae7d28-014e-4a60-92f2-7d551c562190 req-fa5b12e9-5b96-4971-aef7-95293df05f29 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Received event network-vif-unplugged-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:50:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:23.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:23 compute-1 nova_compute[238822]: 2025-09-30 18:50:23.901 2 DEBUG nova.network.neutron [-] [instance: 838798ef-0563-40a9-af50-22403624c69e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:50:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:24.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:50:24 compute-1 nova_compute[238822]: 2025-09-30 18:50:24.409 2 INFO nova.compute.manager [-] [instance: 838798ef-0563-40a9-af50-22403624c69e] Took 1.92 seconds to deallocate network for instance.
Sep 30 18:50:24 compute-1 nova_compute[238822]: 2025-09-30 18:50:24.933 2 DEBUG oslo_concurrency.lockutils [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:50:24 compute-1 nova_compute[238822]: 2025-09-30 18:50:24.933 2 DEBUG oslo_concurrency.lockutils [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:50:25 compute-1 nova_compute[238822]: 2025-09-30 18:50:25.009 2 DEBUG oslo_concurrency.processutils [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:50:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:25 compute-1 ceph-mon[75484]: pgmap v2162: 353 pgs: 353 active+clean; 121 MiB data, 444 MiB used, 40 GiB / 40 GiB avail; 91 KiB/s rd, 10 KiB/s wr, 149 op/s
Sep 30 18:50:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:25 compute-1 nova_compute[238822]: 2025-09-30 18:50:25.406 2 DEBUG nova.compute.manager [req-6ed90391-ac67-4a02-a758-f8febf066cc9 req-4db4518e-5fc2-47df-9e19-87bfb50a3f3c 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 838798ef-0563-40a9-af50-22403624c69e] Received event network-vif-deleted-7f72ad94-f1e0-4c7c-8c0e-95cbe1319d12 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:50:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:25.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:25 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:50:25 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/268221078' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:50:25 compute-1 nova_compute[238822]: 2025-09-30 18:50:25.525 2 DEBUG oslo_concurrency.processutils [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:50:25 compute-1 nova_compute[238822]: 2025-09-30 18:50:25.536 2 DEBUG nova.compute.provider_tree [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:50:25 compute-1 nova_compute[238822]: 2025-09-30 18:50:25.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:26 compute-1 nova_compute[238822]: 2025-09-30 18:50:26.051 2 DEBUG nova.scheduler.client.report [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:50:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:26.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:26 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/268221078' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:50:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:26 compute-1 nova_compute[238822]: 2025-09-30 18:50:26.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:26 compute-1 nova_compute[238822]: 2025-09-30 18:50:26.567 2 DEBUG oslo_concurrency.lockutils [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.633s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:26 compute-1 nova_compute[238822]: 2025-09-30 18:50:26.595 2 INFO nova.scheduler.client.report [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Deleted allocations for instance 838798ef-0563-40a9-af50-22403624c69e
Sep 30 18:50:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:27 compute-1 ceph-mon[75484]: pgmap v2163: 353 pgs: 353 active+clean; 121 MiB data, 444 MiB used, 40 GiB / 40 GiB avail; 83 KiB/s rd, 9.2 KiB/s wr, 135 op/s
Sep 30 18:50:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:27.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:27 compute-1 nova_compute[238822]: 2025-09-30 18:50:27.631 2 DEBUG oslo_concurrency.lockutils [None req-4d0b1655-7039-42fb-bad1-f64b7528d1b4 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "838798ef-0563-40a9-af50-22403624c69e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.491s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:28.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:28 compute-1 nova_compute[238822]: 2025-09-30 18:50:28.371 2 DEBUG oslo_concurrency.lockutils [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "67db1f19-3436-4e1e-bf63-266846e1380d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:50:28 compute-1 nova_compute[238822]: 2025-09-30 18:50:28.372 2 DEBUG oslo_concurrency.lockutils [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "67db1f19-3436-4e1e-bf63-266846e1380d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:50:28 compute-1 nova_compute[238822]: 2025-09-30 18:50:28.373 2 DEBUG oslo_concurrency.lockutils [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "67db1f19-3436-4e1e-bf63-266846e1380d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:50:28 compute-1 nova_compute[238822]: 2025-09-30 18:50:28.373 2 DEBUG oslo_concurrency.lockutils [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "67db1f19-3436-4e1e-bf63-266846e1380d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:50:28 compute-1 nova_compute[238822]: 2025-09-30 18:50:28.373 2 DEBUG oslo_concurrency.lockutils [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "67db1f19-3436-4e1e-bf63-266846e1380d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:28 compute-1 nova_compute[238822]: 2025-09-30 18:50:28.390 2 INFO nova.compute.manager [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Terminating instance
Sep 30 18:50:28 compute-1 nova_compute[238822]: 2025-09-30 18:50:28.913 2 DEBUG nova.compute.manager [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:50:28 compute-1 kernel: tap2dab1b31-af (unregistering): left promiscuous mode
Sep 30 18:50:28 compute-1 NetworkManager[45549]: <info>  [1759258228.9834] device (tap2dab1b31-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:50:28 compute-1 nova_compute[238822]: 2025-09-30 18:50:28.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:28 compute-1 ovn_controller[135204]: 2025-09-30T18:50:28Z|00312|binding|INFO|Releasing lport 2dab1b31-affc-4fc3-9d5e-698d2cd44d6e from this chassis (sb_readonly=0)
Sep 30 18:50:28 compute-1 ovn_controller[135204]: 2025-09-30T18:50:28Z|00313|binding|INFO|Setting lport 2dab1b31-affc-4fc3-9d5e-698d2cd44d6e down in Southbound
Sep 30 18:50:28 compute-1 ovn_controller[135204]: 2025-09-30T18:50:28Z|00314|binding|INFO|Removing iface tap2dab1b31-af ovn-installed in OVS
Sep 30 18:50:28 compute-1 nova_compute[238822]: 2025-09-30 18:50:28.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.003 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:d1:75 10.100.0.6'], port_security=['fa:16:3e:2d:d1:75 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '67db1f19-3436-4e1e-bf63-266846e1380d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4658d55-a8f9-48f1-846d-61df3d830821', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3359c464e0344756a39ce5c7088b9eba', 'neutron:revision_number': '15', 'neutron:security_group_ids': '3a57c776-d79c-4096-859e-411dcf78cfa5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0884332-fe68-47c8-9c8c-5c6a7c53f7f5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=2dab1b31-affc-4fc3-9d5e-698d2cd44d6e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.005 144543 INFO neutron.agent.ovn.metadata.agent [-] Port 2dab1b31-affc-4fc3-9d5e-698d2cd44d6e in datapath f4658d55-a8f9-48f1-846d-61df3d830821 unbound from our chassis
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.006 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f4658d55-a8f9-48f1-846d-61df3d830821, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.008 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6cc18d5d-9ac2-4157-b4c6-8bb2e133a7e3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.009 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821 namespace which is not needed anymore
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:29 compute-1 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000026.scope: Deactivated successfully.
Sep 30 18:50:29 compute-1 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000026.scope: Consumed 3.178s CPU time.
Sep 30 18:50:29 compute-1 systemd-machined[195911]: Machine qemu-29-instance-00000026 terminated.
Sep 30 18:50:29 compute-1 podman[305056]: 2025-09-30 18:50:29.162187126 +0000 UTC m=+0.115422069 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:50:29 compute-1 podman[305060]: 2025-09-30 18:50:29.164338464 +0000 UTC m=+0.112929702 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, config_id=edpm, release=1755695350, maintainer=Red Hat, Inc., io.buildah.version=1.33.7)
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.169 2 INFO nova.virt.libvirt.driver [-] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Instance destroyed successfully.
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.170 2 DEBUG nova.objects.instance [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lazy-loading 'resources' on Instance uuid 67db1f19-3436-4e1e-bf63-266846e1380d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:50:29 compute-1 podman[305062]: 2025-09-30 18:50:29.180118178 +0000 UTC m=+0.125141620 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2)
Sep 30 18:50:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:29 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[304331]: [NOTICE]   (304343) : haproxy version is 3.0.5-8e879a5
Sep 30 18:50:29 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[304331]: [NOTICE]   (304343) : path to executable is /usr/sbin/haproxy
Sep 30 18:50:29 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[304331]: [WARNING]  (304343) : Exiting Master process...
Sep 30 18:50:29 compute-1 podman[305121]: 2025-09-30 18:50:29.19474413 +0000 UTC m=+0.047498416 container kill 21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 18:50:29 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[304331]: [ALERT]    (304343) : Current worker (304345) exited with code 143 (Terminated)
Sep 30 18:50:29 compute-1 neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821[304331]: [WARNING]  (304343) : All workers exited. Exiting... (0)
Sep 30 18:50:29 compute-1 systemd[1]: libpod-21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53.scope: Deactivated successfully.
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.198 2 DEBUG nova.compute.manager [req-47661e57-b608-4554-8419-800f56e4cb18 req-561682b8-b222-4fb8-8adf-a5b73af5b35e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Received event network-vif-unplugged-2dab1b31-affc-4fc3-9d5e-698d2cd44d6e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.199 2 DEBUG oslo_concurrency.lockutils [req-47661e57-b608-4554-8419-800f56e4cb18 req-561682b8-b222-4fb8-8adf-a5b73af5b35e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "67db1f19-3436-4e1e-bf63-266846e1380d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.199 2 DEBUG oslo_concurrency.lockutils [req-47661e57-b608-4554-8419-800f56e4cb18 req-561682b8-b222-4fb8-8adf-a5b73af5b35e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "67db1f19-3436-4e1e-bf63-266846e1380d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.199 2 DEBUG oslo_concurrency.lockutils [req-47661e57-b608-4554-8419-800f56e4cb18 req-561682b8-b222-4fb8-8adf-a5b73af5b35e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "67db1f19-3436-4e1e-bf63-266846e1380d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.199 2 DEBUG nova.compute.manager [req-47661e57-b608-4554-8419-800f56e4cb18 req-561682b8-b222-4fb8-8adf-a5b73af5b35e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] No waiting events found dispatching network-vif-unplugged-2dab1b31-affc-4fc3-9d5e-698d2cd44d6e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.200 2 DEBUG nova.compute.manager [req-47661e57-b608-4554-8419-800f56e4cb18 req-561682b8-b222-4fb8-8adf-a5b73af5b35e 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Received event network-vif-unplugged-2dab1b31-affc-4fc3-9d5e-698d2cd44d6e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:50:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:29 compute-1 podman[305160]: 2025-09-30 18:50:29.246739736 +0000 UTC m=+0.030956092 container died 21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:50:29 compute-1 ceph-mon[75484]: pgmap v2164: 353 pgs: 353 active+clean; 121 MiB data, 444 MiB used, 40 GiB / 40 GiB avail; 83 KiB/s rd, 9.2 KiB/s wr, 135 op/s
Sep 30 18:50:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:50:29 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53-userdata-shm.mount: Deactivated successfully.
Sep 30 18:50:29 compute-1 systemd[1]: var-lib-containers-storage-overlay-826e994b845d6dcf2d190a19fd3161ef15c374b8ba9a892cfaf7ef4004bfe95d-merged.mount: Deactivated successfully.
Sep 30 18:50:29 compute-1 podman[305160]: 2025-09-30 18:50:29.306232503 +0000 UTC m=+0.090448879 container cleanup 21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 18:50:29 compute-1 systemd[1]: libpod-conmon-21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53.scope: Deactivated successfully.
Sep 30 18:50:29 compute-1 podman[305162]: 2025-09-30 18:50:29.339603818 +0000 UTC m=+0.110227569 container remove 21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.349 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4e1e0e0d-f9ff-4575-b560-859ff8217566]: (4, ("Tue Sep 30 06:50:29 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821 (21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53)\n21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53\nTue Sep 30 06:50:29 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821 (21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53)\n21f77a29c5547094782163aa722b362a8f1d7431cd1d63645f0d91cb8dda0c53\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.352 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f5cdfae8-780d-42ed-99c8-1b1b11350bf5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.352 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4658d55-a8f9-48f1-846d-61df3d830821.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.353 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a62010de-67db-459b-abb3-ccc480029eb7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.354 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4658d55-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:29 compute-1 kernel: tapf4658d55-a0: left promiscuous mode
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.392 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2eca46a9-ed44-43d9-95c3-a97af20b4ce1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.419 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[453efc07-cc44-461b-a425-8fd9890b80a5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.421 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[e3922e20-9e4c-42a9-85b4-be295fafc1c1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.447 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[907840be-d1c5-47f5-970f-d1152250fddc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1601949, 'reachable_time': 16385, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305195, 'error': None, 'target': 'ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:29 compute-1 systemd[1]: run-netns-ovnmeta\x2df4658d55\x2da8f9\x2d48f1\x2d846d\x2d61df3d830821.mount: Deactivated successfully.
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.452 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f4658d55-a8f9-48f1-846d-61df3d830821 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:50:29 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:29.454 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[43106555-390d-4845-8697-9655c29ca947]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:50:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:29.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.677 2 DEBUG nova.virt.libvirt.vif [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T18:48:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-983808530',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-983808530',id=38,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:49:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3359c464e0344756a39ce5c7088b9eba',ramdisk_id='',reservation_id='r-wltr50iq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',clean_attempts='1',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-613400940',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-613400940-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:50:15Z,user_data=None,user_id='f560266d133f4f1ba4a908e3cdcfa59d',uuid=67db1f19-3436-4e1e-bf63-266846e1380d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2dab1b31-affc-4fc3-9d5e-698d2cd44d6e", "address": "fa:16:3e:2d:d1:75", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dab1b31-af", "ovs_interfaceid": "2dab1b31-affc-4fc3-9d5e-698d2cd44d6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.678 2 DEBUG nova.network.os_vif_util [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converting VIF {"id": "2dab1b31-affc-4fc3-9d5e-698d2cd44d6e", "address": "fa:16:3e:2d:d1:75", "network": {"id": "f4658d55-a8f9-48f1-846d-61df3d830821", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2093820932-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67cbb3b670e445a4b97abcc92749d126", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dab1b31-af", "ovs_interfaceid": "2dab1b31-affc-4fc3-9d5e-698d2cd44d6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.679 2 DEBUG nova.network.os_vif_util [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:d1:75,bridge_name='br-int',has_traffic_filtering=True,id=2dab1b31-affc-4fc3-9d5e-698d2cd44d6e,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dab1b31-af') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.679 2 DEBUG os_vif [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:d1:75,bridge_name='br-int',has_traffic_filtering=True,id=2dab1b31-affc-4fc3-9d5e-698d2cd44d6e,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dab1b31-af') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.683 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dab1b31-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.688 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a3bf9ec9-4002-445f-894f-1ff0421b5e43) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:29 compute-1 nova_compute[238822]: 2025-09-30 18:50:29.694 2 INFO os_vif [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:d1:75,bridge_name='br-int',has_traffic_filtering=True,id=2dab1b31-affc-4fc3-9d5e-698d2cd44d6e,network=Network(f4658d55-a8f9-48f1-846d-61df3d830821),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dab1b31-af')
Sep 30 18:50:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:30.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:30 compute-1 nova_compute[238822]: 2025-09-30 18:50:30.188 2 INFO nova.virt.libvirt.driver [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Deleting instance files /var/lib/nova/instances/67db1f19-3436-4e1e-bf63-266846e1380d_del
Sep 30 18:50:30 compute-1 nova_compute[238822]: 2025-09-30 18:50:30.190 2 INFO nova.virt.libvirt.driver [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Deletion of /var/lib/nova/instances/67db1f19-3436-4e1e-bf63-266846e1380d_del complete
Sep 30 18:50:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:30 compute-1 nova_compute[238822]: 2025-09-30 18:50:30.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:30 compute-1 nova_compute[238822]: 2025-09-30 18:50:30.707 2 INFO nova.compute.manager [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Took 1.79 seconds to destroy the instance on the hypervisor.
Sep 30 18:50:30 compute-1 nova_compute[238822]: 2025-09-30 18:50:30.707 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:50:30 compute-1 nova_compute[238822]: 2025-09-30 18:50:30.708 2 DEBUG nova.compute.manager [-] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:50:30 compute-1 nova_compute[238822]: 2025-09-30 18:50:30.708 2 DEBUG nova.network.neutron [-] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:50:30 compute-1 nova_compute[238822]: 2025-09-30 18:50:30.709 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:31 compute-1 nova_compute[238822]: 2025-09-30 18:50:31.128 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:50:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:31 compute-1 ceph-mon[75484]: pgmap v2165: 353 pgs: 353 active+clean; 121 MiB data, 444 MiB used, 40 GiB / 40 GiB avail; 83 KiB/s rd, 9.2 KiB/s wr, 136 op/s
Sep 30 18:50:31 compute-1 nova_compute[238822]: 2025-09-30 18:50:31.294 2 DEBUG nova.compute.manager [req-a666020c-5cfa-4672-82ea-320319bb1492 req-d551b3b8-5736-4ccb-b439-985f230f0ea3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Received event network-vif-unplugged-2dab1b31-affc-4fc3-9d5e-698d2cd44d6e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:50:31 compute-1 nova_compute[238822]: 2025-09-30 18:50:31.295 2 DEBUG oslo_concurrency.lockutils [req-a666020c-5cfa-4672-82ea-320319bb1492 req-d551b3b8-5736-4ccb-b439-985f230f0ea3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "67db1f19-3436-4e1e-bf63-266846e1380d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:50:31 compute-1 nova_compute[238822]: 2025-09-30 18:50:31.295 2 DEBUG oslo_concurrency.lockutils [req-a666020c-5cfa-4672-82ea-320319bb1492 req-d551b3b8-5736-4ccb-b439-985f230f0ea3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "67db1f19-3436-4e1e-bf63-266846e1380d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:50:31 compute-1 nova_compute[238822]: 2025-09-30 18:50:31.296 2 DEBUG oslo_concurrency.lockutils [req-a666020c-5cfa-4672-82ea-320319bb1492 req-d551b3b8-5736-4ccb-b439-985f230f0ea3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "67db1f19-3436-4e1e-bf63-266846e1380d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:31 compute-1 nova_compute[238822]: 2025-09-30 18:50:31.296 2 DEBUG nova.compute.manager [req-a666020c-5cfa-4672-82ea-320319bb1492 req-d551b3b8-5736-4ccb-b439-985f230f0ea3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] No waiting events found dispatching network-vif-unplugged-2dab1b31-affc-4fc3-9d5e-698d2cd44d6e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:50:31 compute-1 nova_compute[238822]: 2025-09-30 18:50:31.296 2 DEBUG nova.compute.manager [req-a666020c-5cfa-4672-82ea-320319bb1492 req-d551b3b8-5736-4ccb-b439-985f230f0ea3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Received event network-vif-unplugged-2dab1b31-affc-4fc3-9d5e-698d2cd44d6e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:50:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:50:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:31.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:50:31 compute-1 nova_compute[238822]: 2025-09-30 18:50:31.851 2 DEBUG nova.compute.manager [req-e6a2129f-73a3-4bdc-8e8c-89b85e32b836 req-d4d4a19a-d097-4a06-8a91-d2845d72972b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Received event network-vif-deleted-2dab1b31-affc-4fc3-9d5e-698d2cd44d6e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:50:31 compute-1 nova_compute[238822]: 2025-09-30 18:50:31.852 2 INFO nova.compute.manager [req-e6a2129f-73a3-4bdc-8e8c-89b85e32b836 req-d4d4a19a-d097-4a06-8a91-d2845d72972b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Neutron deleted interface 2dab1b31-affc-4fc3-9d5e-698d2cd44d6e; detaching it from the instance and deleting it from the info cache
Sep 30 18:50:31 compute-1 nova_compute[238822]: 2025-09-30 18:50:31.852 2 DEBUG nova.network.neutron [req-e6a2129f-73a3-4bdc-8e8c-89b85e32b836 req-d4d4a19a-d097-4a06-8a91-d2845d72972b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:50:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:32.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:32 compute-1 nova_compute[238822]: 2025-09-30 18:50:32.295 2 DEBUG nova.network.neutron [-] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:50:32 compute-1 nova_compute[238822]: 2025-09-30 18:50:32.360 2 DEBUG nova.compute.manager [req-e6a2129f-73a3-4bdc-8e8c-89b85e32b836 req-d4d4a19a-d097-4a06-8a91-d2845d72972b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Detach interface failed, port_id=2dab1b31-affc-4fc3-9d5e-698d2cd44d6e, reason: Instance 67db1f19-3436-4e1e-bf63-266846e1380d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:50:32 compute-1 nova_compute[238822]: 2025-09-30 18:50:32.804 2 INFO nova.compute.manager [-] [instance: 67db1f19-3436-4e1e-bf63-266846e1380d] Took 2.10 seconds to deallocate network for instance.
Sep 30 18:50:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:33 compute-1 ceph-mon[75484]: pgmap v2166: 353 pgs: 353 active+clean; 121 MiB data, 444 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:50:33 compute-1 nova_compute[238822]: 2025-09-30 18:50:33.331 2 DEBUG oslo_concurrency.lockutils [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:50:33 compute-1 nova_compute[238822]: 2025-09-30 18:50:33.332 2 DEBUG oslo_concurrency.lockutils [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:50:33 compute-1 nova_compute[238822]: 2025-09-30 18:50:33.340 2 DEBUG oslo_concurrency.lockutils [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:33 compute-1 nova_compute[238822]: 2025-09-30 18:50:33.371 2 INFO nova.scheduler.client.report [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Deleted allocations for instance 67db1f19-3436-4e1e-bf63-266846e1380d
Sep 30 18:50:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:33.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:34.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:50:34 compute-1 nova_compute[238822]: 2025-09-30 18:50:34.408 2 DEBUG oslo_concurrency.lockutils [None req-99e24c52-2d40-44b0-9adb-edfdee39d814 f560266d133f4f1ba4a908e3cdcfa59d 3359c464e0344756a39ce5c7088b9eba - - default default] Lock "67db1f19-3436-4e1e-bf63-266846e1380d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.036s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:34 compute-1 nova_compute[238822]: 2025-09-30 18:50:34.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:35 compute-1 ceph-mon[75484]: pgmap v2167: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 56 op/s
Sep 30 18:50:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:35.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:35 compute-1 nova_compute[238822]: 2025-09-30 18:50:35.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:35 compute-1 podman[249638]: time="2025-09-30T18:50:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:50:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:50:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:50:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:50:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8373 "" "Go-http-client/1.1"
Sep 30 18:50:36 compute-1 nova_compute[238822]: 2025-09-30 18:50:36.071 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:36 compute-1 nova_compute[238822]: 2025-09-30 18:50:36.072 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 18:50:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:36.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:36 compute-1 nova_compute[238822]: 2025-09-30 18:50:36.580 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 18:50:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:37 compute-1 ceph-mon[75484]: pgmap v2168: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:50:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2920653784' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:50:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2920653784' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:50:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:37.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:50:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:38.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:50:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:50:39 compute-1 unix_chkpwd[305227]: password check failed for user (root)
Sep 30 18:50:39 compute-1 sshd-session[305222]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:50:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:50:39 compute-1 ceph-mon[75484]: pgmap v2169: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:50:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:39.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:39 compute-1 nova_compute[238822]: 2025-09-30 18:50:39.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:40.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:40 compute-1 nova_compute[238822]: 2025-09-30 18:50:40.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:40 compute-1 nova_compute[238822]: 2025-09-30 18:50:40.852 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:40 compute-1 sshd-session[305222]: Failed password for root from 192.210.160.141 port 33332 ssh2
Sep 30 18:50:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:41 compute-1 ceph-mon[75484]: pgmap v2170: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:50:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:41.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:42.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:42 compute-1 sshd-session[305222]: Connection closed by authenticating user root 192.210.160.141 port 33332 [preauth]
Sep 30 18:50:42 compute-1 sudo[305231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:50:42 compute-1 sudo[305231]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:50:42 compute-1 sudo[305231]: pam_unix(sudo:session): session closed for user root
Sep 30 18:50:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:43 compute-1 ceph-mon[75484]: pgmap v2171: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:50:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:43.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:44.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:50:44 compute-1 nova_compute[238822]: 2025-09-30 18:50:44.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:45 compute-1 ceph-mon[75484]: pgmap v2172: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Sep 30 18:50:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:45.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:45 compute-1 nova_compute[238822]: 2025-09-30 18:50:45.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:46.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:47 compute-1 ceph-mon[75484]: pgmap v2173: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:50:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:47.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:48.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:49 compute-1 sudo[305264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:50:49 compute-1 sudo[305264]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:50:49 compute-1 sudo[305264]: pam_unix(sudo:session): session closed for user root
Sep 30 18:50:49 compute-1 sudo[305289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:50:49 compute-1 sudo[305289]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:50:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:50:49 compute-1 ceph-mon[75484]: pgmap v2174: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:50:49 compute-1 openstack_network_exporter[251957]: ERROR   18:50:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:50:49 compute-1 openstack_network_exporter[251957]: ERROR   18:50:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:50:49 compute-1 openstack_network_exporter[251957]: ERROR   18:50:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:50:49 compute-1 openstack_network_exporter[251957]: ERROR   18:50:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:50:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:50:49 compute-1 openstack_network_exporter[251957]: ERROR   18:50:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:50:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:50:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:49.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:49 compute-1 nova_compute[238822]: 2025-09-30 18:50:49.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:49 compute-1 sudo[305289]: pam_unix(sudo:session): session closed for user root
Sep 30 18:50:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:50.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:50:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:50:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:50:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:50:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:50:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:50:50 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:50:50 compute-1 nova_compute[238822]: 2025-09-30 18:50:50.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:50 compute-1 podman[305349]: 2025-09-30 18:50:50.572741916 +0000 UTC m=+0.097749205 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:50:50 compute-1 podman[305348]: 2025-09-30 18:50:50.635210463 +0000 UTC m=+0.160032247 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 18:50:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:51 compute-1 ceph-mon[75484]: pgmap v2175: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:50:51 compute-1 ceph-mon[75484]: pgmap v2176: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 607 B/s rd, 0 op/s
Sep 30 18:50:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:51.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:52.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:50:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:53 compute-1 nova_compute[238822]: 2025-09-30 18:50:53.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:53 compute-1 ceph-mon[75484]: pgmap v2177: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 607 B/s rd, 0 op/s
Sep 30 18:50:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:53.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:53 compute-1 podman[305400]: 2025-09-30 18:50:53.545823623 +0000 UTC m=+0.085915017 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:50:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:54.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:50:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:54.436 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:50:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:54.436 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:50:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:50:54.436 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:54 compute-1 nova_compute[238822]: 2025-09-30 18:50:54.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:54 compute-1 sudo[305421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:50:54 compute-1 sudo[305421]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:50:54 compute-1 sudo[305421]: pam_unix(sudo:session): session closed for user root
Sep 30 18:50:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:50:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:55.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:50:55 compute-1 nova_compute[238822]: 2025-09-30 18:50:55.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:55 compute-1 ceph-mon[75484]: pgmap v2178: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 607 B/s rd, 0 op/s
Sep 30 18:50:55 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:50:55 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:50:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:56.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:50:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:57.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:50:57 compute-1 ceph-mon[75484]: pgmap v2179: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 607 B/s rd, 0 op/s
Sep 30 18:50:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/912244844' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:50:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/912244844' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:50:58 compute-1 nova_compute[238822]: 2025-09-30 18:50:58.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:58 compute-1 nova_compute[238822]: 2025-09-30 18:50:58.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:50:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:50:58.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:59 compute-1 nova_compute[238822]: 2025-09-30 18:50:59.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:59 compute-1 nova_compute[238822]: 2025-09-30 18:50:59.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:50:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:50:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:50:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:50:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:50:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:50:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:50:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:50:59.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:50:59 compute-1 nova_compute[238822]: 2025-09-30 18:50:59.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:50:59 compute-1 nova_compute[238822]: 2025-09-30 18:50:59.575 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:50:59 compute-1 nova_compute[238822]: 2025-09-30 18:50:59.575 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:50:59 compute-1 nova_compute[238822]: 2025-09-30 18:50:59.575 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:50:59 compute-1 nova_compute[238822]: 2025-09-30 18:50:59.576 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:50:59 compute-1 podman[305452]: 2025-09-30 18:50:59.601351654 +0000 UTC m=+0.120582028 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 18:50:59 compute-1 podman[305453]: 2025-09-30 18:50:59.607822047 +0000 UTC m=+0.122093768 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible)
Sep 30 18:50:59 compute-1 podman[305451]: 2025-09-30 18:50:59.619595513 +0000 UTC m=+0.147865309 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 18:50:59 compute-1 nova_compute[238822]: 2025-09-30 18:50:59.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:50:59 compute-1 ceph-mon[75484]: pgmap v2180: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 607 B/s rd, 0 op/s
Sep 30 18:51:00 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:51:00 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2988664038' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:51:00 compute-1 nova_compute[238822]: 2025-09-30 18:51:00.039 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:51:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:00.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:00 compute-1 nova_compute[238822]: 2025-09-30 18:51:00.299 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:51:00 compute-1 nova_compute[238822]: 2025-09-30 18:51:00.301 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:51:00 compute-1 nova_compute[238822]: 2025-09-30 18:51:00.353 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:51:00 compute-1 nova_compute[238822]: 2025-09-30 18:51:00.354 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4632MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:51:00 compute-1 nova_compute[238822]: 2025-09-30 18:51:00.355 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:51:00 compute-1 nova_compute[238822]: 2025-09-30 18:51:00.356 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:51:00 compute-1 nova_compute[238822]: 2025-09-30 18:51:00.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2988664038' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:51:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:01.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:01 compute-1 nova_compute[238822]: 2025-09-30 18:51:01.615 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:51:01 compute-1 nova_compute[238822]: 2025-09-30 18:51:01.616 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:51:00 up  4:28,  0 user,  load average: 0.30, 0.37, 0.46\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:51:01 compute-1 nova_compute[238822]: 2025-09-30 18:51:01.688 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing inventories for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 18:51:01 compute-1 nova_compute[238822]: 2025-09-30 18:51:01.780 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating ProviderTree inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 18:51:01 compute-1 nova_compute[238822]: 2025-09-30 18:51:01.781 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 18:51:01 compute-1 nova_compute[238822]: 2025-09-30 18:51:01.798 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing aggregate associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 18:51:01 compute-1 nova_compute[238822]: 2025-09-30 18:51:01.818 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing trait associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE41,HW_CPU_X86_ABM,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_TIS,HW_ARCH_X86_64
,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 18:51:01 compute-1 ceph-mon[75484]: pgmap v2181: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 607 B/s rd, 0 op/s
Sep 30 18:51:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3634599146' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:51:01 compute-1 nova_compute[238822]: 2025-09-30 18:51:01.843 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:51:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:02.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:51:02 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2618615078' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:51:02 compute-1 nova_compute[238822]: 2025-09-30 18:51:02.383 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.540s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:51:02 compute-1 nova_compute[238822]: 2025-09-30 18:51:02.392 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:51:02 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:02.509 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:48:c5 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2790c8e9fb6a48debd443ac79e2e12ba', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2a3bc33b-b1e3-4a2f-8784-2e8238744730, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b27ac1cf-5e47-45c4-b2a7-c18dab16f3aa) old=Port_Binding(mac=['fa:16:3e:28:48:c5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2790c8e9fb6a48debd443ac79e2e12ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:51:02 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:02.510 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b27ac1cf-5e47-45c4-b2a7-c18dab16f3aa in datapath cee64377-b6b9-46f2-8d77-c7978d4cc7a0 updated
Sep 30 18:51:02 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:02.511 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cee64377-b6b9-46f2-8d77-c7978d4cc7a0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:51:02 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:02.512 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[437b7ad4-59fa-4f59-8729-bc53618b0de7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2618615078' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:51:02 compute-1 nova_compute[238822]: 2025-09-30 18:51:02.903 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:51:02 compute-1 sudo[305560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:51:02 compute-1 sudo[305560]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:51:02 compute-1 sudo[305560]: pam_unix(sudo:session): session closed for user root
Sep 30 18:51:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:03 compute-1 nova_compute[238822]: 2025-09-30 18:51:03.415 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:51:03 compute-1 nova_compute[238822]: 2025-09-30 18:51:03.416 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.060s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:51:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:03.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:03 compute-1 ceph-mon[75484]: pgmap v2182: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:51:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/653443428' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:51:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:04.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:51:04 compute-1 nova_compute[238822]: 2025-09-30 18:51:04.416 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:51:04 compute-1 nova_compute[238822]: 2025-09-30 18:51:04.416 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:51:04 compute-1 nova_compute[238822]: 2025-09-30 18:51:04.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:04 compute-1 ceph-mon[75484]: pgmap v2183: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:51:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:05.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:05 compute-1 nova_compute[238822]: 2025-09-30 18:51:05.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:05 compute-1 podman[249638]: time="2025-09-30T18:51:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:51:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:51:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:51:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:51:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8370 "" "Go-http-client/1.1"
Sep 30 18:51:05 compute-1 unix_chkpwd[305590]: password check failed for user (root)
Sep 30 18:51:05 compute-1 sshd-session[305586]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:51:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:06.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:07 compute-1 ceph-mon[75484]: pgmap v2184: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:51:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:07 compute-1 sshd-session[305586]: Failed password for root from 192.210.160.141 port 36022 ssh2
Sep 30 18:51:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:07.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:51:08 compute-1 nova_compute[238822]: 2025-09-30 18:51:08.055 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:51:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:08.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:08 compute-1 sshd-session[305594]: Connection closed by 114.66.3.37 port 44378
Sep 30 18:51:08 compute-1 sshd-session[305586]: Connection closed by authenticating user root 192.210.160.141 port 36022 [preauth]
Sep 30 18:51:09 compute-1 ceph-mon[75484]: pgmap v2185: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:51:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:51:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:09.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:09 compute-1 nova_compute[238822]: 2025-09-30 18:51:09.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:09.776 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:84:e3 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-22f29271-35aa-4de9-8453-2dab07456294', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22f29271-35aa-4de9-8453-2dab07456294', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '127ca83529de45efa0a76aa8ceefcd3d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=661e58a3-2ec2-44de-9298-150cee8d1105, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7c3de4a3-3879-465a-88ae-6faa2c17e570) old=Port_Binding(mac=['fa:16:3e:1c:84:e3'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-22f29271-35aa-4de9-8453-2dab07456294', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22f29271-35aa-4de9-8453-2dab07456294', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '127ca83529de45efa0a76aa8ceefcd3d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:51:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:09.777 144543 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7c3de4a3-3879-465a-88ae-6faa2c17e570 in datapath 22f29271-35aa-4de9-8453-2dab07456294 updated
Sep 30 18:51:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:09.779 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 22f29271-35aa-4de9-8453-2dab07456294, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:51:09 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:09.780 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa71564-e0d9-4ad3-9f3f-89d88a85361c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:10.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:10 compute-1 nova_compute[238822]: 2025-09-30 18:51:10.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:11 compute-1 ceph-mon[75484]: pgmap v2186: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:51:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #118. Immutable memtables: 0.
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.354426) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 73] Flushing memtable with next log file: 118
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258271354478, "job": 73, "event": "flush_started", "num_memtables": 1, "num_entries": 2265, "num_deletes": 258, "total_data_size": 5614295, "memory_usage": 5683328, "flush_reason": "Manual Compaction"}
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 73] Level-0 flush table #119: started
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258271378126, "cf_name": "default", "job": 73, "event": "table_file_creation", "file_number": 119, "file_size": 3620999, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 57324, "largest_seqno": 59583, "table_properties": {"data_size": 3612026, "index_size": 5531, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18943, "raw_average_key_size": 20, "raw_value_size": 3593844, "raw_average_value_size": 3811, "num_data_blocks": 241, "num_entries": 943, "num_filter_entries": 943, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759258073, "oldest_key_time": 1759258073, "file_creation_time": 1759258271, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 119, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 73] Flush lasted 23828 microseconds, and 15592 cpu microseconds.
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.378246) [db/flush_job.cc:967] [default] [JOB 73] Level-0 flush table #119: 3620999 bytes OK
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.378282) [db/memtable_list.cc:519] [default] Level-0 commit table #119 started
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.379886) [db/memtable_list.cc:722] [default] Level-0 commit table #119: memtable #1 done
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.379909) EVENT_LOG_v1 {"time_micros": 1759258271379901, "job": 73, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.379939) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 73] Try to delete WAL files size 5604182, prev total WAL file size 5604182, number of live WAL files 2.
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000115.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.382354) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032303037' seq:72057594037927935, type:22 .. '6C6F676D0032323631' seq:0, type:0; will stop at (end)
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 74] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 73 Base level 0, inputs: [119(3536KB)], [117(11MB)]
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258271382430, "job": 74, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [119], "files_L6": [117], "score": -1, "input_data_size": 15220962, "oldest_snapshot_seqno": -1}
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 74] Generated table #120: 7871 keys, 15073961 bytes, temperature: kUnknown
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258271472716, "cf_name": "default", "job": 74, "event": "table_file_creation", "file_number": 120, "file_size": 15073961, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15023250, "index_size": 29927, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19717, "raw_key_size": 207209, "raw_average_key_size": 26, "raw_value_size": 14884340, "raw_average_value_size": 1891, "num_data_blocks": 1176, "num_entries": 7871, "num_filter_entries": 7871, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759258271, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 120, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:51:11 compute-1 nova_compute[238822]: 2025-09-30 18:51:11.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:51:11 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:11.475 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.474322) [db/compaction/compaction_job.cc:1663] [default] [JOB 74] Compacted 1@0 + 1@6 files to L6 => 15073961 bytes
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.476288) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 166.2 rd, 164.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 11.1 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(8.4) write-amplify(4.2) OK, records in: 8403, records dropped: 532 output_compression: NoCompression
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.476335) EVENT_LOG_v1 {"time_micros": 1759258271476315, "job": 74, "event": "compaction_finished", "compaction_time_micros": 91576, "compaction_time_cpu_micros": 59049, "output_level": 6, "num_output_files": 1, "total_output_size": 15073961, "num_input_records": 8403, "num_output_records": 7871, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:51:11 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:11.476 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000119.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258271478053, "job": 74, "event": "table_file_deletion", "file_number": 119}
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000117.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258271482535, "job": 74, "event": "table_file_deletion", "file_number": 117}
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.382206) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.482862) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.482876) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.482882) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.482885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:51:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:11.482889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:51:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:11.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:12.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:13 compute-1 nova_compute[238822]: 2025-09-30 18:51:13.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:51:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:13 compute-1 ceph-mon[75484]: pgmap v2187: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:51:13 compute-1 sshd-session[305601]: Invalid user shin from 161.132.50.17 port 43922
Sep 30 18:51:13 compute-1 sshd-session[305601]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:51:13 compute-1 sshd-session[305601]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 18:51:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:13.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:14.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:51:14 compute-1 nova_compute[238822]: 2025-09-30 18:51:14.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:15 compute-1 sshd-session[305601]: Failed password for invalid user shin from 161.132.50.17 port 43922 ssh2
Sep 30 18:51:15 compute-1 ceph-mon[75484]: pgmap v2188: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:51:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:15.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:15 compute-1 nova_compute[238822]: 2025-09-30 18:51:15.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:15 compute-1 sshd-session[305601]: Received disconnect from 161.132.50.17 port 43922:11: Bye Bye [preauth]
Sep 30 18:51:15 compute-1 sshd-session[305601]: Disconnected from invalid user shin 161.132.50.17 port 43922 [preauth]
Sep 30 18:51:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:16.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:17 compute-1 ceph-mon[75484]: pgmap v2189: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:51:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:17.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:18 compute-1 nova_compute[238822]: 2025-09-30 18:51:18.061 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:51:18 compute-1 sshd-session[305607]: Invalid user nurul from 8.243.64.201 port 47582
Sep 30 18:51:18 compute-1 sshd-session[305607]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:51:18 compute-1 sshd-session[305607]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:51:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:18.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e144 e144: 2 total, 2 up, 2 in
Sep 30 18:51:18 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:18.478 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:51:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:51:19 compute-1 openstack_network_exporter[251957]: ERROR   18:51:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:51:19 compute-1 openstack_network_exporter[251957]: ERROR   18:51:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:51:19 compute-1 openstack_network_exporter[251957]: ERROR   18:51:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:51:19 compute-1 openstack_network_exporter[251957]: ERROR   18:51:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:51:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:51:19 compute-1 openstack_network_exporter[251957]: ERROR   18:51:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:51:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:51:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e145 e145: 2 total, 2 up, 2 in
Sep 30 18:51:19 compute-1 ceph-mon[75484]: pgmap v2190: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:51:19 compute-1 ceph-mon[75484]: osdmap e144: 2 total, 2 up, 2 in
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #121. Immutable memtables: 0.
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.501435) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 75] Flushing memtable with next log file: 121
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258279501520, "job": 75, "event": "flush_started", "num_memtables": 1, "num_entries": 345, "num_deletes": 251, "total_data_size": 268928, "memory_usage": 276728, "flush_reason": "Manual Compaction"}
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 75] Level-0 flush table #122: started
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258279506794, "cf_name": "default", "job": 75, "event": "table_file_creation", "file_number": 122, "file_size": 176245, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 59588, "largest_seqno": 59928, "table_properties": {"data_size": 174092, "index_size": 317, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5417, "raw_average_key_size": 18, "raw_value_size": 169834, "raw_average_value_size": 583, "num_data_blocks": 14, "num_entries": 291, "num_filter_entries": 291, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759258272, "oldest_key_time": 1759258272, "file_creation_time": 1759258279, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 122, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 75] Flush lasted 5403 microseconds, and 2438 cpu microseconds.
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.506851) [db/flush_job.cc:967] [default] [JOB 75] Level-0 flush table #122: 176245 bytes OK
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.506876) [db/memtable_list.cc:519] [default] Level-0 commit table #122 started
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.514172) [db/memtable_list.cc:722] [default] Level-0 commit table #122: memtable #1 done
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.514198) EVENT_LOG_v1 {"time_micros": 1759258279514191, "job": 75, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.514229) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 75] Try to delete WAL files size 266562, prev total WAL file size 266562, number of live WAL files 2.
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000118.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.514928) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035303230' seq:72057594037927935, type:22 .. '7061786F730035323732' seq:0, type:0; will stop at (end)
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 76] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 75 Base level 0, inputs: [122(172KB)], [120(14MB)]
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258279514982, "job": 76, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [122], "files_L6": [120], "score": -1, "input_data_size": 15250206, "oldest_snapshot_seqno": -1}
Sep 30 18:51:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:19.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 76] Generated table #123: 7648 keys, 13225705 bytes, temperature: kUnknown
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258279609083, "cf_name": "default", "job": 76, "event": "table_file_creation", "file_number": 123, "file_size": 13225705, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13177833, "index_size": 27664, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19141, "raw_key_size": 203264, "raw_average_key_size": 26, "raw_value_size": 13044090, "raw_average_value_size": 1705, "num_data_blocks": 1074, "num_entries": 7648, "num_filter_entries": 7648, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759258279, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 123, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.609513) [db/compaction/compaction_job.cc:1663] [default] [JOB 76] Compacted 1@0 + 1@6 files to L6 => 13225705 bytes
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.611053) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 161.8 rd, 140.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 14.4 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(161.6) write-amplify(75.0) OK, records in: 8162, records dropped: 514 output_compression: NoCompression
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.611095) EVENT_LOG_v1 {"time_micros": 1759258279611074, "job": 76, "event": "compaction_finished", "compaction_time_micros": 94228, "compaction_time_cpu_micros": 53673, "output_level": 6, "num_output_files": 1, "total_output_size": 13225705, "num_input_records": 8162, "num_output_records": 7648, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000122.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258279611390, "job": 76, "event": "table_file_deletion", "file_number": 122}
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000120.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258279616945, "job": 76, "event": "table_file_deletion", "file_number": 120}
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.514807) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.617044) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.617055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.617059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.617061) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:51:19 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:51:19.617064) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:51:19 compute-1 sshd-session[305607]: Failed password for invalid user nurul from 8.243.64.201 port 47582 ssh2
Sep 30 18:51:19 compute-1 nova_compute[238822]: 2025-09-30 18:51:19.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:20.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:20 compute-1 sshd-session[305607]: Received disconnect from 8.243.64.201 port 47582:11: Bye Bye [preauth]
Sep 30 18:51:20 compute-1 sshd-session[305607]: Disconnected from invalid user nurul 8.243.64.201 port 47582 [preauth]
Sep 30 18:51:20 compute-1 ceph-mon[75484]: osdmap e145: 2 total, 2 up, 2 in
Sep 30 18:51:20 compute-1 nova_compute[238822]: 2025-09-30 18:51:20.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:21 compute-1 ceph-mon[75484]: pgmap v2193: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:51:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:21.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:21 compute-1 podman[305616]: 2025-09-30 18:51:21.569080335 +0000 UTC m=+0.088809315 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:51:21 compute-1 podman[305615]: 2025-09-30 18:51:21.664802454 +0000 UTC m=+0.192886808 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_controller)
Sep 30 18:51:22 compute-1 unix_chkpwd[305663]: password check failed for user (root)
Sep 30 18:51:22 compute-1 sshd-session[305612]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245  user=root
Sep 30 18:51:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:51:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:22.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:51:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:22 compute-1 nova_compute[238822]: 2025-09-30 18:51:22.215 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:51:22 compute-1 nova_compute[238822]: 2025-09-30 18:51:22.215 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:51:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:51:22 compute-1 nova_compute[238822]: 2025-09-30 18:51:22.723 2 DEBUG nova.compute.manager [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:51:23 compute-1 sudo[305666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:51:23 compute-1 sudo[305666]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:51:23 compute-1 sudo[305666]: pam_unix(sudo:session): session closed for user root
Sep 30 18:51:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:23 compute-1 nova_compute[238822]: 2025-09-30 18:51:23.284 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:51:23 compute-1 nova_compute[238822]: 2025-09-30 18:51:23.285 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:51:23 compute-1 nova_compute[238822]: 2025-09-30 18:51:23.297 2 DEBUG nova.virt.hardware [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:51:23 compute-1 nova_compute[238822]: 2025-09-30 18:51:23.297 2 INFO nova.compute.claims [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:51:23 compute-1 ceph-mon[75484]: pgmap v2194: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 383 B/s rd, 0 op/s
Sep 30 18:51:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:23.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:24 compute-1 sshd-session[305612]: Failed password for root from 49.49.32.245 port 41918 ssh2
Sep 30 18:51:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:24.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:51:24 compute-1 nova_compute[238822]: 2025-09-30 18:51:24.349 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:51:24 compute-1 podman[305693]: 2025-09-30 18:51:24.575327612 +0000 UTC m=+0.113028124 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 18:51:24 compute-1 nova_compute[238822]: 2025-09-30 18:51:24.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:51:24 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1106233728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:51:24 compute-1 nova_compute[238822]: 2025-09-30 18:51:24.841 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:51:24 compute-1 nova_compute[238822]: 2025-09-30 18:51:24.850 2 DEBUG nova.compute.provider_tree [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:51:25 compute-1 sshd-session[305612]: Received disconnect from 49.49.32.245 port 41918:11: Bye Bye [preauth]
Sep 30 18:51:25 compute-1 sshd-session[305612]: Disconnected from authenticating user root 49.49.32.245 port 41918 [preauth]
Sep 30 18:51:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:25 compute-1 nova_compute[238822]: 2025-09-30 18:51:25.363 2 DEBUG nova.scheduler.client.report [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:51:25 compute-1 ceph-mon[75484]: pgmap v2195: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 1.1 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 18:51:25 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1106233728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:51:25 compute-1 nova_compute[238822]: 2025-09-30 18:51:25.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:25.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:25 compute-1 nova_compute[238822]: 2025-09-30 18:51:25.878 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.592s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:51:25 compute-1 nova_compute[238822]: 2025-09-30 18:51:25.879 2 DEBUG nova.compute.manager [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:51:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:51:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:26.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:51:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:26 compute-1 ovn_controller[135204]: 2025-09-30T18:51:26Z|00315|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Sep 30 18:51:26 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 e146: 2 total, 2 up, 2 in
Sep 30 18:51:26 compute-1 nova_compute[238822]: 2025-09-30 18:51:26.395 2 DEBUG nova.compute.manager [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:51:26 compute-1 nova_compute[238822]: 2025-09-30 18:51:26.396 2 DEBUG nova.network.neutron [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:51:26 compute-1 nova_compute[238822]: 2025-09-30 18:51:26.396 2 WARNING neutronclient.v2_0.client [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:51:26 compute-1 nova_compute[238822]: 2025-09-30 18:51:26.397 2 WARNING neutronclient.v2_0.client [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:51:26 compute-1 nova_compute[238822]: 2025-09-30 18:51:26.908 2 INFO nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:51:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:27 compute-1 ceph-mon[75484]: pgmap v2196: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 1.1 KiB/s rd, 767 B/s wr, 2 op/s
Sep 30 18:51:27 compute-1 ceph-mon[75484]: osdmap e146: 2 total, 2 up, 2 in
Sep 30 18:51:27 compute-1 nova_compute[238822]: 2025-09-30 18:51:27.419 2 DEBUG nova.compute.manager [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:51:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:27.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:27 compute-1 nova_compute[238822]: 2025-09-30 18:51:27.738 2 DEBUG nova.network.neutron [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Successfully created port: befcf2ec-342b-4ac9-8ac7-d935271b42ac _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:51:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:28.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.439 2 DEBUG nova.compute.manager [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.441 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.442 2 INFO nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Creating image(s)
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.482 2 DEBUG nova.storage.rbd_utils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.523 2 DEBUG nova.storage.rbd_utils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.563 2 DEBUG nova.storage.rbd_utils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.569 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.587 2 DEBUG nova.network.neutron [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Successfully updated port: befcf2ec-342b-4ac9-8ac7-d935271b42ac _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.596 2 DEBUG nova.compute.manager [req-c0372bea-4272-42d3-84e2-20130e107316 req-4861e970-48fa-4014-8825-2d76fb18dac3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Received event network-changed-befcf2ec-342b-4ac9-8ac7-d935271b42ac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.597 2 DEBUG nova.compute.manager [req-c0372bea-4272-42d3-84e2-20130e107316 req-4861e970-48fa-4014-8825-2d76fb18dac3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Refreshing instance network info cache due to event network-changed-befcf2ec-342b-4ac9-8ac7-d935271b42ac. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.597 2 DEBUG oslo_concurrency.lockutils [req-c0372bea-4272-42d3-84e2-20130e107316 req-4861e970-48fa-4014-8825-2d76fb18dac3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-ecdb485e-5297-4bbc-bed9-9f019a69b1e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.597 2 DEBUG oslo_concurrency.lockutils [req-c0372bea-4272-42d3-84e2-20130e107316 req-4861e970-48fa-4014-8825-2d76fb18dac3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-ecdb485e-5297-4bbc-bed9-9f019a69b1e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.598 2 DEBUG nova.network.neutron [req-c0372bea-4272-42d3-84e2-20130e107316 req-4861e970-48fa-4014-8825-2d76fb18dac3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Refreshing network info cache for port befcf2ec-342b-4ac9-8ac7-d935271b42ac _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.662 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.662 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.663 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.663 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.695 2 DEBUG nova.storage.rbd_utils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:51:28 compute-1 nova_compute[238822]: 2025-09-30 18:51:28.700 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:51:29 compute-1 nova_compute[238822]: 2025-09-30 18:51:29.027 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.326s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:51:29 compute-1 nova_compute[238822]: 2025-09-30 18:51:29.133 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "refresh_cache-ecdb485e-5297-4bbc-bed9-9f019a69b1e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:51:29 compute-1 nova_compute[238822]: 2025-09-30 18:51:29.134 2 WARNING neutronclient.v2_0.client [req-c0372bea-4272-42d3-84e2-20130e107316 req-4861e970-48fa-4014-8825-2d76fb18dac3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:51:29 compute-1 nova_compute[238822]: 2025-09-30 18:51:29.148 2 DEBUG nova.storage.rbd_utils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] resizing rbd image ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:51:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:51:29 compute-1 nova_compute[238822]: 2025-09-30 18:51:29.307 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:51:29 compute-1 nova_compute[238822]: 2025-09-30 18:51:29.308 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Ensure instance console log exists: /var/lib/nova/instances/ecdb485e-5297-4bbc-bed9-9f019a69b1e0/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:51:29 compute-1 nova_compute[238822]: 2025-09-30 18:51:29.309 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:51:29 compute-1 nova_compute[238822]: 2025-09-30 18:51:29.309 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:51:29 compute-1 nova_compute[238822]: 2025-09-30 18:51:29.310 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:51:29 compute-1 ceph-mon[75484]: pgmap v2198: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 724 B/s rd, 724 B/s wr, 1 op/s
Sep 30 18:51:29 compute-1 nova_compute[238822]: 2025-09-30 18:51:29.500 2 DEBUG nova.network.neutron [req-c0372bea-4272-42d3-84e2-20130e107316 req-4861e970-48fa-4014-8825-2d76fb18dac3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:51:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:29.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:29 compute-1 nova_compute[238822]: 2025-09-30 18:51:29.670 2 DEBUG nova.network.neutron [req-c0372bea-4272-42d3-84e2-20130e107316 req-4861e970-48fa-4014-8825-2d76fb18dac3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:51:29 compute-1 nova_compute[238822]: 2025-09-30 18:51:29.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:30 compute-1 nova_compute[238822]: 2025-09-30 18:51:30.179 2 DEBUG oslo_concurrency.lockutils [req-c0372bea-4272-42d3-84e2-20130e107316 req-4861e970-48fa-4014-8825-2d76fb18dac3 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-ecdb485e-5297-4bbc-bed9-9f019a69b1e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:51:30 compute-1 nova_compute[238822]: 2025-09-30 18:51:30.180 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquired lock "refresh_cache-ecdb485e-5297-4bbc-bed9-9f019a69b1e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:51:30 compute-1 nova_compute[238822]: 2025-09-30 18:51:30.180 2 DEBUG nova.network.neutron [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:51:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:30.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:30 compute-1 podman[305907]: 2025-09-30 18:51:30.57405687 +0000 UTC m=+0.105727859 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:51:30 compute-1 podman[305909]: 2025-09-30 18:51:30.590680966 +0000 UTC m=+0.111069442 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 18:51:30 compute-1 podman[305908]: 2025-09-30 18:51:30.602521174 +0000 UTC m=+0.131541572 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git)
Sep 30 18:51:30 compute-1 nova_compute[238822]: 2025-09-30 18:51:30.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:31 compute-1 nova_compute[238822]: 2025-09-30 18:51:31.132 2 DEBUG nova.network.neutron [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:51:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:31 compute-1 nova_compute[238822]: 2025-09-30 18:51:31.348 2 WARNING neutronclient.v2_0.client [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:51:31 compute-1 ceph-mon[75484]: pgmap v2199: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 921 B/s rd, 614 B/s wr, 1 op/s
Sep 30 18:51:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:31.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:31 compute-1 nova_compute[238822]: 2025-09-30 18:51:31.613 2 DEBUG nova.network.neutron [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Updating instance_info_cache with network_info: [{"id": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "address": "fa:16:3e:38:1e:87", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbefcf2ec-34", "ovs_interfaceid": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.120 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Releasing lock "refresh_cache-ecdb485e-5297-4bbc-bed9-9f019a69b1e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.121 2 DEBUG nova.compute.manager [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Instance network_info: |[{"id": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "address": "fa:16:3e:38:1e:87", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbefcf2ec-34", "ovs_interfaceid": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.125 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Start _get_guest_xml network_info=[{"id": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "address": "fa:16:3e:38:1e:87", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbefcf2ec-34", "ovs_interfaceid": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.131 2 WARNING nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.133 2 DEBUG nova.virt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1465799852', uuid='ecdb485e-5297-4bbc-bed9-9f019a69b1e0'), owner=OwnerMeta(userid='e80b7fccb5a34c13b356857340eff1ee', username='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540-project-admin', projectid='127ca83529de45efa0a76aa8ceefcd3d', projectname='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "address": "fa:16:3e:38:1e:87", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbefcf2ec-34", "ovs_interfaceid": 
"befcf2ec-342b-4ac9-8ac7-d935271b42ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759258292.133392) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.141 2 DEBUG nova.virt.libvirt.host [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.142 2 DEBUG nova.virt.libvirt.host [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.146 2 DEBUG nova.virt.libvirt.host [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.147 2 DEBUG nova.virt.libvirt.host [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.147 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.148 2 DEBUG nova.virt.hardware [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.149 2 DEBUG nova.virt.hardware [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.149 2 DEBUG nova.virt.hardware [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.150 2 DEBUG nova.virt.hardware [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.150 2 DEBUG nova.virt.hardware [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.150 2 DEBUG nova.virt.hardware [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.151 2 DEBUG nova.virt.hardware [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.151 2 DEBUG nova.virt.hardware [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.152 2 DEBUG nova.virt.hardware [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.152 2 DEBUG nova.virt.hardware [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.153 2 DEBUG nova.virt.hardware [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.157 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:51:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:32.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:32 compute-1 sshd-session[305906]: Invalid user abc from 192.210.160.141 port 55500
Sep 30 18:51:32 compute-1 sshd-session[305906]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:51:32 compute-1 sshd-session[305906]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:51:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:51:32 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/374989348' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.657 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.687 2 DEBUG nova.storage.rbd_utils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:51:32 compute-1 nova_compute[238822]: 2025-09-30 18:51:32.693 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:51:33 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:51:33 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2330939330' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:51:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.234 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.541s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.236 2 DEBUG nova.virt.libvirt.vif [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:51:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1465799852',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-146579985',id=40,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='127ca83529de45efa0a76aa8ceefcd3d',ramdisk_id='',reservation_id='r-gfwma97m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540',owner_user_name='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:51:27Z,user_data=None,user_id='e80b7fccb5a34c13b356857340eff1ee',uuid=ecdb485e-5297-4bbc-bed9-9f019a69b1e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "address": "fa:16:3e:38:1e:87", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbefcf2ec-34", "ovs_interfaceid": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.237 2 DEBUG nova.network.os_vif_util [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Converting VIF {"id": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "address": "fa:16:3e:38:1e:87", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbefcf2ec-34", "ovs_interfaceid": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.238 2 DEBUG nova.network.os_vif_util [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:1e:87,bridge_name='br-int',has_traffic_filtering=True,id=befcf2ec-342b-4ac9-8ac7-d935271b42ac,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbefcf2ec-34') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.239 2 DEBUG nova.objects.instance [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lazy-loading 'pci_devices' on Instance uuid ecdb485e-5297-4bbc-bed9-9f019a69b1e0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:51:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:33 compute-1 ceph-mon[75484]: pgmap v2200: 353 pgs: 353 active+clean; 41 MiB data, 396 MiB used, 40 GiB / 40 GiB avail; 921 B/s rd, 614 B/s wr, 1 op/s
Sep 30 18:51:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/374989348' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:51:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2330939330' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:51:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:33.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.748 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:51:33 compute-1 nova_compute[238822]:   <uuid>ecdb485e-5297-4bbc-bed9-9f019a69b1e0</uuid>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   <name>instance-00000028</name>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteZoneMigrationStrategyVolume-server-1465799852</nova:name>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:51:32</nova:creationTime>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:51:33 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:51:33 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:user uuid="e80b7fccb5a34c13b356857340eff1ee">tempest-TestExecuteZoneMigrationStrategyVolume-1619382540-project-admin</nova:user>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:project uuid="127ca83529de45efa0a76aa8ceefcd3d">tempest-TestExecuteZoneMigrationStrategyVolume-1619382540</nova:project>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <nova:port uuid="befcf2ec-342b-4ac9-8ac7-d935271b42ac">
Sep 30 18:51:33 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <system>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <entry name="serial">ecdb485e-5297-4bbc-bed9-9f019a69b1e0</entry>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <entry name="uuid">ecdb485e-5297-4bbc-bed9-9f019a69b1e0</entry>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     </system>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   <os>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   </os>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   <features>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   </features>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk">
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       </source>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk.config">
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       </source>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:51:33 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:38:1e:87"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <target dev="tapbefcf2ec-34"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/ecdb485e-5297-4bbc-bed9-9f019a69b1e0/console.log" append="off"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <video>
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     </video>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:51:33 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:51:33 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:51:33 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:51:33 compute-1 nova_compute[238822]: </domain>
Sep 30 18:51:33 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.749 2 DEBUG nova.compute.manager [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Preparing to wait for external event network-vif-plugged-befcf2ec-342b-4ac9-8ac7-d935271b42ac prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.749 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.750 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.750 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.752 2 DEBUG nova.virt.libvirt.vif [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:51:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1465799852',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-146579985',id=40,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='127ca83529de45efa0a76aa8ceefcd3d',ramdisk_id='',reservation_id='r-gfwma97m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540',owner_user_name='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:51:27Z,user_data=None,user_id='e80b7fccb5a34c13b356857340eff1ee',uuid=ecdb485e-5297-4bbc-bed9-9f019a69b1e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "address": "fa:16:3e:38:1e:87", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbefcf2ec-34", "ovs_interfaceid": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.752 2 DEBUG nova.network.os_vif_util [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Converting VIF {"id": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "address": "fa:16:3e:38:1e:87", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbefcf2ec-34", "ovs_interfaceid": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.753 2 DEBUG nova.network.os_vif_util [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:1e:87,bridge_name='br-int',has_traffic_filtering=True,id=befcf2ec-342b-4ac9-8ac7-d935271b42ac,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbefcf2ec-34') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.753 2 DEBUG os_vif [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:1e:87,bridge_name='br-int',has_traffic_filtering=True,id=befcf2ec-342b-4ac9-8ac7-d935271b42ac,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbefcf2ec-34') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.755 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.755 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.757 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9a9ced06-7534-5094-9962-c57c680065b1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.767 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbefcf2ec-34, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.768 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapbefcf2ec-34, col_values=(('qos', UUID('34f0a051-ee89-4790-9397-45b4272b869d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.768 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapbefcf2ec-34, col_values=(('external_ids', {'iface-id': 'befcf2ec-342b-4ac9-8ac7-d935271b42ac', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:1e:87', 'vm-uuid': 'ecdb485e-5297-4bbc-bed9-9f019a69b1e0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:51:33 compute-1 NetworkManager[45549]: <info>  [1759258293.7716] manager: (tapbefcf2ec-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:33 compute-1 nova_compute[238822]: 2025-09-30 18:51:33.783 2 INFO os_vif [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:1e:87,bridge_name='br-int',has_traffic_filtering=True,id=befcf2ec-342b-4ac9-8ac7-d935271b42ac,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbefcf2ec-34')
Sep 30 18:51:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:34.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:51:34 compute-1 sshd-session[305906]: Failed password for invalid user abc from 192.210.160.141 port 55500 ssh2
Sep 30 18:51:34 compute-1 sshd-session[305906]: Connection closed by invalid user abc 192.210.160.141 port 55500 [preauth]
Sep 30 18:51:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:35 compute-1 nova_compute[238822]: 2025-09-30 18:51:35.333 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:51:35 compute-1 nova_compute[238822]: 2025-09-30 18:51:35.334 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:51:35 compute-1 nova_compute[238822]: 2025-09-30 18:51:35.334 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No VIF found with MAC fa:16:3e:38:1e:87, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:51:35 compute-1 nova_compute[238822]: 2025-09-30 18:51:35.335 2 INFO nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Using config drive
Sep 30 18:51:35 compute-1 nova_compute[238822]: 2025-09-30 18:51:35.376 2 DEBUG nova.storage.rbd_utils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:51:35 compute-1 ceph-mon[75484]: pgmap v2201: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 2.1 MiB/s wr, 33 op/s
Sep 30 18:51:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.003000081s ======
Sep 30 18:51:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:35.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000081s
Sep 30 18:51:35 compute-1 podman[249638]: time="2025-09-30T18:51:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:51:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:51:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:51:35 compute-1 nova_compute[238822]: 2025-09-30 18:51:35.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:51:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8364 "" "Go-http-client/1.1"
Sep 30 18:51:35 compute-1 nova_compute[238822]: 2025-09-30 18:51:35.897 2 WARNING neutronclient.v2_0.client [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:51:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:36.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:36 compute-1 nova_compute[238822]: 2025-09-30 18:51:36.256 2 INFO nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Creating config drive at /var/lib/nova/instances/ecdb485e-5297-4bbc-bed9-9f019a69b1e0/disk.config
Sep 30 18:51:36 compute-1 nova_compute[238822]: 2025-09-30 18:51:36.267 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ecdb485e-5297-4bbc-bed9-9f019a69b1e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmplf8yko78 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:51:36 compute-1 nova_compute[238822]: 2025-09-30 18:51:36.424 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ecdb485e-5297-4bbc-bed9-9f019a69b1e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmplf8yko78" returned: 0 in 0.157s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:51:36 compute-1 nova_compute[238822]: 2025-09-30 18:51:36.470 2 DEBUG nova.storage.rbd_utils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:51:36 compute-1 nova_compute[238822]: 2025-09-30 18:51:36.477 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ecdb485e-5297-4bbc-bed9-9f019a69b1e0/disk.config ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:51:36 compute-1 nova_compute[238822]: 2025-09-30 18:51:36.689 2 DEBUG oslo_concurrency.processutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ecdb485e-5297-4bbc-bed9-9f019a69b1e0/disk.config ecdb485e-5297-4bbc-bed9-9f019a69b1e0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.212s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:51:36 compute-1 nova_compute[238822]: 2025-09-30 18:51:36.691 2 INFO nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Deleting local config drive /var/lib/nova/instances/ecdb485e-5297-4bbc-bed9-9f019a69b1e0/disk.config because it was imported into RBD.
Sep 30 18:51:36 compute-1 kernel: tapbefcf2ec-34: entered promiscuous mode
Sep 30 18:51:36 compute-1 NetworkManager[45549]: <info>  [1759258296.7953] manager: (tapbefcf2ec-34): new Tun device (/org/freedesktop/NetworkManager/Devices/116)
Sep 30 18:51:36 compute-1 nova_compute[238822]: 2025-09-30 18:51:36.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:36 compute-1 ovn_controller[135204]: 2025-09-30T18:51:36Z|00316|binding|INFO|Claiming lport befcf2ec-342b-4ac9-8ac7-d935271b42ac for this chassis.
Sep 30 18:51:36 compute-1 ovn_controller[135204]: 2025-09-30T18:51:36Z|00317|binding|INFO|befcf2ec-342b-4ac9-8ac7-d935271b42ac: Claiming fa:16:3e:38:1e:87 10.100.0.9
Sep 30 18:51:36 compute-1 nova_compute[238822]: 2025-09-30 18:51:36.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:36 compute-1 systemd-udevd[306102]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:51:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:36.865 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:1e:87 10.100.0.9'], port_security=['fa:16:3e:38:1e:87 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ecdb485e-5297-4bbc-bed9-9f019a69b1e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '127ca83529de45efa0a76aa8ceefcd3d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '126a2d65-c072-4128-836f-db6080f798dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2a3bc33b-b1e3-4a2f-8784-2e8238744730, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=befcf2ec-342b-4ac9-8ac7-d935271b42ac) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:51:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:36.867 144543 INFO neutron.agent.ovn.metadata.agent [-] Port befcf2ec-342b-4ac9-8ac7-d935271b42ac in datapath cee64377-b6b9-46f2-8d77-c7978d4cc7a0 bound to our chassis
Sep 30 18:51:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:36.869 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cee64377-b6b9-46f2-8d77-c7978d4cc7a0
Sep 30 18:51:36 compute-1 NetworkManager[45549]: <info>  [1759258296.8793] device (tapbefcf2ec-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:51:36 compute-1 NetworkManager[45549]: <info>  [1759258296.8808] device (tapbefcf2ec-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:51:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:36.891 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[fa185914-fba4-4130-932e-f893ef1927b1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:36.892 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcee64377-b1 in ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:51:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:36.894 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcee64377-b0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:51:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:36.894 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[1cb9c2fe-236c-4a11-91a7-e1cd87004d5a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:36.896 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d93ab8-f01d-437f-a2d9-b13fb4d3bad0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:36 compute-1 systemd-machined[195911]: New machine qemu-30-instance-00000028.
Sep 30 18:51:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:36.917 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[21816e62-108f-48b9-9f93-21f8f003213a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:36 compute-1 systemd[1]: Started Virtual Machine qemu-30-instance-00000028.
Sep 30 18:51:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:36.940 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0e50e9ab-9592-45e4-88cb-a878c6df2b45]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:36 compute-1 ovn_controller[135204]: 2025-09-30T18:51:36Z|00318|binding|INFO|Setting lport befcf2ec-342b-4ac9-8ac7-d935271b42ac ovn-installed in OVS
Sep 30 18:51:36 compute-1 ovn_controller[135204]: 2025-09-30T18:51:36Z|00319|binding|INFO|Setting lport befcf2ec-342b-4ac9-8ac7-d935271b42ac up in Southbound
Sep 30 18:51:36 compute-1 nova_compute[238822]: 2025-09-30 18:51:36.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:36.985 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[316f4df6-30cc-4f0d-8e68-08ba2699ce10]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:36 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:36.994 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[40d008f9-0a7f-4b56-8079-4e32f4903fd8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:36 compute-1 NetworkManager[45549]: <info>  [1759258296.9958] manager: (tapcee64377-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/117)
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.054 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[07f946a4-35a4-428d-9fc7-8a6b4cda715f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.062 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[c0046e79-5e2f-46ed-84ad-ffd8f020cfea]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:37 compute-1 NetworkManager[45549]: <info>  [1759258297.1038] device (tapcee64377-b0): carrier: link connected
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.116 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[73c12d9e-e15b-4022-80bb-9edad4f495df]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.149 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d6d4fbb4-7644-4c98-9d6f-eebe9b40b902]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcee64377-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:48:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1613868, 'reachable_time': 44923, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 306139, 'error': None, 'target': 'ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.172 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[53db75b1-c90d-4373-a364-d73f01e80ebc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:48c5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1613868, 'tstamp': 1613868}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 306140, 'error': None, 'target': 'ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.202 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a916ecd3-2a6a-41d1-a60c-dd14fc23efb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcee64377-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:48:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1613868, 'reachable_time': 44923, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 306141, 'error': None, 'target': 'ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.257 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[de965a59-47b0-413f-ac68-8d5004c848ee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.359 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[91266499-b8c2-4348-970a-cd0c947b64ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.361 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcee64377-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.362 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.363 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcee64377-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:51:37 compute-1 NetworkManager[45549]: <info>  [1759258297.3664] manager: (tapcee64377-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Sep 30 18:51:37 compute-1 kernel: tapcee64377-b0: entered promiscuous mode
Sep 30 18:51:37 compute-1 nova_compute[238822]: 2025-09-30 18:51:37.369 2 DEBUG nova.compute.manager [req-19048b3c-7a2a-489d-972b-9ba6846d68e5 req-af6d7177-eec8-4381-8516-967f1dfca741 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Received event network-vif-plugged-befcf2ec-342b-4ac9-8ac7-d935271b42ac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:51:37 compute-1 nova_compute[238822]: 2025-09-30 18:51:37.370 2 DEBUG oslo_concurrency.lockutils [req-19048b3c-7a2a-489d-972b-9ba6846d68e5 req-af6d7177-eec8-4381-8516-967f1dfca741 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:51:37 compute-1 nova_compute[238822]: 2025-09-30 18:51:37.371 2 DEBUG oslo_concurrency.lockutils [req-19048b3c-7a2a-489d-972b-9ba6846d68e5 req-af6d7177-eec8-4381-8516-967f1dfca741 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:51:37 compute-1 nova_compute[238822]: 2025-09-30 18:51:37.371 2 DEBUG oslo_concurrency.lockutils [req-19048b3c-7a2a-489d-972b-9ba6846d68e5 req-af6d7177-eec8-4381-8516-967f1dfca741 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:51:37 compute-1 nova_compute[238822]: 2025-09-30 18:51:37.372 2 DEBUG nova.compute.manager [req-19048b3c-7a2a-489d-972b-9ba6846d68e5 req-af6d7177-eec8-4381-8516-967f1dfca741 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Processing event network-vif-plugged-befcf2ec-342b-4ac9-8ac7-d935271b42ac _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:51:37 compute-1 nova_compute[238822]: 2025-09-30 18:51:37.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.377 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcee64377-b0, col_values=(('external_ids', {'iface-id': 'b27ac1cf-5e47-45c4-b2a7-c18dab16f3aa'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:51:37 compute-1 ovn_controller[135204]: 2025-09-30T18:51:37Z|00320|binding|INFO|Releasing lport b27ac1cf-5e47-45c4-b2a7-c18dab16f3aa from this chassis (sb_readonly=0)
Sep 30 18:51:37 compute-1 nova_compute[238822]: 2025-09-30 18:51:37.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:37 compute-1 nova_compute[238822]: 2025-09-30 18:51:37.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.405 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2b35b862-896b-4291-92aa-e878c12a601e]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.406 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.407 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.407 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for cee64377-b6b9-46f2-8d77-c7978d4cc7a0 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.407 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.408 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0ca47c90-feb0-457f-8258-6d16ee35fff9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.409 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.410 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f6362bbf-1fb7-4b50-9a39-5bf50fb75ae9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.411 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-cee64377-b6b9-46f2-8d77-c7978d4cc7a0
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID cee64377-b6b9-46f2-8d77-c7978d4cc7a0
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:51:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:37.412 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'env', 'PROCESS_TAG=haproxy-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:51:37 compute-1 ceph-mon[75484]: pgmap v2202: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 2.1 MiB/s wr, 33 op/s
Sep 30 18:51:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3397859791' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:51:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3397859791' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:51:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:51:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:37.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:37 compute-1 podman[306214]: 2025-09-30 18:51:37.897388668 +0000 UTC m=+0.085979849 container create 9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 18:51:37 compute-1 systemd[1]: Started libpod-conmon-9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb.scope.
Sep 30 18:51:37 compute-1 podman[306214]: 2025-09-30 18:51:37.857458096 +0000 UTC m=+0.046049337 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:51:37 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:51:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe69756a8a39791c565ebc9a2b538789801a1ed231491420fbe4cf9e60b332bd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:51:37 compute-1 podman[306214]: 2025-09-30 18:51:37.999001225 +0000 UTC m=+0.187592496 container init 9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:51:38 compute-1 podman[306214]: 2025-09-30 18:51:38.012011014 +0000 UTC m=+0.200602225 container start 9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 18:51:38 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[306230]: [NOTICE]   (306234) : New worker (306236) forked
Sep 30 18:51:38 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[306230]: [NOTICE]   (306234) : Loading success.
Sep 30 18:51:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:38.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:38 compute-1 nova_compute[238822]: 2025-09-30 18:51:38.331 2 DEBUG nova.compute.manager [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:51:38 compute-1 nova_compute[238822]: 2025-09-30 18:51:38.336 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:51:38 compute-1 nova_compute[238822]: 2025-09-30 18:51:38.340 2 INFO nova.virt.libvirt.driver [-] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Instance spawned successfully.
Sep 30 18:51:38 compute-1 nova_compute[238822]: 2025-09-30 18:51:38.341 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:51:38 compute-1 nova_compute[238822]: 2025-09-30 18:51:38.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:38 compute-1 nova_compute[238822]: 2025-09-30 18:51:38.860 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:51:38 compute-1 nova_compute[238822]: 2025-09-30 18:51:38.861 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:51:38 compute-1 nova_compute[238822]: 2025-09-30 18:51:38.862 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:51:38 compute-1 nova_compute[238822]: 2025-09-30 18:51:38.862 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:51:38 compute-1 nova_compute[238822]: 2025-09-30 18:51:38.863 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:51:38 compute-1 nova_compute[238822]: 2025-09-30 18:51:38.864 2 DEBUG nova.virt.libvirt.driver [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:51:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:51:39 compute-1 nova_compute[238822]: 2025-09-30 18:51:39.376 2 INFO nova.compute.manager [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Took 10.94 seconds to spawn the instance on the hypervisor.
Sep 30 18:51:39 compute-1 nova_compute[238822]: 2025-09-30 18:51:39.377 2 DEBUG nova.compute.manager [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:51:39 compute-1 nova_compute[238822]: 2025-09-30 18:51:39.459 2 DEBUG nova.compute.manager [req-cb0f95dc-f683-4d0b-9c7e-e9bc1172e9f7 req-9ebec7a0-4a35-49a6-8176-4e8d6a85dad7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Received event network-vif-plugged-befcf2ec-342b-4ac9-8ac7-d935271b42ac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:51:39 compute-1 nova_compute[238822]: 2025-09-30 18:51:39.460 2 DEBUG oslo_concurrency.lockutils [req-cb0f95dc-f683-4d0b-9c7e-e9bc1172e9f7 req-9ebec7a0-4a35-49a6-8176-4e8d6a85dad7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:51:39 compute-1 nova_compute[238822]: 2025-09-30 18:51:39.460 2 DEBUG oslo_concurrency.lockutils [req-cb0f95dc-f683-4d0b-9c7e-e9bc1172e9f7 req-9ebec7a0-4a35-49a6-8176-4e8d6a85dad7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:51:39 compute-1 nova_compute[238822]: 2025-09-30 18:51:39.460 2 DEBUG oslo_concurrency.lockutils [req-cb0f95dc-f683-4d0b-9c7e-e9bc1172e9f7 req-9ebec7a0-4a35-49a6-8176-4e8d6a85dad7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:51:39 compute-1 nova_compute[238822]: 2025-09-30 18:51:39.461 2 DEBUG nova.compute.manager [req-cb0f95dc-f683-4d0b-9c7e-e9bc1172e9f7 req-9ebec7a0-4a35-49a6-8176-4e8d6a85dad7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] No waiting events found dispatching network-vif-plugged-befcf2ec-342b-4ac9-8ac7-d935271b42ac pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:51:39 compute-1 nova_compute[238822]: 2025-09-30 18:51:39.461 2 WARNING nova.compute.manager [req-cb0f95dc-f683-4d0b-9c7e-e9bc1172e9f7 req-9ebec7a0-4a35-49a6-8176-4e8d6a85dad7 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Received unexpected event network-vif-plugged-befcf2ec-342b-4ac9-8ac7-d935271b42ac for instance with vm_state building and task_state spawning.
Sep 30 18:51:39 compute-1 ceph-mon[75484]: pgmap v2203: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Sep 30 18:51:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:39.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:39 compute-1 nova_compute[238822]: 2025-09-30 18:51:39.926 2 INFO nova.compute.manager [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Took 16.69 seconds to build instance.
Sep 30 18:51:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:40.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:40 compute-1 nova_compute[238822]: 2025-09-30 18:51:40.435 2 DEBUG oslo_concurrency.lockutils [None req-572bd7b8-93a5-4499-9afb-a29f854a0c57 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.219s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:51:40 compute-1 nova_compute[238822]: 2025-09-30 18:51:40.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:41 compute-1 ceph-mon[75484]: pgmap v2204: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Sep 30 18:51:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:41.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:42.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:43 compute-1 sudo[306251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:51:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:43 compute-1 sudo[306251]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:51:43 compute-1 sudo[306251]: pam_unix(sudo:session): session closed for user root
Sep 30 18:51:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:43 compute-1 nova_compute[238822]: 2025-09-30 18:51:43.442 2 DEBUG oslo_concurrency.lockutils [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:51:43 compute-1 nova_compute[238822]: 2025-09-30 18:51:43.442 2 DEBUG oslo_concurrency.lockutils [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:51:43 compute-1 ceph-mon[75484]: pgmap v2205: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Sep 30 18:51:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:43.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:43 compute-1 nova_compute[238822]: 2025-09-30 18:51:43.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:43 compute-1 nova_compute[238822]: 2025-09-30 18:51:43.954 2 DEBUG nova.objects.instance [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lazy-loading 'flavor' on Instance uuid ecdb485e-5297-4bbc-bed9-9f019a69b1e0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:51:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:44.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:51:44 compute-1 nova_compute[238822]: 2025-09-30 18:51:44.974 2 DEBUG oslo_concurrency.lockutils [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 1.532s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:51:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:45 compute-1 ceph-mon[75484]: pgmap v2206: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Sep 30 18:51:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:45.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:45 compute-1 nova_compute[238822]: 2025-09-30 18:51:45.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:46 compute-1 nova_compute[238822]: 2025-09-30 18:51:46.152 2 DEBUG oslo_concurrency.lockutils [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:51:46 compute-1 nova_compute[238822]: 2025-09-30 18:51:46.153 2 DEBUG oslo_concurrency.lockutils [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:51:46 compute-1 nova_compute[238822]: 2025-09-30 18:51:46.153 2 INFO nova.compute.manager [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Attaching volume 48cc17b7-3f84-4dd1-aa6f-dc29cea17e81 to /dev/vdb
Sep 30 18:51:46 compute-1 nova_compute[238822]: 2025-09-30 18:51:46.154 2 DEBUG nova.objects.instance [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lazy-loading 'flavor' on Instance uuid ecdb485e-5297-4bbc-bed9-9f019a69b1e0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:51:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:46.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:46 compute-1 nova_compute[238822]: 2025-09-30 18:51:46.894 2 DEBUG os_brick.utils [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.12/site-packages/os_brick/utils.py:177
Sep 30 18:51:46 compute-1 nova_compute[238822]: 2025-09-30 18:51:46.897 2 INFO oslo.privsep.daemon [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmpedwd8ozx/privsep.sock']
Sep 30 18:51:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:47 compute-1 ceph-mon[75484]: pgmap v2207: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 75 op/s
Sep 30 18:51:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:47.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.701 2 INFO oslo.privsep.daemon [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Spawned new privsep daemon via rootwrap
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.523 8181 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.530 8181 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.532 8181 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_READ_SEARCH|CAP_SYS_ADMIN/CAP_DAC_READ_SEARCH|CAP_SYS_ADMIN/none
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.532 8181 INFO oslo.privsep.daemon [-] privsep daemon running as pid 8181
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.706 8181 DEBUG oslo.privsep.daemon [-] privsep: reply[d56f1ecb-fb79-4597-a760-b47491ee0e0f]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.779 8181 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.788 8181 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.788 8181 DEBUG oslo.privsep.daemon [-] privsep: reply[d56e1445-49f1-44b8-9102-8487db71f27c]: (4, ('InitiatorName=iqn.1994-05.com.redhat:d7bbbc2a579e', '')) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.791 8181 DEBUG oslo.privsep.daemon [-] privsep: Exception during request[263bc98b-8baa-42e6-80af-e0a0519dad46]: [Errno 2] No such file or directory: '/dev/scini' _process_cmd /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:492
Sep 30 18:51:47 compute-1 nova_compute[238822]: Traceback (most recent call last):
Sep 30 18:51:47 compute-1 nova_compute[238822]:   File "/usr/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 489, in _process_cmd
Sep 30 18:51:47 compute-1 nova_compute[238822]:     ret = func(*f_args, **f_kwargs)
Sep 30 18:51:47 compute-1 nova_compute[238822]:           ^^^^^^^^^^^^^^^^^^^^^^^^^
Sep 30 18:51:47 compute-1 nova_compute[238822]:   File "/usr/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 270, in _wrap
Sep 30 18:51:47 compute-1 nova_compute[238822]:     return func(*args, **kwargs)
Sep 30 18:51:47 compute-1 nova_compute[238822]:            ^^^^^^^^^^^^^^^^^^^^^
Sep 30 18:51:47 compute-1 nova_compute[238822]:   File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 57, in get_guid
Sep 30 18:51:47 compute-1 nova_compute[238822]:     with open_scini_device() as fd:
Sep 30 18:51:47 compute-1 nova_compute[238822]:          ^^^^^^^^^^^^^^^^^^^
Sep 30 18:51:47 compute-1 nova_compute[238822]:   File "/usr/lib64/python3.12/contextlib.py", line 137, in __enter__
Sep 30 18:51:47 compute-1 nova_compute[238822]:     return next(self.gen)
Sep 30 18:51:47 compute-1 nova_compute[238822]:            ^^^^^^^^^^^^^^
Sep 30 18:51:47 compute-1 nova_compute[238822]:   File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 40, in open_scini_device
Sep 30 18:51:47 compute-1 nova_compute[238822]:     fd = os.open(SCINI_DEVICE_PATH, os.O_RDWR)
Sep 30 18:51:47 compute-1 nova_compute[238822]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Sep 30 18:51:47 compute-1 nova_compute[238822]: FileNotFoundError: [Errno 2] No such file or directory: '/dev/scini'
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.793 8181 DEBUG oslo.privsep.daemon [-] privsep: reply[263bc98b-8baa-42e6-80af-e0a0519dad46]: (5, 'builtins.FileNotFoundError', (2, 'No such file or directory'), 'Traceback (most recent call last):\n  File "/usr/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 489, in _process_cmd\n    ret = func(*f_args, **f_kwargs)\n          ^^^^^^^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 270, in _wrap\n    return func(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 57, in get_guid\n    with open_scini_device() as fd:\n         ^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib64/python3.12/contextlib.py", line 137, in __enter__\n    return next(self.gen)\n           ^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 40, in open_scini_device\n    fd = os.open(SCINI_DEVICE_PATH, os.O_RDWR)\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\nFileNotFoundError: [Errno 2] No such file or directory: \'/dev/scini\'\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.794 2 ERROR os_brick.initiator.connectors.scaleio [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Error querying sdc guid: [Errno 2] No such file or directory: FileNotFoundError: [Errno 2] No such file or directory
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.795 2 INFO os_brick.initiator.connectors.scaleio [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Unable to find SDC guid: Error querying sdc guid: [Errno 2] No such file or directory
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.796 8181 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.809 8181 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.809 8181 DEBUG oslo.privsep.daemon [-] privsep: reply[b101ca9c-9255-42b8-afb7-b5233c3c6cc0]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.814 8181 DEBUG oslo.privsep.daemon [-] privsep: reply[25feab53-08b9-47be-9fee-1a64fe1cfcfa]: (4, '12ce99da-db91-4763-aecd-1e4b4dea5907') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.815 2 DEBUG oslo_concurrency.processutils [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.850 2 DEBUG oslo_concurrency.processutils [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "nvme version" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.855 2 DEBUG os_brick.initiator.connectors.lightos [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:132
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.858 2 INFO os_brick.initiator.connectors.lightos [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Current host hostNQN nqn.2014-08.org.nvmexpress:uuid:abc6dbd1-bb80-4444-a621-a0ff0df4b0b1 and IP(s) are ['38.102.83.102', 'fe80::f816:3eff:feac:ccc9', '192.168.122.101', 'fe80::f816:3eff:fe94:c4d1', '172.19.0.101', 'fe80::3422:55ff:fe9c:82fe', '172.20.0.101', 'fe80::207f:eeff:fe24:3a42', '172.17.0.101', 'fe80::802d:e1ff:fe9f:c0b8', '172.18.0.101', 'fe80::6826:b6ff:fe2d:8355', 'fe80::b4df:6dff:fea0:9ac', 'fe80::fc16:3eff:fe38:1e87', 'fe80::acfb:31ff:fe26:1320'] 
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.859 2 DEBUG os_brick.initiator.connectors.lightos [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:109
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.859 2 DEBUG os_brick.initiator.connectors.lightos [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:abc6dbd1-bb80-4444-a621-a0ff0df4b0b1 dsc:  get_connector_properties /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:112
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.860 2 DEBUG os_brick.utils [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] <== get_connector_properties: return (964ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'enforce_multipath': True, 'initiator': 'iqn.1994-05.com.redhat:d7bbbc2a579e', 'do_local_attach': False, 'nvme_hostid': 'abc6dbd1-bb80-4444-a621-a0ff0df4b0b1', 'system uuid': '12ce99da-db91-4763-aecd-1e4b4dea5907', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:abc6dbd1-bb80-4444-a621-a0ff0df4b0b1', 'nvme_native_multipath': True, 'found_dsc': '', 'host_ips': ['38.102.83.102', 'fe80::f816:3eff:feac:ccc9', '192.168.122.101', 'fe80::f816:3eff:fe94:c4d1', '172.19.0.101', 'fe80::3422:55ff:fe9c:82fe', '172.20.0.101', 'fe80::207f:eeff:fe24:3a42', '172.17.0.101', 'fe80::802d:e1ff:fe9f:c0b8', '172.18.0.101', 'fe80::6826:b6ff:fe2d:8355', 'fe80::b4df:6dff:fea0:9ac', 'fe80::fc16:3eff:fe38:1e87', 'fe80::acfb:31ff:fe26:1320']} trace_logging_wrapper /usr/lib/python3.12/site-packages/os_brick/utils.py:204
Sep 30 18:51:47 compute-1 nova_compute[238822]: 2025-09-30 18:51:47.861 2 DEBUG nova.virt.block_device [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Updating existing volume attachment record: e839173a-359c-48cd-8687-ebe1b215af1d _volume_attach /usr/lib/python3.12/site-packages/nova/virt/block_device.py:666
Sep 30 18:51:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:48.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:48 compute-1 nova_compute[238822]: 2025-09-30 18:51:48.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:49 compute-1 nova_compute[238822]: 2025-09-30 18:51:49.023 2 DEBUG oslo_concurrency.lockutils [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:51:49 compute-1 nova_compute[238822]: 2025-09-30 18:51:49.023 2 DEBUG oslo_concurrency.lockutils [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:51:49 compute-1 nova_compute[238822]: 2025-09-30 18:51:49.025 2 DEBUG oslo_concurrency.lockutils [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:51:49 compute-1 nova_compute[238822]: 2025-09-30 18:51:49.035 2 DEBUG nova.virt.libvirt.driver [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Attempting to attach volume 48cc17b7-3f84-4dd1-aa6f-dc29cea17e81 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2261
Sep 30 18:51:49 compute-1 nova_compute[238822]: 2025-09-30 18:51:49.038 2 DEBUG nova.virt.libvirt.guest [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] attach device xml: <disk type="network" device="disk">
Sep 30 18:51:49 compute-1 nova_compute[238822]:   <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Sep 30 18:51:49 compute-1 nova_compute[238822]:   <alias name="ua-48cc17b7-3f84-4dd1-aa6f-dc29cea17e81"/>
Sep 30 18:51:49 compute-1 nova_compute[238822]:   <source protocol="rbd" name="volumes/volume-48cc17b7-3f84-4dd1-aa6f-dc29cea17e81">
Sep 30 18:51:49 compute-1 nova_compute[238822]:     <host name="192.168.122.100" port="6789"/>
Sep 30 18:51:49 compute-1 nova_compute[238822]:     <host name="192.168.122.101" port="6789"/>
Sep 30 18:51:49 compute-1 nova_compute[238822]:   </source>
Sep 30 18:51:49 compute-1 nova_compute[238822]:   <auth username="openstack">
Sep 30 18:51:49 compute-1 nova_compute[238822]:     <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:51:49 compute-1 nova_compute[238822]:   </auth>
Sep 30 18:51:49 compute-1 nova_compute[238822]:   <target dev="vdb" bus="virtio"/>
Sep 30 18:51:49 compute-1 nova_compute[238822]:   <serial>48cc17b7-3f84-4dd1-aa6f-dc29cea17e81</serial>
Sep 30 18:51:49 compute-1 nova_compute[238822]: </disk>
Sep 30 18:51:49 compute-1 nova_compute[238822]:  attach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:336
Sep 30 18:51:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:51:49 compute-1 openstack_network_exporter[251957]: ERROR   18:51:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:51:49 compute-1 openstack_network_exporter[251957]: ERROR   18:51:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:51:49 compute-1 openstack_network_exporter[251957]: ERROR   18:51:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:51:49 compute-1 openstack_network_exporter[251957]: ERROR   18:51:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:51:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:51:49 compute-1 openstack_network_exporter[251957]: ERROR   18:51:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:51:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:51:49 compute-1 ceph-mon[75484]: pgmap v2208: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 75 op/s
Sep 30 18:51:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2461365287' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:51:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:49.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:51:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:50.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:51:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:50 compute-1 nova_compute[238822]: 2025-09-30 18:51:50.713 2 DEBUG nova.virt.libvirt.driver [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:51:50 compute-1 nova_compute[238822]: 2025-09-30 18:51:50.714 2 DEBUG nova.virt.libvirt.driver [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:51:50 compute-1 nova_compute[238822]: 2025-09-30 18:51:50.715 2 DEBUG nova.virt.libvirt.driver [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:51:50 compute-1 nova_compute[238822]: 2025-09-30 18:51:50.715 2 DEBUG nova.virt.libvirt.driver [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No VIF found with MAC fa:16:3e:38:1e:87, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:51:50 compute-1 nova_compute[238822]: 2025-09-30 18:51:50.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:51 compute-1 ceph-mon[75484]: pgmap v2209: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 75 op/s
Sep 30 18:51:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:51.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:51 compute-1 ovn_controller[135204]: 2025-09-30T18:51:51Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:38:1e:87 10.100.0.9
Sep 30 18:51:51 compute-1 ovn_controller[135204]: 2025-09-30T18:51:51Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:38:1e:87 10.100.0.9
Sep 30 18:51:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:52.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:52 compute-1 nova_compute[238822]: 2025-09-30 18:51:52.551 2 DEBUG oslo_concurrency.lockutils [None req-f92e2ae4-95ae-4ca6-94d8-15f454ae15a7 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 6.398s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:51:52 compute-1 podman[306318]: 2025-09-30 18:51:52.55654732 +0000 UTC m=+0.085034943 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:51:52 compute-1 podman[306317]: 2025-09-30 18:51:52.595585538 +0000 UTC m=+0.138017545 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 18:51:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:51:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:53.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:53 compute-1 ceph-mon[75484]: pgmap v2210: 353 pgs: 353 active+clean; 88 MiB data, 417 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 341 B/s wr, 65 op/s
Sep 30 18:51:53 compute-1 nova_compute[238822]: 2025-09-30 18:51:53.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:51:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:54.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:51:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:51:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:54.438 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:51:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:54.438 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:51:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:51:54.439 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:51:55 compute-1 sudo[306366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:51:55 compute-1 sudo[306366]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:51:55 compute-1 sudo[306366]: pam_unix(sudo:session): session closed for user root
Sep 30 18:51:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:55 compute-1 podman[306390]: 2025-09-30 18:51:55.237302682 +0000 UTC m=+0.077085110 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Sep 30 18:51:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:55 compute-1 sudo[306397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 18:51:55 compute-1 sudo[306397]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:51:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:55.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:55 compute-1 ceph-mon[75484]: pgmap v2211: 353 pgs: 353 active+clean; 121 MiB data, 450 MiB used, 40 GiB / 40 GiB avail; 2.3 MiB/s rd, 2.1 MiB/s wr, 135 op/s
Sep 30 18:51:55 compute-1 nova_compute[238822]: 2025-09-30 18:51:55.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:56 compute-1 podman[306511]: 2025-09-30 18:51:56.061051782 +0000 UTC m=+0.106117330 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True)
Sep 30 18:51:56 compute-1 podman[306511]: 2025-09-30 18:51:56.187079184 +0000 UTC m=+0.232144662 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Sep 30 18:51:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:56.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:56 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Sep 30 18:51:56 compute-1 podman[306632]: 2025-09-30 18:51:56.98577001 +0000 UTC m=+0.096006798 container exec 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 18:51:56 compute-1 podman[306632]: 2025-09-30 18:51:56.995376608 +0000 UTC m=+0.105613406 container exec_died 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 18:51:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:57 compute-1 unix_chkpwd[306698]: password check failed for user (root)
Sep 30 18:51:57 compute-1 sshd-session[306435]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:51:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:57.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:57 compute-1 ceph-mon[75484]: pgmap v2212: 353 pgs: 353 active+clean; 121 MiB data, 450 MiB used, 40 GiB / 40 GiB avail; 405 KiB/s rd, 2.1 MiB/s wr, 70 op/s
Sep 30 18:51:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3734450267' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:51:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3734450267' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:51:57 compute-1 podman[306770]: 2025-09-30 18:51:57.823958287 +0000 UTC m=+0.091110576 container exec 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 18:51:57 compute-1 podman[306770]: 2025-09-30 18:51:57.836052502 +0000 UTC m=+0.103204741 container exec_died 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 18:51:58 compute-1 podman[306836]: 2025-09-30 18:51:58.202326323 +0000 UTC m=+0.092354490 container exec 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, version=2.2.4, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, vendor=Red Hat, Inc., release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, architecture=x86_64, build-date=2023-02-22T09:23:20)
Sep 30 18:51:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:58 compute-1 podman[306836]: 2025-09-30 18:51:58.220684596 +0000 UTC m=+0.110712763 container exec_died 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, version=2.2.4, com.redhat.component=keepalived-container, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., release=1793)
Sep 30 18:51:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:51:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:51:58.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:51:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:58 compute-1 sudo[306397]: pam_unix(sudo:session): session closed for user root
Sep 30 18:51:58 compute-1 sudo[306908]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:51:58 compute-1 sudo[306908]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:51:58 compute-1 sudo[306908]: pam_unix(sudo:session): session closed for user root
Sep 30 18:51:58 compute-1 nova_compute[238822]: 2025-09-30 18:51:58.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:51:58 compute-1 sudo[306933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:51:58 compute-1 sudo[306933]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:51:59 compute-1 sshd-session[306435]: Failed password for root from 192.210.160.141 port 35392 ssh2
Sep 30 18:51:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:51:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:51:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:51:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:51:59 compute-1 sudo[306933]: pam_unix(sudo:session): session closed for user root
Sep 30 18:51:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:51:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:51:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:51:59.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:51:59 compute-1 ceph-mon[75484]: pgmap v2213: 353 pgs: 353 active+clean; 121 MiB data, 450 MiB used, 40 GiB / 40 GiB avail; 405 KiB/s rd, 2.1 MiB/s wr, 70 op/s
Sep 30 18:51:59 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:51:59 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:51:59 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Sep 30 18:51:59 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:51:59 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:52:00 compute-1 nova_compute[238822]: 2025-09-30 18:52:00.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:52:00 compute-1 nova_compute[238822]: 2025-09-30 18:52:00.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:52:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:00.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:00 compute-1 sshd-session[306435]: Connection closed by authenticating user root 192.210.160.141 port 35392 [preauth]
Sep 30 18:52:00 compute-1 ceph-mon[75484]: pgmap v2214: 353 pgs: 353 active+clean; 121 MiB data, 450 MiB used, 40 GiB / 40 GiB avail; 419 KiB/s rd, 2.2 MiB/s wr, 73 op/s
Sep 30 18:52:00 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:52:00 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:52:00 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:52:00 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:52:00 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:52:00 compute-1 nova_compute[238822]: 2025-09-30 18:52:00.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:01 compute-1 nova_compute[238822]: 2025-09-30 18:52:01.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:52:01 compute-1 nova_compute[238822]: 2025-09-30 18:52:01.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:52:01 compute-1 nova_compute[238822]: 2025-09-30 18:52:01.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:52:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:01 compute-1 podman[306991]: 2025-09-30 18:52:01.576963708 +0000 UTC m=+0.109708965 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4)
Sep 30 18:52:01 compute-1 nova_compute[238822]: 2025-09-30 18:52:01.579 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:52:01 compute-1 nova_compute[238822]: 2025-09-30 18:52:01.579 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:52:01 compute-1 podman[306992]: 2025-09-30 18:52:01.579871297 +0000 UTC m=+0.111035372 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:52:01 compute-1 nova_compute[238822]: 2025-09-30 18:52:01.579 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:52:01 compute-1 nova_compute[238822]: 2025-09-30 18:52:01.580 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:52:01 compute-1 nova_compute[238822]: 2025-09-30 18:52:01.580 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:52:01 compute-1 podman[306993]: 2025-09-30 18:52:01.584671355 +0000 UTC m=+0.107723432 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:52:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:01.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:01 compute-1 nova_compute[238822]: 2025-09-30 18:52:01.773 2 DEBUG oslo_concurrency.lockutils [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" by "nova.compute.manager.ComputeManager.swap_volume.<locals>._do_locked_swap_volume" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:52:01 compute-1 nova_compute[238822]: 2025-09-30 18:52:01.774 2 DEBUG oslo_concurrency.lockutils [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" acquired by "nova.compute.manager.ComputeManager.swap_volume.<locals>._do_locked_swap_volume" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:52:01 compute-1 nova_compute[238822]: 2025-09-30 18:52:01.775 2 DEBUG nova.objects.instance [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lazy-loading 'flavor' on Instance uuid ecdb485e-5297-4bbc-bed9-9f019a69b1e0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:52:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:52:02 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2324469393' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.051 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:52:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:02.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:02 compute-1 ceph-mon[75484]: pgmap v2215: 353 pgs: 353 active+clean; 121 MiB data, 450 MiB used, 40 GiB / 40 GiB avail; 419 KiB/s rd, 2.2 MiB/s wr, 73 op/s
Sep 30 18:52:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2324469393' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.803 2 DEBUG os_brick.utils [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.12/site-packages/os_brick/utils.py:177
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.805 8181 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.820 8181 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.821 8181 DEBUG oslo.privsep.daemon [-] privsep: reply[bb447525-fe50-4d86-8ff9-3d0324bb826d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:d7bbbc2a579e', '')) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.823 8181 DEBUG oslo.privsep.daemon [-] privsep: Exception during request[a746637b-2304-4986-bc54-429201cc762c]: [Errno 2] No such file or directory: '/dev/scini' _process_cmd /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:492
Sep 30 18:52:02 compute-1 nova_compute[238822]: Traceback (most recent call last):
Sep 30 18:52:02 compute-1 nova_compute[238822]:   File "/usr/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 489, in _process_cmd
Sep 30 18:52:02 compute-1 nova_compute[238822]:     ret = func(*f_args, **f_kwargs)
Sep 30 18:52:02 compute-1 nova_compute[238822]:           ^^^^^^^^^^^^^^^^^^^^^^^^^
Sep 30 18:52:02 compute-1 nova_compute[238822]:   File "/usr/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 270, in _wrap
Sep 30 18:52:02 compute-1 nova_compute[238822]:     return func(*args, **kwargs)
Sep 30 18:52:02 compute-1 nova_compute[238822]:            ^^^^^^^^^^^^^^^^^^^^^
Sep 30 18:52:02 compute-1 nova_compute[238822]:   File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 57, in get_guid
Sep 30 18:52:02 compute-1 nova_compute[238822]:     with open_scini_device() as fd:
Sep 30 18:52:02 compute-1 nova_compute[238822]:          ^^^^^^^^^^^^^^^^^^^
Sep 30 18:52:02 compute-1 nova_compute[238822]:   File "/usr/lib64/python3.12/contextlib.py", line 137, in __enter__
Sep 30 18:52:02 compute-1 nova_compute[238822]:     return next(self.gen)
Sep 30 18:52:02 compute-1 nova_compute[238822]:            ^^^^^^^^^^^^^^
Sep 30 18:52:02 compute-1 nova_compute[238822]:   File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 40, in open_scini_device
Sep 30 18:52:02 compute-1 nova_compute[238822]:     fd = os.open(SCINI_DEVICE_PATH, os.O_RDWR)
Sep 30 18:52:02 compute-1 nova_compute[238822]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Sep 30 18:52:02 compute-1 nova_compute[238822]: FileNotFoundError: [Errno 2] No such file or directory: '/dev/scini'
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.825 8181 DEBUG oslo.privsep.daemon [-] privsep: reply[a746637b-2304-4986-bc54-429201cc762c]: (5, 'builtins.FileNotFoundError', (2, 'No such file or directory'), 'Traceback (most recent call last):\n  File "/usr/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 489, in _process_cmd\n    ret = func(*f_args, **f_kwargs)\n          ^^^^^^^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 270, in _wrap\n    return func(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 57, in get_guid\n    with open_scini_device() as fd:\n         ^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib64/python3.12/contextlib.py", line 137, in __enter__\n    return next(self.gen)\n           ^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 40, in open_scini_device\n    fd = os.open(SCINI_DEVICE_PATH, os.O_RDWR)\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\nFileNotFoundError: [Errno 2] No such file or directory: \'/dev/scini\'\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.826 2 ERROR os_brick.initiator.connectors.scaleio [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Error querying sdc guid: [Errno 2] No such file or directory: FileNotFoundError: [Errno 2] No such file or directory
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.826 2 INFO os_brick.initiator.connectors.scaleio [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Unable to find SDC guid: Error querying sdc guid: [Errno 2] No such file or directory
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.827 8181 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.844 8181 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.845 8181 DEBUG oslo.privsep.daemon [-] privsep: reply[35be80f4-c2e0-4cc3-a460-6e4835dcd9bb]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.846 8181 DEBUG oslo.privsep.daemon [-] privsep: reply[3e2e9745-d95e-410a-a39f-667f0731b8b7]: (4, '12ce99da-db91-4763-aecd-1e4b4dea5907') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.847 2 DEBUG oslo_concurrency.processutils [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.889 2 DEBUG oslo_concurrency.processutils [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] CMD "nvme version" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.893 2 DEBUG os_brick.initiator.connectors.lightos [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:132
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.895 2 INFO os_brick.initiator.connectors.lightos [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Current host hostNQN nqn.2014-08.org.nvmexpress:uuid:abc6dbd1-bb80-4444-a621-a0ff0df4b0b1 and IP(s) are ['38.102.83.102', 'fe80::f816:3eff:feac:ccc9', '192.168.122.101', 'fe80::f816:3eff:fe94:c4d1', '172.19.0.101', 'fe80::3422:55ff:fe9c:82fe', '172.20.0.101', 'fe80::207f:eeff:fe24:3a42', '172.17.0.101', 'fe80::802d:e1ff:fe9f:c0b8', '172.18.0.101', 'fe80::6826:b6ff:fe2d:8355', 'fe80::b4df:6dff:fea0:9ac', 'fe80::fc16:3eff:fe38:1e87', 'fe80::acfb:31ff:fe26:1320'] 
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.895 2 DEBUG os_brick.initiator.connectors.lightos [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:109
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.896 2 DEBUG os_brick.initiator.connectors.lightos [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:abc6dbd1-bb80-4444-a621-a0ff0df4b0b1 dsc:  get_connector_properties /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:112
Sep 30 18:52:02 compute-1 nova_compute[238822]: 2025-09-30 18:52:02.896 2 DEBUG os_brick.utils [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] <== get_connector_properties: return (91ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'enforce_multipath': True, 'initiator': 'iqn.1994-05.com.redhat:d7bbbc2a579e', 'do_local_attach': False, 'nvme_hostid': 'abc6dbd1-bb80-4444-a621-a0ff0df4b0b1', 'system uuid': '12ce99da-db91-4763-aecd-1e4b4dea5907', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:abc6dbd1-bb80-4444-a621-a0ff0df4b0b1', 'nvme_native_multipath': True, 'found_dsc': '', 'host_ips': ['38.102.83.102', 'fe80::f816:3eff:feac:ccc9', '192.168.122.101', 'fe80::f816:3eff:fe94:c4d1', '172.19.0.101', 'fe80::3422:55ff:fe9c:82fe', '172.20.0.101', 'fe80::207f:eeff:fe24:3a42', '172.17.0.101', 'fe80::802d:e1ff:fe9f:c0b8', '172.18.0.101', 'fe80::6826:b6ff:fe2d:8355', 'fe80::b4df:6dff:fea0:9ac', 'fe80::fc16:3eff:fe38:1e87', 'fe80::acfb:31ff:fe26:1320']} trace_logging_wrapper /usr/lib/python3.12/site-packages/os_brick/utils.py:204
Sep 30 18:52:03 compute-1 nova_compute[238822]: 2025-09-30 18:52:03.092 2 INFO nova.compute.manager [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Swapping volume 48cc17b7-3f84-4dd1-aa6f-dc29cea17e81 for 4dca7d46-539f-4ca3-9a67-dfddb0bc9f79
Sep 30 18:52:03 compute-1 nova_compute[238822]: 2025-09-30 18:52:03.122 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000028 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:52:03 compute-1 nova_compute[238822]: 2025-09-30 18:52:03.122 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000028 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:52:03 compute-1 nova_compute[238822]: 2025-09-30 18:52:03.123 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000028 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:52:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:03 compute-1 sudo[307081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:52:03 compute-1 sudo[307081]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:52:03 compute-1 sudo[307081]: pam_unix(sudo:session): session closed for user root
Sep 30 18:52:03 compute-1 nova_compute[238822]: 2025-09-30 18:52:03.336 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:52:03 compute-1 nova_compute[238822]: 2025-09-30 18:52:03.338 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:52:03 compute-1 nova_compute[238822]: 2025-09-30 18:52:03.373 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:52:03 compute-1 nova_compute[238822]: 2025-09-30 18:52:03.375 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4336MB free_disk=39.946693420410156GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:52:03 compute-1 nova_compute[238822]: 2025-09-30 18:52:03.376 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:52:03 compute-1 nova_compute[238822]: 2025-09-30 18:52:03.376 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:52:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:03.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1594717304' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:52:03 compute-1 nova_compute[238822]: 2025-09-30 18:52:03.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:04.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:52:04 compute-1 ceph-mon[75484]: pgmap v2216: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 419 KiB/s rd, 2.2 MiB/s wr, 74 op/s
Sep 30 18:52:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4121484505' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:52:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:52:04 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:52:04 compute-1 nova_compute[238822]: 2025-09-30 18:52:04.788 2 DEBUG nova.compute.manager [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] swap_volume: Calling driver volume swap with connection infos: new: {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-4dca7d46-539f-4ca3-9a67-dfddb0bc9f79', 'hosts': ['192.168.122.100', '192.168.122.101'], 'ports': ['6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '4dca7d46-539f-4ca3-9a67-dfddb0bc9f79', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False, 'enforce_multipath': True}, 'status': 'reserved', 'instance': 'ecdb485e-5297-4bbc-bed9-9f019a69b1e0', 'attached_at': '', 'detached_at': '', 'volume_id': '4dca7d46-539f-4ca3-9a67-dfddb0bc9f79', 'serial': '4dca7d46-539f-4ca3-9a67-dfddb0bc9f79'}; old: {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-48cc17b7-3f84-4dd1-aa6f-dc29cea17e81', 'hosts': ['192.168.122.100', '192.168.122.101'], 'ports': ['6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '48cc17b7-3f84-4dd1-aa6f-dc29cea17e81', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False, 'enforce_multipath': True}, 'status': 'reserved', 'instance': 'ecdb485e-5297-4bbc-bed9-9f019a69b1e0', 'attached_at': '', 'detached_at': '', 'volume_id': '48cc17b7-3f84-4dd1-aa6f-dc29cea17e81', 'serial': '} _swap_volume /usr/lib/python3.12/site-packages/nova/compute/manager.py:8323
Sep 30 18:52:04 compute-1 virtqemud[239124]: invalid argument: disk vdb does not have an active block job
Sep 30 18:52:04 compute-1 nova_compute[238822]: 2025-09-30 18:52:04.825 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance ecdb485e-5297-4bbc-bed9-9f019a69b1e0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:52:04 compute-1 nova_compute[238822]: 2025-09-30 18:52:04.826 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:52:04 compute-1 nova_compute[238822]: 2025-09-30 18:52:04.826 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:52:03 up  4:29,  0 user,  load average: 0.61, 0.43, 0.47\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_127ca83529de45efa0a76aa8ceefcd3d': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:52:04 compute-1 nova_compute[238822]: 2025-09-30 18:52:04.857 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:52:04 compute-1 sudo[307110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:52:04 compute-1 sudo[307110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:52:04 compute-1 sudo[307110]: pam_unix(sudo:session): session closed for user root
Sep 30 18:52:04 compute-1 nova_compute[238822]: 2025-09-30 18:52:04.961 2 DEBUG nova.virt.libvirt.guest [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] COPY block job progress, current cursor: 721420288 final cursor: 1073741824 is_job_complete /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:845
Sep 30 18:52:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:05 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:52:05 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2537126038' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:52:05 compute-1 nova_compute[238822]: 2025-09-30 18:52:05.363 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:52:05 compute-1 nova_compute[238822]: 2025-09-30 18:52:05.373 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:52:05 compute-1 nova_compute[238822]: 2025-09-30 18:52:05.466 2 DEBUG nova.virt.libvirt.guest [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] COPY block job progress, current cursor: 1073741824 final cursor: 1073741824 is_job_complete /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:845
Sep 30 18:52:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:05.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:05 compute-1 podman[249638]: time="2025-09-30T18:52:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:52:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:52:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37992 "" "Go-http-client/1.1"
Sep 30 18:52:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:52:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8828 "" "Go-http-client/1.1"
Sep 30 18:52:05 compute-1 nova_compute[238822]: 2025-09-30 18:52:05.713 2 DEBUG nova.compute.manager [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] swap_volume: Driver volume swap returned, new connection_info is now : {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-4dca7d46-539f-4ca3-9a67-dfddb0bc9f79', 'hosts': ['192.168.122.100', '192.168.122.101'], 'ports': ['6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '4dca7d46-539f-4ca3-9a67-dfddb0bc9f79', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False, 'enforce_multipath': True}, 'status': 'reserved', 'instance': 'ecdb485e-5297-4bbc-bed9-9f019a69b1e0', 'attached_at': '', 'detached_at': '', 'volume_id': '4dca7d46-539f-4ca3-9a67-dfddb0bc9f79', 'serial': '} _swap_volume /usr/lib/python3.12/site-packages/nova/compute/manager.py:8332
Sep 30 18:52:05 compute-1 nova_compute[238822]: 2025-09-30 18:52:05.713 2 DEBUG nova.compute.manager [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] swap_volume: removing Cinder connection for volume 48cc17b7-3f84-4dd1-aa6f-dc29cea17e81 _swap_volume /usr/lib/python3.12/site-packages/nova/compute/manager.py:8377
Sep 30 18:52:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1119088799' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:52:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2537126038' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:52:05 compute-1 nova_compute[238822]: 2025-09-30 18:52:05.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:05 compute-1 nova_compute[238822]: 2025-09-30 18:52:05.885 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:52:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:06.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:06 compute-1 nova_compute[238822]: 2025-09-30 18:52:06.402 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:52:06 compute-1 nova_compute[238822]: 2025-09-30 18:52:06.402 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.026s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:52:06 compute-1 ceph-mon[75484]: pgmap v2217: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 969 B/s rd, 16 KiB/s wr, 2 op/s
Sep 30 18:52:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:07 compute-1 ovn_controller[135204]: 2025-09-30T18:52:07Z|00321|memory_trim|INFO|Detected inactivity (last active 30016 ms ago): trimming memory
Sep 30 18:52:07 compute-1 nova_compute[238822]: 2025-09-30 18:52:07.399 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:52:07 compute-1 nova_compute[238822]: 2025-09-30 18:52:07.598 2 DEBUG nova.compute.manager [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] swap_volume: Cinder migrate_volume_completion returned: {'save_volume_id': '48cc17b7-3f84-4dd1-aa6f-dc29cea17e81'} _swap_volume /usr/lib/python3.12/site-packages/nova/compute/manager.py:8420
Sep 30 18:52:07 compute-1 nova_compute[238822]: 2025-09-30 18:52:07.599 2 DEBUG nova.compute.manager [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] swap_volume: Updating volume 48cc17b7-3f84-4dd1-aa6f-dc29cea17e81 BDM record with {'connection_info': '{"driver_volume_type": "rbd", "data": {"name": "volumes/volume-4dca7d46-539f-4ca3-9a67-dfddb0bc9f79", "hosts": ["192.168.122.100", "192.168.122.101"], "ports": ["6789", "6789"], "cluster_name": "ceph", "auth_enabled": true, "auth_username": "openstack", "secret_type": "ceph", "secret_uuid": "63d32c6a-fa18-54ed-8711-9a3915cc367b", "volume_id": "48cc17b7-3f84-4dd1-aa6f-dc29cea17e81", "discard": true, "qos_specs": null, "access_mode": "rw", "encrypted": false, "cacheable": false, "enforce_multipath": true}, "status": "reserved", "instance": "ecdb485e-5297-4bbc-bed9-9f019a69b1e0", "attached_at": "", "detached_at": "", "volume_id": "48cc17b7-3f84-4dd1-aa6f-dc29cea17e81", "serial": "48cc17b7-3f84-4dd1-aa6f-dc29cea17e81"}', 'source_type': 'volume', 'destination_type': 'volume', 'snapshot_id': None, 'volume_id': '48cc17b7-3f84-4dd1-aa6f-dc29cea17e81', 'no_device': None, 'attachment_id': '3500c303-2028-49ce-8379-34db73026f20'} _do_swap_volume /usr/lib/python3.12/site-packages/nova/compute/manager.py:8525
Sep 30 18:52:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:07.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:52:07 compute-1 nova_compute[238822]: 2025-09-30 18:52:07.913 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:52:07 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 18:52:08 compute-1 nova_compute[238822]: 2025-09-30 18:52:08.123 2 DEBUG oslo_concurrency.lockutils [req-a55a8050-5662-4497-a9f3-3596a3c8bc70 req-837aa818-4113-465d-b4e6-fd8a045bd084 0a16943781634c8d931ad37fe32d43c8 faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" "released" by "nova.compute.manager.ComputeManager.swap_volume.<locals>._do_locked_swap_volume" :: held 6.350s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:52:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:08.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:08 compute-1 nova_compute[238822]: 2025-09-30 18:52:08.566 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:52:08 compute-1 nova_compute[238822]: 2025-09-30 18:52:08.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:08 compute-1 ceph-mon[75484]: pgmap v2218: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 969 B/s rd, 16 KiB/s wr, 2 op/s
Sep 30 18:52:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:52:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:09.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:09 compute-1 ceph-mon[75484]: pgmap v2219: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 14 KiB/s rd, 16 KiB/s wr, 20 op/s
Sep 30 18:52:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:10.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:10 compute-1 nova_compute[238822]: 2025-09-30 18:52:10.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:52:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:11.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:52:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:12.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:12 compute-1 ceph-mon[75484]: pgmap v2220: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 14 KiB/s rd, 4.2 KiB/s wr, 18 op/s
Sep 30 18:52:13 compute-1 nova_compute[238822]: 2025-09-30 18:52:13.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:52:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:13.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:13 compute-1 nova_compute[238822]: 2025-09-30 18:52:13.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:14.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:52:14 compute-1 ceph-mon[75484]: pgmap v2221: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 14 KiB/s rd, 5.2 KiB/s wr, 19 op/s
Sep 30 18:52:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:15.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:15 compute-1 nova_compute[238822]: 2025-09-30 18:52:15.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:16.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:16 compute-1 ceph-mon[75484]: pgmap v2222: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 13 KiB/s rd, 1.2 KiB/s wr, 17 op/s
Sep 30 18:52:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:17.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:52:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:18.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:52:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:18 compute-1 ceph-mon[75484]: pgmap v2223: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 13 KiB/s rd, 1.2 KiB/s wr, 17 op/s
Sep 30 18:52:18 compute-1 nova_compute[238822]: 2025-09-30 18:52:18.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:19 compute-1 sshd-session[307192]: Invalid user sakura from 161.132.50.17 port 40506
Sep 30 18:52:19 compute-1 sshd-session[307192]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:52:19 compute-1 sshd-session[307192]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 18:52:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:52:19 compute-1 openstack_network_exporter[251957]: ERROR   18:52:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:52:19 compute-1 openstack_network_exporter[251957]: ERROR   18:52:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:52:19 compute-1 openstack_network_exporter[251957]: ERROR   18:52:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:52:19 compute-1 openstack_network_exporter[251957]: ERROR   18:52:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:52:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:52:19 compute-1 openstack_network_exporter[251957]: ERROR   18:52:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:52:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:52:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:19.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:20 compute-1 nova_compute[238822]: 2025-09-30 18:52:20.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:52:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:20.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:20 compute-1 ceph-mon[75484]: pgmap v2224: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 13 KiB/s rd, 1.2 KiB/s wr, 18 op/s
Sep 30 18:52:20 compute-1 nova_compute[238822]: 2025-09-30 18:52:20.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:21 compute-1 sshd-session[307192]: Failed password for invalid user sakura from 161.132.50.17 port 40506 ssh2
Sep 30 18:52:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:21 compute-1 sshd-session[307192]: Received disconnect from 161.132.50.17 port 40506:11: Bye Bye [preauth]
Sep 30 18:52:21 compute-1 sshd-session[307192]: Disconnected from invalid user sakura 161.132.50.17 port 40506 [preauth]
Sep 30 18:52:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:21.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:22.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:22 compute-1 nova_compute[238822]: 2025-09-30 18:52:22.538 2 DEBUG oslo_concurrency.lockutils [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:52:22 compute-1 nova_compute[238822]: 2025-09-30 18:52:22.539 2 DEBUG oslo_concurrency.lockutils [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:52:22 compute-1 unix_chkpwd[307203]: password check failed for user (root)
Sep 30 18:52:22 compute-1 sshd-session[307197]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:52:22 compute-1 ceph-mon[75484]: pgmap v2225: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:52:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:52:23 compute-1 nova_compute[238822]: 2025-09-30 18:52:23.047 2 DEBUG nova.objects.instance [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lazy-loading 'flavor' on Instance uuid ecdb485e-5297-4bbc-bed9-9f019a69b1e0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:52:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:23 compute-1 sudo[307205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:52:23 compute-1 sudo[307205]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:52:23 compute-1 sudo[307205]: pam_unix(sudo:session): session closed for user root
Sep 30 18:52:23 compute-1 nova_compute[238822]: 2025-09-30 18:52:23.564 2 INFO nova.compute.manager [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Detaching volume 48cc17b7-3f84-4dd1-aa6f-dc29cea17e81
Sep 30 18:52:23 compute-1 podman[307230]: 2025-09-30 18:52:23.576365379 +0000 UTC m=+0.095216526 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:52:23 compute-1 podman[307228]: 2025-09-30 18:52:23.640880221 +0000 UTC m=+0.159871042 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:52:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:23.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:23 compute-1 nova_compute[238822]: 2025-09-30 18:52:23.720 2 INFO nova.virt.block_device [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Attempting to driver detach volume 48cc17b7-3f84-4dd1-aa6f-dc29cea17e81 from mountpoint /dev/vdb
Sep 30 18:52:23 compute-1 nova_compute[238822]: 2025-09-30 18:52:23.736 2 DEBUG nova.virt.libvirt.driver [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Found disk vdb by alias ua-48cc17b7-3f84-4dd1-aa6f-dc29cea17e81 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Sep 30 18:52:23 compute-1 nova_compute[238822]: 2025-09-30 18:52:23.740 2 DEBUG nova.virt.libvirt.driver [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Found disk vdb by alias ua-48cc17b7-3f84-4dd1-aa6f-dc29cea17e81 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Sep 30 18:52:23 compute-1 nova_compute[238822]: 2025-09-30 18:52:23.741 2 DEBUG nova.virt.libvirt.driver [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Attempting to detach device vdb from instance ecdb485e-5297-4bbc-bed9-9f019a69b1e0 from the persistent domain config. _detach_from_persistent /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2576
Sep 30 18:52:23 compute-1 nova_compute[238822]: 2025-09-30 18:52:23.742 2 DEBUG nova.virt.libvirt.guest [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] detach device xml: <disk type="network" device="disk">
Sep 30 18:52:23 compute-1 nova_compute[238822]:   <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Sep 30 18:52:23 compute-1 nova_compute[238822]:   <alias name="ua-48cc17b7-3f84-4dd1-aa6f-dc29cea17e81"/>
Sep 30 18:52:23 compute-1 nova_compute[238822]:   <source protocol="rbd" name="volumes/volume-4dca7d46-539f-4ca3-9a67-dfddb0bc9f79">
Sep 30 18:52:23 compute-1 nova_compute[238822]:     <host name="192.168.122.100" port="6789"/>
Sep 30 18:52:23 compute-1 nova_compute[238822]:     <host name="192.168.122.101" port="6789"/>
Sep 30 18:52:23 compute-1 nova_compute[238822]:   </source>
Sep 30 18:52:23 compute-1 nova_compute[238822]:   <target dev="vdb" bus="virtio"/>
Sep 30 18:52:23 compute-1 nova_compute[238822]:   <serial>48cc17b7-3f84-4dd1-aa6f-dc29cea17e81</serial>
Sep 30 18:52:23 compute-1 nova_compute[238822]:   <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Sep 30 18:52:23 compute-1 nova_compute[238822]: </disk>
Sep 30 18:52:23 compute-1 nova_compute[238822]:  detach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:466
Sep 30 18:52:23 compute-1 nova_compute[238822]: 2025-09-30 18:52:23.756 2 DEBUG nova.virt.libvirt.driver [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Found disk vdb by alias ua-48cc17b7-3f84-4dd1-aa6f-dc29cea17e81 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Sep 30 18:52:23 compute-1 nova_compute[238822]: 2025-09-30 18:52:23.756 2 WARNING nova.virt.libvirt.driver [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Failed to detach device vdb from instance ecdb485e-5297-4bbc-bed9-9f019a69b1e0 from the persistent domain config. Libvirt did not report any error but the device is still in the config.
Sep 30 18:52:23 compute-1 nova_compute[238822]: 2025-09-30 18:52:23.757 2 DEBUG nova.virt.libvirt.driver [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] (1/8): Attempting to detach device vdb with device alias ua-48cc17b7-3f84-4dd1-aa6f-dc29cea17e81 from instance ecdb485e-5297-4bbc-bed9-9f019a69b1e0 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2612
Sep 30 18:52:23 compute-1 nova_compute[238822]: 2025-09-30 18:52:23.758 2 DEBUG nova.virt.libvirt.guest [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] detach device xml: <disk type="network" device="disk">
Sep 30 18:52:23 compute-1 nova_compute[238822]:   <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Sep 30 18:52:23 compute-1 nova_compute[238822]:   <alias name="ua-48cc17b7-3f84-4dd1-aa6f-dc29cea17e81"/>
Sep 30 18:52:23 compute-1 nova_compute[238822]:   <source protocol="rbd" name="volumes/volume-4dca7d46-539f-4ca3-9a67-dfddb0bc9f79">
Sep 30 18:52:23 compute-1 nova_compute[238822]:     <host name="192.168.122.100" port="6789"/>
Sep 30 18:52:23 compute-1 nova_compute[238822]:     <host name="192.168.122.101" port="6789"/>
Sep 30 18:52:23 compute-1 nova_compute[238822]:   </source>
Sep 30 18:52:23 compute-1 nova_compute[238822]:   <target dev="vdb" bus="virtio"/>
Sep 30 18:52:23 compute-1 nova_compute[238822]:   <serial>48cc17b7-3f84-4dd1-aa6f-dc29cea17e81</serial>
Sep 30 18:52:23 compute-1 nova_compute[238822]:   <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Sep 30 18:52:23 compute-1 nova_compute[238822]: </disk>
Sep 30 18:52:23 compute-1 nova_compute[238822]:  detach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:466
Sep 30 18:52:23 compute-1 sshd-session[307201]: Invalid user deb from 103.153.190.105 port 51488
Sep 30 18:52:23 compute-1 nova_compute[238822]: 2025-09-30 18:52:23.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:23 compute-1 sshd-session[307201]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:52:23 compute-1 sshd-session[307201]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:52:23 compute-1 nova_compute[238822]: 2025-09-30 18:52:23.911 2 DEBUG nova.virt.libvirt.driver [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Start waiting for the detach event from libvirt for device vdb with device alias ua-48cc17b7-3f84-4dd1-aa6f-dc29cea17e81 for instance ecdb485e-5297-4bbc-bed9-9f019a69b1e0 _detach_from_live_and_wait_for_event /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2688
Sep 30 18:52:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:24.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:52:24 compute-1 ceph-mon[75484]: pgmap v2226: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 3.3 KiB/s wr, 1 op/s
Sep 30 18:52:24 compute-1 sshd-session[307197]: Failed password for root from 192.210.160.141 port 57982 ssh2
Sep 30 18:52:25 compute-1 sshd-session[307201]: Failed password for invalid user deb from 103.153.190.105 port 51488 ssh2
Sep 30 18:52:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:25 compute-1 podman[307287]: 2025-09-30 18:52:25.563174044 +0000 UTC m=+0.096007487 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 18:52:25 compute-1 sshd-session[307285]: Invalid user fivem from 8.243.64.201 port 37150
Sep 30 18:52:25 compute-1 sshd-session[307285]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:52:25 compute-1 sshd-session[307285]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:52:25 compute-1 sshd-session[307197]: Connection closed by authenticating user root 192.210.160.141 port 57982 [preauth]
Sep 30 18:52:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:25.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:25 compute-1 sshd-session[307201]: Received disconnect from 103.153.190.105 port 51488:11: Bye Bye [preauth]
Sep 30 18:52:25 compute-1 sshd-session[307201]: Disconnected from invalid user deb 103.153.190.105 port 51488 [preauth]
Sep 30 18:52:25 compute-1 nova_compute[238822]: 2025-09-30 18:52:25.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:26.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:26 compute-1 ceph-mon[75484]: pgmap v2227: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:52:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:27 compute-1 sshd-session[307307]: Invalid user titu from 49.49.32.245 port 37112
Sep 30 18:52:27 compute-1 sshd-session[307307]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:52:27 compute-1 sshd-session[307307]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245
Sep 30 18:52:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:27.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:27 compute-1 sshd-session[307285]: Failed password for invalid user fivem from 8.243.64.201 port 37150 ssh2
Sep 30 18:52:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:28.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:28 compute-1 sshd-session[307285]: Received disconnect from 8.243.64.201 port 37150:11: Bye Bye [preauth]
Sep 30 18:52:28 compute-1 sshd-session[307285]: Disconnected from invalid user fivem 8.243.64.201 port 37150 [preauth]
Sep 30 18:52:28 compute-1 nova_compute[238822]: 2025-09-30 18:52:28.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:28 compute-1 ceph-mon[75484]: pgmap v2228: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:52:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:52:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:52:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:29.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:52:30 compute-1 sshd-session[307307]: Failed password for invalid user titu from 49.49.32.245 port 37112 ssh2
Sep 30 18:52:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:30.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:30 compute-1 sshd-session[307307]: Received disconnect from 49.49.32.245 port 37112:11: Bye Bye [preauth]
Sep 30 18:52:30 compute-1 sshd-session[307307]: Disconnected from invalid user titu 49.49.32.245 port 37112 [preauth]
Sep 30 18:52:30 compute-1 ceph-mon[75484]: pgmap v2229: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 2.3 KiB/s wr, 1 op/s
Sep 30 18:52:30 compute-1 nova_compute[238822]: 2025-09-30 18:52:30.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:31.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:52:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:32.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:52:32 compute-1 podman[307316]: 2025-09-30 18:52:32.563715708 +0000 UTC m=+0.100504798 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:52:32 compute-1 podman[307318]: 2025-09-30 18:52:32.583065307 +0000 UTC m=+0.106563031 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250930)
Sep 30 18:52:32 compute-1 podman[307317]: 2025-09-30 18:52:32.590164828 +0000 UTC m=+0.118980644 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:52:32 compute-1 ceph-mon[75484]: pgmap v2230: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 2.3 KiB/s wr, 1 op/s
Sep 30 18:52:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:33.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:33 compute-1 nova_compute[238822]: 2025-09-30 18:52:33.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:33 compute-1 ceph-mon[75484]: pgmap v2231: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 2.3 KiB/s wr, 1 op/s
Sep 30 18:52:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:34.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:52:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:35 compute-1 podman[249638]: time="2025-09-30T18:52:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:52:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:52:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37992 "" "Go-http-client/1.1"
Sep 30 18:52:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:52:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8836 "" "Go-http-client/1.1"
Sep 30 18:52:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:35.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:35 compute-1 nova_compute[238822]: 2025-09-30 18:52:35.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:36.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:36 compute-1 ceph-mon[75484]: pgmap v2232: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:52:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3635020971' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:52:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3635020971' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:52:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:52:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:52:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:37.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:52:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:38.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:38 compute-1 ceph-mon[75484]: pgmap v2233: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Sep 30 18:52:38 compute-1 nova_compute[238822]: 2025-09-30 18:52:38.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:52:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:39.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:40.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:40 compute-1 ceph-mon[75484]: pgmap v2234: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 938 B/s rd, 8.1 KiB/s wr, 2 op/s
Sep 30 18:52:40 compute-1 nova_compute[238822]: 2025-09-30 18:52:40.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:41.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:42.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:42 compute-1 ceph-mon[75484]: pgmap v2235: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 8.1 KiB/s wr, 1 op/s
Sep 30 18:52:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:43 compute-1 sudo[307384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:52:43 compute-1 sudo[307384]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:52:43 compute-1 sudo[307384]: pam_unix(sudo:session): session closed for user root
Sep 30 18:52:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:43.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:43 compute-1 nova_compute[238822]: 2025-09-30 18:52:43.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:43 compute-1 nova_compute[238822]: 2025-09-30 18:52:43.913 2 WARNING nova.virt.libvirt.driver [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Waiting for libvirt event about the detach of device vdb with device alias ua-48cc17b7-3f84-4dd1-aa6f-dc29cea17e81 from instance ecdb485e-5297-4bbc-bed9-9f019a69b1e0 is timed out.
Sep 30 18:52:43 compute-1 nova_compute[238822]: 2025-09-30 18:52:43.922 2 INFO nova.virt.libvirt.driver [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Successfully detached device vdb from instance ecdb485e-5297-4bbc-bed9-9f019a69b1e0 from the live domain config.
Sep 30 18:52:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:44.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:52:44 compute-1 ceph-mon[75484]: pgmap v2236: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 8.1 KiB/s wr, 1 op/s
Sep 30 18:52:45 compute-1 nova_compute[238822]: 2025-09-30 18:52:45.140 2 DEBUG oslo_concurrency.lockutils [None req-0e74d080-b3b0-46c3-8f55-7643edb440e6 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 22.602s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:52:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:45.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:45 compute-1 nova_compute[238822]: 2025-09-30 18:52:45.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:46.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:46 compute-1 ceph-mon[75484]: pgmap v2237: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 8.1 KiB/s wr, 1 op/s
Sep 30 18:52:46 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3596888471' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:52:46 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3596888471' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:52:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:47 compute-1 nova_compute[238822]: 2025-09-30 18:52:47.573 2 DEBUG oslo_concurrency.lockutils [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:52:47 compute-1 nova_compute[238822]: 2025-09-30 18:52:47.573 2 DEBUG oslo_concurrency.lockutils [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:52:47 compute-1 nova_compute[238822]: 2025-09-30 18:52:47.574 2 DEBUG oslo_concurrency.lockutils [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:52:47 compute-1 nova_compute[238822]: 2025-09-30 18:52:47.574 2 DEBUG oslo_concurrency.lockutils [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:52:47 compute-1 nova_compute[238822]: 2025-09-30 18:52:47.575 2 DEBUG oslo_concurrency.lockutils [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:52:47 compute-1 nova_compute[238822]: 2025-09-30 18:52:47.589 2 INFO nova.compute.manager [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Terminating instance
Sep 30 18:52:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:47.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.108 2 DEBUG nova.compute.manager [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:52:48 compute-1 kernel: tapbefcf2ec-34 (unregistering): left promiscuous mode
Sep 30 18:52:48 compute-1 NetworkManager[45549]: <info>  [1759258368.1761] device (tapbefcf2ec-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:52:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:48 compute-1 ovn_controller[135204]: 2025-09-30T18:52:48Z|00322|binding|INFO|Releasing lport befcf2ec-342b-4ac9-8ac7-d935271b42ac from this chassis (sb_readonly=0)
Sep 30 18:52:48 compute-1 ovn_controller[135204]: 2025-09-30T18:52:48Z|00323|binding|INFO|Setting lport befcf2ec-342b-4ac9-8ac7-d935271b42ac down in Southbound
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:48 compute-1 ovn_controller[135204]: 2025-09-30T18:52:48Z|00324|binding|INFO|Removing iface tapbefcf2ec-34 ovn-installed in OVS
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.242 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:1e:87 10.100.0.9'], port_security=['fa:16:3e:38:1e:87 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ecdb485e-5297-4bbc-bed9-9f019a69b1e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '127ca83529de45efa0a76aa8ceefcd3d', 'neutron:revision_number': '5', 'neutron:security_group_ids': '126a2d65-c072-4128-836f-db6080f798dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2a3bc33b-b1e3-4a2f-8784-2e8238744730, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=befcf2ec-342b-4ac9-8ac7-d935271b42ac) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.244 144543 INFO neutron.agent.ovn.metadata.agent [-] Port befcf2ec-342b-4ac9-8ac7-d935271b42ac in datapath cee64377-b6b9-46f2-8d77-c7978d4cc7a0 unbound from our chassis
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.246 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cee64377-b6b9-46f2-8d77-c7978d4cc7a0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.248 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[1685a559-930b-4bb8-a893-c86653033653]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.249 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0 namespace which is not needed anymore
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:48 compute-1 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000028.scope: Deactivated successfully.
Sep 30 18:52:48 compute-1 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000028.scope: Consumed 17.380s CPU time.
Sep 30 18:52:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:48.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:48 compute-1 systemd-machined[195911]: Machine qemu-30-instance-00000028 terminated.
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.360 2 INFO nova.virt.libvirt.driver [-] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Instance destroyed successfully.
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.363 2 DEBUG nova.objects.instance [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lazy-loading 'resources' on Instance uuid ecdb485e-5297-4bbc-bed9-9f019a69b1e0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:52:48 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[306230]: [NOTICE]   (306234) : haproxy version is 3.0.5-8e879a5
Sep 30 18:52:48 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[306230]: [NOTICE]   (306234) : path to executable is /usr/sbin/haproxy
Sep 30 18:52:48 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[306230]: [WARNING]  (306234) : Exiting Master process...
Sep 30 18:52:48 compute-1 podman[307453]: 2025-09-30 18:52:48.425777247 +0000 UTC m=+0.042537403 container kill 9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:52:48 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[306230]: [ALERT]    (306234) : Current worker (306236) exited with code 143 (Terminated)
Sep 30 18:52:48 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[306230]: [WARNING]  (306234) : All workers exited. Exiting... (0)
Sep 30 18:52:48 compute-1 systemd[1]: libpod-9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb.scope: Deactivated successfully.
Sep 30 18:52:48 compute-1 podman[307468]: 2025-09-30 18:52:48.488192102 +0000 UTC m=+0.037584309 container died 9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 18:52:48 compute-1 systemd[1]: var-lib-containers-storage-overlay-fe69756a8a39791c565ebc9a2b538789801a1ed231491420fbe4cf9e60b332bd-merged.mount: Deactivated successfully.
Sep 30 18:52:48 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb-userdata-shm.mount: Deactivated successfully.
Sep 30 18:52:48 compute-1 podman[307468]: 2025-09-30 18:52:48.537675851 +0000 UTC m=+0.087068008 container cleanup 9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:52:48 compute-1 systemd[1]: libpod-conmon-9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb.scope: Deactivated successfully.
Sep 30 18:52:48 compute-1 podman[307470]: 2025-09-30 18:52:48.563034311 +0000 UTC m=+0.092585426 container remove 9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.573 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[c2b50dd1-ff83-44a7-8fee-00576eea7c6c]: (4, ("Tue Sep 30 06:52:48 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0 (9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb)\n9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb\nTue Sep 30 06:52:48 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0 (9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb)\n9f3b2ec0cbad1bb09d5e9a716a10d5716f7bc22b7f6217beac36d4152ffecafb\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.575 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ff5d98de-3cbb-46bb-85b7-267495140d95]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.575 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.576 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[713edfa6-7725-417a-afbe-0b8a64c92a66]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.577 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcee64377-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:48 compute-1 kernel: tapcee64377-b0: left promiscuous mode
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.607 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f029fab5-aba3-4388-a25f-e37486b793f6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.638 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[24c3c240-c41f-4169-a0ed-e7148bda5e59]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.640 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b3fc7a5a-8ae0-4c69-93fc-b53b07e0db91]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.671 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ea7641ad-eee7-42aa-8333-65a3a94a4aea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1613855, 'reachable_time': 38000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 307503, 'error': None, 'target': 'ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:52:48 compute-1 systemd[1]: run-netns-ovnmeta\x2dcee64377\x2db6b9\x2d46f2\x2d8d77\x2dc7978d4cc7a0.mount: Deactivated successfully.
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.679 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:52:48 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:48.679 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[78a43ff8-8fbb-4873-b57f-44f961c096d0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:52:48 compute-1 ceph-mon[75484]: pgmap v2238: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 8.1 KiB/s wr, 1 op/s
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.872 2 DEBUG nova.virt.libvirt.vif [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:51:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1465799852',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-146579985',id=40,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:51:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='127ca83529de45efa0a76aa8ceefcd3d',ramdisk_id='',reservation_id='r-gfwma97m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540',owner_user_name='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:51:39Z,user_data=None,user_id='e80b7fccb5a34c13b356857340eff1ee',uuid=ecdb485e-5297-4bbc-bed9-9f019a69b1e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "address": "fa:16:3e:38:1e:87", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbefcf2ec-34", "ovs_interfaceid": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.873 2 DEBUG nova.network.os_vif_util [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Converting VIF {"id": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "address": "fa:16:3e:38:1e:87", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbefcf2ec-34", "ovs_interfaceid": "befcf2ec-342b-4ac9-8ac7-d935271b42ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.874 2 DEBUG nova.network.os_vif_util [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:1e:87,bridge_name='br-int',has_traffic_filtering=True,id=befcf2ec-342b-4ac9-8ac7-d935271b42ac,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbefcf2ec-34') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.875 2 DEBUG os_vif [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:1e:87,bridge_name='br-int',has_traffic_filtering=True,id=befcf2ec-342b-4ac9-8ac7-d935271b42ac,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbefcf2ec-34') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.878 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbefcf2ec-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.885 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=34f0a051-ee89-4790-9397-45b4272b869d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.892 2 INFO os_vif [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:1e:87,bridge_name='br-int',has_traffic_filtering=True,id=befcf2ec-342b-4ac9-8ac7-d935271b42ac,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbefcf2ec-34')
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.941 2 DEBUG nova.compute.manager [req-9e70defe-8852-4e3e-8b5d-e29f846c80bb req-f540a470-fa5b-40db-b203-0b47c0cb8dc2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Received event network-vif-unplugged-befcf2ec-342b-4ac9-8ac7-d935271b42ac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.942 2 DEBUG oslo_concurrency.lockutils [req-9e70defe-8852-4e3e-8b5d-e29f846c80bb req-f540a470-fa5b-40db-b203-0b47c0cb8dc2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.942 2 DEBUG oslo_concurrency.lockutils [req-9e70defe-8852-4e3e-8b5d-e29f846c80bb req-f540a470-fa5b-40db-b203-0b47c0cb8dc2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.943 2 DEBUG oslo_concurrency.lockutils [req-9e70defe-8852-4e3e-8b5d-e29f846c80bb req-f540a470-fa5b-40db-b203-0b47c0cb8dc2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.943 2 DEBUG nova.compute.manager [req-9e70defe-8852-4e3e-8b5d-e29f846c80bb req-f540a470-fa5b-40db-b203-0b47c0cb8dc2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] No waiting events found dispatching network-vif-unplugged-befcf2ec-342b-4ac9-8ac7-d935271b42ac pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:52:48 compute-1 nova_compute[238822]: 2025-09-30 18:52:48.943 2 DEBUG nova.compute.manager [req-9e70defe-8852-4e3e-8b5d-e29f846c80bb req-f540a470-fa5b-40db-b203-0b47c0cb8dc2 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Received event network-vif-unplugged-befcf2ec-342b-4ac9-8ac7-d935271b42ac for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:52:49 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:49.015 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:52:49 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:49.015 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:52:49 compute-1 nova_compute[238822]: 2025-09-30 18:52:49.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:52:49 compute-1 nova_compute[238822]: 2025-09-30 18:52:49.364 2 INFO nova.virt.libvirt.driver [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Deleting instance files /var/lib/nova/instances/ecdb485e-5297-4bbc-bed9-9f019a69b1e0_del
Sep 30 18:52:49 compute-1 nova_compute[238822]: 2025-09-30 18:52:49.365 2 INFO nova.virt.libvirt.driver [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Deletion of /var/lib/nova/instances/ecdb485e-5297-4bbc-bed9-9f019a69b1e0_del complete
Sep 30 18:52:49 compute-1 openstack_network_exporter[251957]: ERROR   18:52:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:52:49 compute-1 openstack_network_exporter[251957]: ERROR   18:52:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:52:49 compute-1 openstack_network_exporter[251957]: ERROR   18:52:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:52:49 compute-1 openstack_network_exporter[251957]: ERROR   18:52:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:52:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:52:49 compute-1 openstack_network_exporter[251957]: ERROR   18:52:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:52:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:52:49 compute-1 sshd-session[307413]: Invalid user netscreen from 192.210.160.141 port 38774
Sep 30 18:52:49 compute-1 sshd-session[307413]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:52:49 compute-1 sshd-session[307413]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:52:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:49.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:49 compute-1 nova_compute[238822]: 2025-09-30 18:52:49.881 2 INFO nova.compute.manager [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Took 1.77 seconds to destroy the instance on the hypervisor.
Sep 30 18:52:49 compute-1 nova_compute[238822]: 2025-09-30 18:52:49.881 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:52:49 compute-1 nova_compute[238822]: 2025-09-30 18:52:49.882 2 DEBUG nova.compute.manager [-] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:52:49 compute-1 nova_compute[238822]: 2025-09-30 18:52:49.882 2 DEBUG nova.network.neutron [-] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:52:49 compute-1 nova_compute[238822]: 2025-09-30 18:52:49.883 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:52:50 compute-1 nova_compute[238822]: 2025-09-30 18:52:50.186 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:52:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:52:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:50.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:52:50 compute-1 nova_compute[238822]: 2025-09-30 18:52:50.590 2 DEBUG nova.compute.manager [req-f323c7d8-a97d-498a-893d-39ad61afbe2d req-10dfab2f-8fde-420a-89ed-e7550bf55d27 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Received event network-vif-deleted-befcf2ec-342b-4ac9-8ac7-d935271b42ac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:52:50 compute-1 nova_compute[238822]: 2025-09-30 18:52:50.591 2 INFO nova.compute.manager [req-f323c7d8-a97d-498a-893d-39ad61afbe2d req-10dfab2f-8fde-420a-89ed-e7550bf55d27 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Neutron deleted interface befcf2ec-342b-4ac9-8ac7-d935271b42ac; detaching it from the instance and deleting it from the info cache
Sep 30 18:52:50 compute-1 nova_compute[238822]: 2025-09-30 18:52:50.591 2 DEBUG nova.network.neutron [req-f323c7d8-a97d-498a-893d-39ad61afbe2d req-10dfab2f-8fde-420a-89ed-e7550bf55d27 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:52:50 compute-1 ceph-mon[75484]: pgmap v2239: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 11 KiB/s rd, 8.3 KiB/s wr, 15 op/s
Sep 30 18:52:50 compute-1 nova_compute[238822]: 2025-09-30 18:52:50.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:50 compute-1 nova_compute[238822]: 2025-09-30 18:52:50.998 2 DEBUG nova.compute.manager [req-b2aaed97-7644-4f96-b7e8-d5744362549d req-c2b56a19-d61e-40c2-a4a5-f0dd9c2d8c68 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Received event network-vif-unplugged-befcf2ec-342b-4ac9-8ac7-d935271b42ac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:52:50 compute-1 nova_compute[238822]: 2025-09-30 18:52:50.998 2 DEBUG oslo_concurrency.lockutils [req-b2aaed97-7644-4f96-b7e8-d5744362549d req-c2b56a19-d61e-40c2-a4a5-f0dd9c2d8c68 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:52:50 compute-1 nova_compute[238822]: 2025-09-30 18:52:50.999 2 DEBUG oslo_concurrency.lockutils [req-b2aaed97-7644-4f96-b7e8-d5744362549d req-c2b56a19-d61e-40c2-a4a5-f0dd9c2d8c68 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:52:50 compute-1 nova_compute[238822]: 2025-09-30 18:52:50.999 2 DEBUG oslo_concurrency.lockutils [req-b2aaed97-7644-4f96-b7e8-d5744362549d req-c2b56a19-d61e-40c2-a4a5-f0dd9c2d8c68 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:52:50 compute-1 nova_compute[238822]: 2025-09-30 18:52:50.999 2 DEBUG nova.compute.manager [req-b2aaed97-7644-4f96-b7e8-d5744362549d req-c2b56a19-d61e-40c2-a4a5-f0dd9c2d8c68 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] No waiting events found dispatching network-vif-unplugged-befcf2ec-342b-4ac9-8ac7-d935271b42ac pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:52:50 compute-1 nova_compute[238822]: 2025-09-30 18:52:50.999 2 DEBUG nova.compute.manager [req-b2aaed97-7644-4f96-b7e8-d5744362549d req-c2b56a19-d61e-40c2-a4a5-f0dd9c2d8c68 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Received event network-vif-unplugged-befcf2ec-342b-4ac9-8ac7-d935271b42ac for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:52:51 compute-1 nova_compute[238822]: 2025-09-30 18:52:51.014 2 DEBUG nova.network.neutron [-] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:52:51 compute-1 nova_compute[238822]: 2025-09-30 18:52:51.102 2 DEBUG nova.compute.manager [req-f323c7d8-a97d-498a-893d-39ad61afbe2d req-10dfab2f-8fde-420a-89ed-e7550bf55d27 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Detach interface failed, port_id=befcf2ec-342b-4ac9-8ac7-d935271b42ac, reason: Instance ecdb485e-5297-4bbc-bed9-9f019a69b1e0 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 18:52:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:51 compute-1 sshd-session[307413]: Failed password for invalid user netscreen from 192.210.160.141 port 38774 ssh2
Sep 30 18:52:51 compute-1 nova_compute[238822]: 2025-09-30 18:52:51.520 2 INFO nova.compute.manager [-] [instance: ecdb485e-5297-4bbc-bed9-9f019a69b1e0] Took 1.64 seconds to deallocate network for instance.
Sep 30 18:52:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:51.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:52 compute-1 nova_compute[238822]: 2025-09-30 18:52:52.054 2 DEBUG oslo_concurrency.lockutils [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:52:52 compute-1 nova_compute[238822]: 2025-09-30 18:52:52.054 2 DEBUG oslo_concurrency.lockutils [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:52:52 compute-1 nova_compute[238822]: 2025-09-30 18:52:52.115 2 DEBUG oslo_concurrency.processutils [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:52:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:52 compute-1 sshd-session[307413]: Connection closed by invalid user netscreen 192.210.160.141 port 38774 [preauth]
Sep 30 18:52:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:52:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:52.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:52:52 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:52:52 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2495173031' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:52:52 compute-1 nova_compute[238822]: 2025-09-30 18:52:52.644 2 DEBUG oslo_concurrency.processutils [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:52:52 compute-1 nova_compute[238822]: 2025-09-30 18:52:52.655 2 DEBUG nova.compute.provider_tree [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:52:52 compute-1 ceph-mon[75484]: pgmap v2240: 353 pgs: 353 active+clean; 121 MiB data, 468 MiB used, 40 GiB / 40 GiB avail; 11 KiB/s rd, 255 B/s wr, 14 op/s
Sep 30 18:52:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:52:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2495173031' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:52:53 compute-1 nova_compute[238822]: 2025-09-30 18:52:53.171 2 DEBUG nova.scheduler.client.report [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:52:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:53 compute-1 nova_compute[238822]: 2025-09-30 18:52:53.682 2 DEBUG oslo_concurrency.lockutils [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.627s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:52:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:53.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:53 compute-1 nova_compute[238822]: 2025-09-30 18:52:53.731 2 INFO nova.scheduler.client.report [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Deleted allocations for instance ecdb485e-5297-4bbc-bed9-9f019a69b1e0
Sep 30 18:52:53 compute-1 nova_compute[238822]: 2025-09-30 18:52:53.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:54.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:52:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:54.440 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:52:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:54.441 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:52:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:54.441 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:52:54 compute-1 podman[307554]: 2025-09-30 18:52:54.565345233 +0000 UTC m=+0.090931131 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:52:54 compute-1 podman[307552]: 2025-09-30 18:52:54.652933474 +0000 UTC m=+0.182665414 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Sep 30 18:52:54 compute-1 nova_compute[238822]: 2025-09-30 18:52:54.775 2 DEBUG oslo_concurrency.lockutils [None req-8893beb0-0572-4f1c-87e9-83eb63bc0574 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "ecdb485e-5297-4bbc-bed9-9f019a69b1e0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.202s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:52:54 compute-1 ceph-mon[75484]: pgmap v2241: 353 pgs: 353 active+clean; 41 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 30 KiB/s rd, 1.4 KiB/s wr, 41 op/s
Sep 30 18:52:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:55.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:55 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3487887483' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:52:55 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3487887483' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:52:56 compute-1 nova_compute[238822]: 2025-09-30 18:52:56.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:56.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:56 compute-1 podman[307604]: 2025-09-30 18:52:56.559454356 +0000 UTC m=+0.094190479 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:52:56 compute-1 ceph-mon[75484]: pgmap v2242: 353 pgs: 353 active+clean; 41 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 30 KiB/s rd, 1.4 KiB/s wr, 41 op/s
Sep 30 18:52:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Sep 30 18:52:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1694317258' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:52:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Sep 30 18:52:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1694317258' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:52:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:52:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:57.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:52:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1694317258' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:52:57 compute-1 ceph-mon[75484]: pgmap v2243: 353 pgs: 353 active+clean; 41 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 30 KiB/s rd, 1.4 KiB/s wr, 41 op/s
Sep 30 18:52:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1694317258' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:52:58 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:52:58.017 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:52:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:52:58.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:52:58 compute-1 nova_compute[238822]: 2025-09-30 18:52:58.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:52:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:52:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:52:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:52:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:52:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:52:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:52:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:52:59.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:00.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:00 compute-1 ceph-mon[75484]: pgmap v2244: 353 pgs: 353 active+clean; 41 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 40 KiB/s rd, 2.0 KiB/s wr, 56 op/s
Sep 30 18:53:01 compute-1 nova_compute[238822]: 2025-09-30 18:53:01.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:01 compute-1 nova_compute[238822]: 2025-09-30 18:53:01.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:53:01 compute-1 nova_compute[238822]: 2025-09-30 18:53:01.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:53:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:01.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:02 compute-1 nova_compute[238822]: 2025-09-30 18:53:02.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:53:02 compute-1 nova_compute[238822]: 2025-09-30 18:53:02.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:53:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:02.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:02 compute-1 nova_compute[238822]: 2025-09-30 18:53:02.575 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:53:02 compute-1 nova_compute[238822]: 2025-09-30 18:53:02.575 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:53:02 compute-1 nova_compute[238822]: 2025-09-30 18:53:02.576 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:53:02 compute-1 nova_compute[238822]: 2025-09-30 18:53:02.576 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:53:02 compute-1 nova_compute[238822]: 2025-09-30 18:53:02.577 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:53:02 compute-1 nova_compute[238822]: 2025-09-30 18:53:02.592 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "3daba7c9-ccac-4d03-a63b-2f978730a440" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:53:02 compute-1 nova_compute[238822]: 2025-09-30 18:53:02.594 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:53:02 compute-1 ceph-mon[75484]: pgmap v2245: 353 pgs: 353 active+clean; 41 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 30 KiB/s rd, 1.7 KiB/s wr, 42 op/s
Sep 30 18:53:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:53:03 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1153435571' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:53:03 compute-1 nova_compute[238822]: 2025-09-30 18:53:03.083 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:53:03 compute-1 nova_compute[238822]: 2025-09-30 18:53:03.115 2 DEBUG nova.compute.manager [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 18:53:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:03 compute-1 podman[307654]: 2025-09-30 18:53:03.251396247 +0000 UTC m=+0.100366865 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid)
Sep 30 18:53:03 compute-1 podman[307655]: 2025-09-30 18:53:03.26308555 +0000 UTC m=+0.109054608 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6)
Sep 30 18:53:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:03 compute-1 podman[307656]: 2025-09-30 18:53:03.281866494 +0000 UTC m=+0.113733653 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible)
Sep 30 18:53:03 compute-1 nova_compute[238822]: 2025-09-30 18:53:03.352 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:53:03 compute-1 nova_compute[238822]: 2025-09-30 18:53:03.354 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:53:03 compute-1 nova_compute[238822]: 2025-09-30 18:53:03.403 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:53:03 compute-1 nova_compute[238822]: 2025-09-30 18:53:03.404 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4582MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:53:03 compute-1 nova_compute[238822]: 2025-09-30 18:53:03.404 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:53:03 compute-1 nova_compute[238822]: 2025-09-30 18:53:03.404 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:53:03 compute-1 nova_compute[238822]: 2025-09-30 18:53:03.688 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:53:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1153435571' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:53:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1395640280' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:53:03 compute-1 sudo[307712]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:53:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:03.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:03 compute-1 sudo[307712]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:53:03 compute-1 sudo[307712]: pam_unix(sudo:session): session closed for user root
Sep 30 18:53:03 compute-1 nova_compute[238822]: 2025-09-30 18:53:03.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:53:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:53:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:04.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:53:04 compute-1 ceph-mon[75484]: pgmap v2246: 353 pgs: 353 active+clean; 41 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 30 KiB/s rd, 1.7 KiB/s wr, 42 op/s
Sep 30 18:53:04 compute-1 nova_compute[238822]: 2025-09-30 18:53:04.962 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 3daba7c9-ccac-4d03-a63b-2f978730a440 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1797
Sep 30 18:53:04 compute-1 nova_compute[238822]: 2025-09-30 18:53:04.962 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:53:04 compute-1 nova_compute[238822]: 2025-09-30 18:53:04.963 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:53:03 up  4:30,  0 user,  load average: 0.33, 0.38, 0.45\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:53:04 compute-1 nova_compute[238822]: 2025-09-30 18:53:04.990 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:53:05 compute-1 sudo[307739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:53:05 compute-1 sudo[307739]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:53:05 compute-1 sudo[307739]: pam_unix(sudo:session): session closed for user root
Sep 30 18:53:05 compute-1 sudo[307765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:53:05 compute-1 sudo[307765]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:53:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:05 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:53:05 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3024213687' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:53:05 compute-1 nova_compute[238822]: 2025-09-30 18:53:05.521 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.531s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:53:05 compute-1 nova_compute[238822]: 2025-09-30 18:53:05.531 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:53:05 compute-1 podman[249638]: time="2025-09-30T18:53:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:53:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:53:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:53:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:53:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8367 "" "Go-http-client/1.1"
Sep 30 18:53:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2652299208' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:53:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3024213687' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:53:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:05.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:05 compute-1 sudo[307765]: pam_unix(sudo:session): session closed for user root
Sep 30 18:53:06 compute-1 nova_compute[238822]: 2025-09-30 18:53:06.076 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:53:06 compute-1 nova_compute[238822]: 2025-09-30 18:53:06.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:06.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:06 compute-1 nova_compute[238822]: 2025-09-30 18:53:06.596 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:53:06 compute-1 nova_compute[238822]: 2025-09-30 18:53:06.597 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.192s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:53:06 compute-1 nova_compute[238822]: 2025-09-30 18:53:06.597 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 2.909s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:53:06 compute-1 nova_compute[238822]: 2025-09-30 18:53:06.607 2 DEBUG nova.virt.hardware [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 18:53:06 compute-1 nova_compute[238822]: 2025-09-30 18:53:06.608 2 INFO nova.compute.claims [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Claim successful on node compute-1.ctlplane.example.com
Sep 30 18:53:06 compute-1 ceph-mon[75484]: pgmap v2247: 353 pgs: 353 active+clean; 41 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 11 KiB/s rd, 597 B/s wr, 14 op/s
Sep 30 18:53:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:53:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:53:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:53:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:53:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:53:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:53:06 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:53:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:07 compute-1 nova_compute[238822]: 2025-09-30 18:53:07.598 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:53:07 compute-1 nova_compute[238822]: 2025-09-30 18:53:07.598 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:53:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:07.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:07 compute-1 ceph-mon[75484]: pgmap v2248: 353 pgs: 353 active+clean; 41 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 13 KiB/s rd, 695 B/s wr, 17 op/s
Sep 30 18:53:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:53:07 compute-1 nova_compute[238822]: 2025-09-30 18:53:07.918 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:53:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:08.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:53:08 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1519403048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:53:08 compute-1 nova_compute[238822]: 2025-09-30 18:53:08.449 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.531s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:53:08 compute-1 nova_compute[238822]: 2025-09-30 18:53:08.459 2 DEBUG nova.compute.provider_tree [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:53:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1519403048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:53:08 compute-1 nova_compute[238822]: 2025-09-30 18:53:08.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:08 compute-1 nova_compute[238822]: 2025-09-30 18:53:08.972 2 DEBUG nova.scheduler.client.report [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:53:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:53:09 compute-1 nova_compute[238822]: 2025-09-30 18:53:09.485 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.888s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:53:09 compute-1 nova_compute[238822]: 2025-09-30 18:53:09.486 2 DEBUG nova.compute.manager [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 18:53:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:53:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:09.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:53:09 compute-1 ceph-mon[75484]: pgmap v2249: 353 pgs: 353 active+clean; 41 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 13 KiB/s rd, 695 B/s wr, 17 op/s
Sep 30 18:53:10 compute-1 nova_compute[238822]: 2025-09-30 18:53:10.000 2 DEBUG nova.compute.manager [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 18:53:10 compute-1 nova_compute[238822]: 2025-09-30 18:53:10.000 2 DEBUG nova.network.neutron [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 18:53:10 compute-1 nova_compute[238822]: 2025-09-30 18:53:10.001 2 WARNING neutronclient.v2_0.client [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:53:10 compute-1 nova_compute[238822]: 2025-09-30 18:53:10.002 2 WARNING neutronclient.v2_0.client [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:53:10 compute-1 nova_compute[238822]: 2025-09-30 18:53:10.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:53:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:10.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:10 compute-1 nova_compute[238822]: 2025-09-30 18:53:10.517 2 INFO nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 18:53:10 compute-1 nova_compute[238822]: 2025-09-30 18:53:10.665 2 DEBUG nova.network.neutron [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Successfully created port: d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 18:53:11 compute-1 nova_compute[238822]: 2025-09-30 18:53:11.026 2 DEBUG nova.compute.manager [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 18:53:11 compute-1 nova_compute[238822]: 2025-09-30 18:53:11.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:11 compute-1 sudo[307873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:53:11 compute-1 sudo[307873]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:53:11 compute-1 sudo[307873]: pam_unix(sudo:session): session closed for user root
Sep 30 18:53:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:11 compute-1 nova_compute[238822]: 2025-09-30 18:53:11.344 2 DEBUG nova.network.neutron [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Successfully updated port: d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #124. Immutable memtables: 0.
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.387854) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 77] Flushing memtable with next log file: 124
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258391387909, "job": 77, "event": "flush_started", "num_memtables": 1, "num_entries": 1407, "num_deletes": 250, "total_data_size": 3277632, "memory_usage": 3357720, "flush_reason": "Manual Compaction"}
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 77] Level-0 flush table #125: started
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258391400785, "cf_name": "default", "job": 77, "event": "table_file_creation", "file_number": 125, "file_size": 1349942, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 59933, "largest_seqno": 61335, "table_properties": {"data_size": 1345236, "index_size": 2102, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12891, "raw_average_key_size": 21, "raw_value_size": 1334772, "raw_average_value_size": 2191, "num_data_blocks": 93, "num_entries": 609, "num_filter_entries": 609, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759258280, "oldest_key_time": 1759258280, "file_creation_time": 1759258391, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 125, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 77] Flush lasted 12999 microseconds, and 7526 cpu microseconds.
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.400851) [db/flush_job.cc:967] [default] [JOB 77] Level-0 flush table #125: 1349942 bytes OK
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.400879) [db/memtable_list.cc:519] [default] Level-0 commit table #125 started
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.403065) [db/memtable_list.cc:722] [default] Level-0 commit table #125: memtable #1 done
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.403148) EVENT_LOG_v1 {"time_micros": 1759258391403132, "job": 77, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.403188) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 77] Try to delete WAL files size 3270965, prev total WAL file size 3270965, number of live WAL files 2.
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000121.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.405765) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032303037' seq:72057594037927935, type:22 .. '6D6772737461740032323538' seq:0, type:0; will stop at (end)
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 78] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 77 Base level 0, inputs: [125(1318KB)], [123(12MB)]
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258391405832, "job": 78, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [125], "files_L6": [123], "score": -1, "input_data_size": 14575647, "oldest_snapshot_seqno": -1}
Sep 30 18:53:11 compute-1 nova_compute[238822]: 2025-09-30 18:53:11.429 2 DEBUG nova.compute.manager [req-89f1e5a5-3fd7-4beb-a650-012415255199 req-7e03932d-309a-4821-9c21-f2e580212c21 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Received event network-changed-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:53:11 compute-1 nova_compute[238822]: 2025-09-30 18:53:11.430 2 DEBUG nova.compute.manager [req-89f1e5a5-3fd7-4beb-a650-012415255199 req-7e03932d-309a-4821-9c21-f2e580212c21 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Refreshing instance network info cache due to event network-changed-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 18:53:11 compute-1 nova_compute[238822]: 2025-09-30 18:53:11.431 2 DEBUG oslo_concurrency.lockutils [req-89f1e5a5-3fd7-4beb-a650-012415255199 req-7e03932d-309a-4821-9c21-f2e580212c21 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "refresh_cache-3daba7c9-ccac-4d03-a63b-2f978730a440" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:53:11 compute-1 nova_compute[238822]: 2025-09-30 18:53:11.431 2 DEBUG oslo_concurrency.lockutils [req-89f1e5a5-3fd7-4beb-a650-012415255199 req-7e03932d-309a-4821-9c21-f2e580212c21 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquired lock "refresh_cache-3daba7c9-ccac-4d03-a63b-2f978730a440" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:53:11 compute-1 nova_compute[238822]: 2025-09-30 18:53:11.431 2 DEBUG nova.network.neutron [req-89f1e5a5-3fd7-4beb-a650-012415255199 req-7e03932d-309a-4821-9c21-f2e580212c21 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Refreshing network info cache for port d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 78] Generated table #126: 7788 keys, 11497359 bytes, temperature: kUnknown
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258391482537, "cf_name": "default", "job": 78, "event": "table_file_creation", "file_number": 126, "file_size": 11497359, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11451740, "index_size": 25027, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19525, "raw_key_size": 206481, "raw_average_key_size": 26, "raw_value_size": 11318843, "raw_average_value_size": 1453, "num_data_blocks": 966, "num_entries": 7788, "num_filter_entries": 7788, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759258391, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 126, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.483068) [db/compaction/compaction_job.cc:1663] [default] [JOB 78] Compacted 1@0 + 1@6 files to L6 => 11497359 bytes
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.484665) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 189.4 rd, 149.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 12.6 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(19.3) write-amplify(8.5) OK, records in: 8257, records dropped: 469 output_compression: NoCompression
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.484698) EVENT_LOG_v1 {"time_micros": 1759258391484682, "job": 78, "event": "compaction_finished", "compaction_time_micros": 76959, "compaction_time_cpu_micros": 49743, "output_level": 6, "num_output_files": 1, "total_output_size": 11497359, "num_input_records": 8257, "num_output_records": 7788, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000125.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258391485341, "job": 78, "event": "table_file_deletion", "file_number": 125}
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000123.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258391490286, "job": 78, "event": "table_file_deletion", "file_number": 123}
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.405578) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.490456) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.490470) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.490474) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.490479) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:53:11 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:53:11.490484) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:53:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:11.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:11 compute-1 ceph-mon[75484]: pgmap v2250: 353 pgs: 353 active+clean; 41 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 596 B/s rd, 0 op/s
Sep 30 18:53:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:53:11 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:53:11 compute-1 nova_compute[238822]: 2025-09-30 18:53:11.851 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "refresh_cache-3daba7c9-ccac-4d03-a63b-2f978730a440" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 18:53:11 compute-1 nova_compute[238822]: 2025-09-30 18:53:11.951 2 WARNING neutronclient.v2_0.client [req-89f1e5a5-3fd7-4beb-a650-012415255199 req-7e03932d-309a-4821-9c21-f2e580212c21 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.044 2 DEBUG nova.compute.manager [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.046 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.047 2 INFO nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Creating image(s)
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.089 2 DEBUG nova.storage.rbd_utils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image 3daba7c9-ccac-4d03-a63b-2f978730a440_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.130 2 DEBUG nova.storage.rbd_utils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image 3daba7c9-ccac-4d03-a63b-2f978730a440_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.170 2 DEBUG nova.storage.rbd_utils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image 3daba7c9-ccac-4d03-a63b-2f978730a440_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.176 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.196 2 DEBUG nova.network.neutron [req-89f1e5a5-3fd7-4beb-a650-012415255199 req-7e03932d-309a-4821-9c21-f2e580212c21 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:53:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.266 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:53:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.267 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "cb2d580238c9b109feae7f1462613dc547671457" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.268 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.269 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "cb2d580238c9b109feae7f1462613dc547671457" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.307 2 DEBUG nova.storage.rbd_utils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image 3daba7c9-ccac-4d03-a63b-2f978730a440_disk does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.314 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 3daba7c9-ccac-4d03-a63b-2f978730a440_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:53:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:12.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.626 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb2d580238c9b109feae7f1462613dc547671457 3daba7c9-ccac-4d03-a63b-2f978730a440_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.311s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.730 2 DEBUG nova.storage.rbd_utils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] resizing rbd image 3daba7c9-ccac-4d03-a63b-2f978730a440_disk to 1073741824 resize /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:288
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.880 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.881 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Ensure instance console log exists: /var/lib/nova/instances/3daba7c9-ccac-4d03-a63b-2f978730a440/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.882 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.883 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:53:12 compute-1 nova_compute[238822]: 2025-09-30 18:53:12.884 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:53:13 compute-1 nova_compute[238822]: 2025-09-30 18:53:13.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:53:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:13 compute-1 nova_compute[238822]: 2025-09-30 18:53:13.426 2 DEBUG nova.network.neutron [req-89f1e5a5-3fd7-4beb-a650-012415255199 req-7e03932d-309a-4821-9c21-f2e580212c21 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:53:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:13.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:13 compute-1 ceph-mon[75484]: pgmap v2251: 353 pgs: 353 active+clean; 41 MiB data, 422 MiB used, 40 GiB / 40 GiB avail; 596 B/s rd, 0 op/s
Sep 30 18:53:13 compute-1 nova_compute[238822]: 2025-09-30 18:53:13.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:13 compute-1 nova_compute[238822]: 2025-09-30 18:53:13.934 2 DEBUG oslo_concurrency.lockutils [req-89f1e5a5-3fd7-4beb-a650-012415255199 req-7e03932d-309a-4821-9c21-f2e580212c21 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Releasing lock "refresh_cache-3daba7c9-ccac-4d03-a63b-2f978730a440" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:53:13 compute-1 nova_compute[238822]: 2025-09-30 18:53:13.935 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquired lock "refresh_cache-3daba7c9-ccac-4d03-a63b-2f978730a440" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 18:53:13 compute-1 nova_compute[238822]: 2025-09-30 18:53:13.936 2 DEBUG nova.network.neutron [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 18:53:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:53:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:14.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:15 compute-1 nova_compute[238822]: 2025-09-30 18:53:15.215 2 DEBUG nova.network.neutron [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 18:53:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:15 compute-1 nova_compute[238822]: 2025-09-30 18:53:15.488 2 WARNING neutronclient.v2_0.client [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:53:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:15.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:15 compute-1 ceph-mon[75484]: pgmap v2252: 353 pgs: 353 active+clean; 88 MiB data, 433 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 2.1 MiB/s wr, 32 op/s
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:16.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.359 2 DEBUG nova.network.neutron [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Updating instance_info_cache with network_info: [{"id": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "address": "fa:16:3e:6c:f2:56", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd41448fe-ba", "ovs_interfaceid": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.870 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Releasing lock "refresh_cache-3daba7c9-ccac-4d03-a63b-2f978730a440" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.871 2 DEBUG nova.compute.manager [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Instance network_info: |[{"id": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "address": "fa:16:3e:6c:f2:56", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd41448fe-ba", "ovs_interfaceid": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.875 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Start _get_guest_xml network_info=[{"id": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "address": "fa:16:3e:6c:f2:56", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd41448fe-ba", "ovs_interfaceid": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'size': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'image_id': '5b99cbca-b655-4be5-8343-cf504005c42e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.882 2 WARNING nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.885 2 DEBUG nova.virt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5b99cbca-b655-4be5-8343-cf504005c42e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1848850950', uuid='3daba7c9-ccac-4d03-a63b-2f978730a440'), owner=OwnerMeta(userid='e80b7fccb5a34c13b356857340eff1ee', username='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540-project-admin', projectid='127ca83529de45efa0a76aa8ceefcd3d', projectname='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540'), image=ImageMeta(id='5b99cbca-b655-4be5-8343-cf504005c42e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "address": "fa:16:3e:6c:f2:56", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd41448fe-ba", "ovs_interfaceid": 
"d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759258396.8850863) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.890 2 DEBUG nova.virt.libvirt.host [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.891 2 DEBUG nova.virt.libvirt.host [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.896 2 DEBUG nova.virt.libvirt.host [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.897 2 DEBUG nova.virt.libvirt.host [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.898 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.898 2 DEBUG nova.virt.hardware [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T18:04:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c83dc7f1-0795-47db-adcb-fb90be11684a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T18:04:51Z,direct_url=<?>,disk_format='qcow2',id=5b99cbca-b655-4be5-8343-cf504005c42e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e2dde567e5c4b1c9802c64cfc281b6d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T18:04:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.899 2 DEBUG nova.virt.hardware [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.899 2 DEBUG nova.virt.hardware [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.900 2 DEBUG nova.virt.hardware [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.900 2 DEBUG nova.virt.hardware [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.901 2 DEBUG nova.virt.hardware [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.901 2 DEBUG nova.virt.hardware [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.901 2 DEBUG nova.virt.hardware [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.902 2 DEBUG nova.virt.hardware [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.902 2 DEBUG nova.virt.hardware [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.903 2 DEBUG nova.virt.hardware [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 18:53:16 compute-1 nova_compute[238822]: 2025-09-30 18:53:16.907 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:53:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:17 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:53:17 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4194731416' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:53:17 compute-1 nova_compute[238822]: 2025-09-30 18:53:17.438 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.530s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:53:17 compute-1 nova_compute[238822]: 2025-09-30 18:53:17.467 2 DEBUG nova.storage.rbd_utils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image 3daba7c9-ccac-4d03-a63b-2f978730a440_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:53:17 compute-1 nova_compute[238822]: 2025-09-30 18:53:17.472 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:53:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:17.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:17 compute-1 ceph-mon[75484]: pgmap v2253: 353 pgs: 353 active+clean; 88 MiB data, 433 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 2.1 MiB/s wr, 32 op/s
Sep 30 18:53:17 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4194731416' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:53:17 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Sep 30 18:53:17 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3814294932' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:53:17 compute-1 nova_compute[238822]: 2025-09-30 18:53:17.947 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:53:17 compute-1 nova_compute[238822]: 2025-09-30 18:53:17.949 2 DEBUG nova.virt.libvirt.vif [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:53:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1848850950',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-184885095',id=41,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='127ca83529de45efa0a76aa8ceefcd3d',ramdisk_id='',reservation_id='r-tckvj671',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540',owner_user_name='tem
pest-TestExecuteZoneMigrationStrategyVolume-1619382540-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:53:11Z,user_data=None,user_id='e80b7fccb5a34c13b356857340eff1ee',uuid=3daba7c9-ccac-4d03-a63b-2f978730a440,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "address": "fa:16:3e:6c:f2:56", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd41448fe-ba", "ovs_interfaceid": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 18:53:17 compute-1 nova_compute[238822]: 2025-09-30 18:53:17.949 2 DEBUG nova.network.os_vif_util [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Converting VIF {"id": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "address": "fa:16:3e:6c:f2:56", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd41448fe-ba", "ovs_interfaceid": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:53:17 compute-1 nova_compute[238822]: 2025-09-30 18:53:17.950 2 DEBUG nova.network.os_vif_util [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:f2:56,bridge_name='br-int',has_traffic_filtering=True,id=d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd41448fe-ba') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:53:17 compute-1 nova_compute[238822]: 2025-09-30 18:53:17.951 2 DEBUG nova.objects.instance [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lazy-loading 'pci_devices' on Instance uuid 3daba7c9-ccac-4d03-a63b-2f978730a440 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:53:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:18.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.465 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] End _get_guest_xml xml=<domain type="kvm">
Sep 30 18:53:18 compute-1 nova_compute[238822]:   <uuid>3daba7c9-ccac-4d03-a63b-2f978730a440</uuid>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   <name>instance-00000029</name>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   <memory>131072</memory>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   <vcpu>1</vcpu>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   <metadata>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <nova:name>tempest-TestExecuteZoneMigrationStrategyVolume-server-1848850950</nova:name>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <nova:creationTime>2025-09-30 18:53:16</nova:creationTime>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <nova:flavor name="m1.nano" id="c83dc7f1-0795-47db-adcb-fb90be11684a">
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:memory>128</nova:memory>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:disk>1</nova:disk>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:swap>0</nova:swap>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:vcpus>1</nova:vcpus>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:extraSpecs>
Sep 30 18:53:18 compute-1 nova_compute[238822]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         </nova:extraSpecs>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       </nova:flavor>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <nova:image uuid="5b99cbca-b655-4be5-8343-cf504005c42e">
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:minDisk>1</nova:minDisk>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:minRam>0</nova:minRam>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:properties>
Sep 30 18:53:18 compute-1 nova_compute[238822]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         </nova:properties>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       </nova:image>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <nova:owner>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:user uuid="e80b7fccb5a34c13b356857340eff1ee">tempest-TestExecuteZoneMigrationStrategyVolume-1619382540-project-admin</nova:user>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:project uuid="127ca83529de45efa0a76aa8ceefcd3d">tempest-TestExecuteZoneMigrationStrategyVolume-1619382540</nova:project>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       </nova:owner>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <nova:root type="image" uuid="5b99cbca-b655-4be5-8343-cf504005c42e"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <nova:ports>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <nova:port uuid="d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c">
Sep 30 18:53:18 compute-1 nova_compute[238822]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         </nova:port>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       </nova:ports>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     </nova:instance>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   </metadata>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   <sysinfo type="smbios">
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <system>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <entry name="manufacturer">RDO</entry>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <entry name="product">OpenStack Compute</entry>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <entry name="serial">3daba7c9-ccac-4d03-a63b-2f978730a440</entry>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <entry name="uuid">3daba7c9-ccac-4d03-a63b-2f978730a440</entry>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <entry name="family">Virtual Machine</entry>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     </system>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   </sysinfo>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   <os>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <boot dev="hd"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <smbios mode="sysinfo"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   </os>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   <features>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <acpi/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <apic/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <vmcoreinfo/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   </features>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   <clock offset="utc">
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <timer name="hpet" present="no"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   </clock>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   <cpu mode="host-model" match="exact">
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   </cpu>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   <devices>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <disk type="network" device="disk">
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/3daba7c9-ccac-4d03-a63b-2f978730a440_disk">
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       </source>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <target dev="vda" bus="virtio"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <disk type="network" device="cdrom">
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <driver type="raw" cache="none"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <source protocol="rbd" name="vms/3daba7c9-ccac-4d03-a63b-2f978730a440_disk.config">
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <host name="192.168.122.100" port="6789"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <host name="192.168.122.101" port="6789"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       </source>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <auth username="openstack">
Sep 30 18:53:18 compute-1 nova_compute[238822]:         <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       </auth>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <target dev="sda" bus="sata"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     </disk>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <interface type="ethernet">
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <mac address="fa:16:3e:6c:f2:56"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <mtu size="1442"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <target dev="tapd41448fe-ba"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     </interface>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <serial type="pty">
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <log file="/var/lib/nova/instances/3daba7c9-ccac-4d03-a63b-2f978730a440/console.log" append="off"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     </serial>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <video>
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <model type="virtio"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     </video>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <input type="tablet" bus="usb"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <rng model="virtio">
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <backend model="random">/dev/urandom</backend>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     </rng>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <controller type="usb" index="0"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 18:53:18 compute-1 nova_compute[238822]:       <stats period="10"/>
Sep 30 18:53:18 compute-1 nova_compute[238822]:     </memballoon>
Sep 30 18:53:18 compute-1 nova_compute[238822]:   </devices>
Sep 30 18:53:18 compute-1 nova_compute[238822]: </domain>
Sep 30 18:53:18 compute-1 nova_compute[238822]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.468 2 DEBUG nova.compute.manager [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Preparing to wait for external event network-vif-plugged-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.469 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.469 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.470 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.471 2 DEBUG nova.virt.libvirt.vif [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T18:53:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1848850950',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-184885095',id=41,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='127ca83529de45efa0a76aa8ceefcd3d',ramdisk_id='',reservation_id='r-tckvj671',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540',owner_user
_name='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T18:53:11Z,user_data=None,user_id='e80b7fccb5a34c13b356857340eff1ee',uuid=3daba7c9-ccac-4d03-a63b-2f978730a440,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "address": "fa:16:3e:6c:f2:56", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd41448fe-ba", "ovs_interfaceid": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.471 2 DEBUG nova.network.os_vif_util [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Converting VIF {"id": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "address": "fa:16:3e:6c:f2:56", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd41448fe-ba", "ovs_interfaceid": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.472 2 DEBUG nova.network.os_vif_util [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:f2:56,bridge_name='br-int',has_traffic_filtering=True,id=d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd41448fe-ba') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.473 2 DEBUG os_vif [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:f2:56,bridge_name='br-int',has_traffic_filtering=True,id=d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd41448fe-ba') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.475 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.476 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.477 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9e32f685-1ed9-5aee-9a38-563306d71466', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.490 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd41448fe-ba, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.491 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd41448fe-ba, col_values=(('qos', UUID('4aa33ddf-ea57-447b-857d-1403831ce0f0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.491 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd41448fe-ba, col_values=(('external_ids', {'iface-id': 'd41448fe-bacd-4dda-bdd5-1ea7f20c9c7c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:f2:56', 'vm-uuid': '3daba7c9-ccac-4d03-a63b-2f978730a440'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:18 compute-1 NetworkManager[45549]: <info>  [1759258398.4944] manager: (tapd41448fe-ba): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:18 compute-1 nova_compute[238822]: 2025-09-30 18:53:18.511 2 INFO os_vif [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:f2:56,bridge_name='br-int',has_traffic_filtering=True,id=d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd41448fe-ba')
Sep 30 18:53:18 compute-1 sshd-session[307847]: error: kex_exchange_identification: read: Connection timed out
Sep 30 18:53:18 compute-1 sshd-session[307847]: banner exchange: Connection from 183.245.9.13 port 43562: Connection timed out
Sep 30 18:53:18 compute-1 unix_chkpwd[308137]: password check failed for user (root)
Sep 30 18:53:18 compute-1 sshd-session[308068]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:53:18 compute-1 ceph-mon[75484]: pgmap v2254: 353 pgs: 353 active+clean; 88 MiB data, 433 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:53:18 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3814294932' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:53:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:53:19 compute-1 openstack_network_exporter[251957]: ERROR   18:53:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:53:19 compute-1 openstack_network_exporter[251957]: ERROR   18:53:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:53:19 compute-1 openstack_network_exporter[251957]: ERROR   18:53:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:53:19 compute-1 openstack_network_exporter[251957]: ERROR   18:53:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:53:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:53:19 compute-1 openstack_network_exporter[251957]: ERROR   18:53:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:53:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:53:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:53:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:19.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:53:20 compute-1 nova_compute[238822]: 2025-09-30 18:53:20.058 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:53:20 compute-1 nova_compute[238822]: 2025-09-30 18:53:20.059 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:53:20 compute-1 nova_compute[238822]: 2025-09-30 18:53:20.059 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No VIF found with MAC fa:16:3e:6c:f2:56, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:53:20 compute-1 nova_compute[238822]: 2025-09-30 18:53:20.060 2 INFO nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Using config drive
Sep 30 18:53:20 compute-1 nova_compute[238822]: 2025-09-30 18:53:20.096 2 DEBUG nova.storage.rbd_utils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image 3daba7c9-ccac-4d03-a63b-2f978730a440_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:53:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:20.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:20 compute-1 sshd-session[308068]: Failed password for root from 192.210.160.141 port 35374 ssh2
Sep 30 18:53:20 compute-1 nova_compute[238822]: 2025-09-30 18:53:20.622 2 WARNING neutronclient.v2_0.client [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:53:20 compute-1 ceph-mon[75484]: pgmap v2255: 353 pgs: 353 active+clean; 88 MiB data, 443 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:53:21 compute-1 nova_compute[238822]: 2025-09-30 18:53:21.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:21 compute-1 nova_compute[238822]: 2025-09-30 18:53:21.585 2 INFO nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Creating config drive at /var/lib/nova/instances/3daba7c9-ccac-4d03-a63b-2f978730a440/disk.config
Sep 30 18:53:21 compute-1 nova_compute[238822]: 2025-09-30 18:53:21.597 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3daba7c9-ccac-4d03-a63b-2f978730a440/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpfn5kgc7z execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:53:21 compute-1 sshd-session[308068]: Connection closed by authenticating user root 192.210.160.141 port 35374 [preauth]
Sep 30 18:53:21 compute-1 nova_compute[238822]: 2025-09-30 18:53:21.750 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3daba7c9-ccac-4d03-a63b-2f978730a440/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpfn5kgc7z" returned: 0 in 0.153s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:53:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:21.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:21 compute-1 nova_compute[238822]: 2025-09-30 18:53:21.797 2 DEBUG nova.storage.rbd_utils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] rbd image 3daba7c9-ccac-4d03-a63b-2f978730a440_disk.config does not exist __init__ /usr/lib/python3.12/site-packages/nova/storage/rbd_utils.py:80
Sep 30 18:53:21 compute-1 nova_compute[238822]: 2025-09-30 18:53:21.802 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3daba7c9-ccac-4d03-a63b-2f978730a440/disk.config 3daba7c9-ccac-4d03-a63b-2f978730a440_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.011 2 DEBUG oslo_concurrency.processutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3daba7c9-ccac-4d03-a63b-2f978730a440/disk.config 3daba7c9-ccac-4d03-a63b-2f978730a440_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.209s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.012 2 INFO nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Deleting local config drive /var/lib/nova/instances/3daba7c9-ccac-4d03-a63b-2f978730a440/disk.config because it was imported into RBD.
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:53:22 compute-1 kernel: tapd41448fe-ba: entered promiscuous mode
Sep 30 18:53:22 compute-1 NetworkManager[45549]: <info>  [1759258402.1021] manager: (tapd41448fe-ba): new Tun device (/org/freedesktop/NetworkManager/Devices/120)
Sep 30 18:53:22 compute-1 ovn_controller[135204]: 2025-09-30T18:53:22Z|00325|binding|INFO|Claiming lport d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c for this chassis.
Sep 30 18:53:22 compute-1 ovn_controller[135204]: 2025-09-30T18:53:22Z|00326|binding|INFO|d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c: Claiming fa:16:3e:6c:f2:56 10.100.0.12
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.114 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:f2:56 10.100.0.12'], port_security=['fa:16:3e:6c:f2:56 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3daba7c9-ccac-4d03-a63b-2f978730a440', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '127ca83529de45efa0a76aa8ceefcd3d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '126a2d65-c072-4128-836f-db6080f798dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2a3bc33b-b1e3-4a2f-8784-2e8238744730, chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.116 144543 INFO neutron.agent.ovn.metadata.agent [-] Port d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c in datapath cee64377-b6b9-46f2-8d77-c7978d4cc7a0 bound to our chassis
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.119 144543 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cee64377-b6b9-46f2-8d77-c7978d4cc7a0
Sep 30 18:53:22 compute-1 ovn_controller[135204]: 2025-09-30T18:53:22Z|00327|binding|INFO|Setting lport d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c ovn-installed in OVS
Sep 30 18:53:22 compute-1 ovn_controller[135204]: 2025-09-30T18:53:22Z|00328|binding|INFO|Setting lport d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c up in Southbound
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.135 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[b1758d6f-4d76-4193-a883-c09d02068f55]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.136 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcee64377-b1 in ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.139 262759 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcee64377-b0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.139 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[d134506e-451b-48b4-9364-e168d971184b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.142 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8cc81d3c-4123-4035-b56f-6c333e529629]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.167 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[9739e58a-947e-484e-b26d-04d9c75efa55]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 systemd-machined[195911]: New machine qemu-31-instance-00000029.
Sep 30 18:53:22 compute-1 systemd-udevd[308214]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:53:22 compute-1 systemd[1]: Started Virtual Machine qemu-31-instance-00000029.
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.186 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[4af8b356-a904-49c6-8425-50f0891ba7fb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 NetworkManager[45549]: <info>  [1759258402.2060] device (tapd41448fe-ba): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 18:53:22 compute-1 NetworkManager[45549]: <info>  [1759258402.2082] device (tapd41448fe-ba): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 18:53:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.229 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[ff4f6a8e-32e2-482a-a72e-8a21aa44d82f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.232 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[91dbc9e7-f2ab-4fa9-a97e-0c44e8ed676b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 NetworkManager[45549]: <info>  [1759258402.2336] manager: (tapcee64377-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/121)
Sep 30 18:53:22 compute-1 systemd-udevd[308221]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 18:53:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.269 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[153c0c78-5a11-4ee2-8997-2f9bd2a49968]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.271 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[36a95101-2150-4a14-b30c-ebbecc8cd0d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 NetworkManager[45549]: <info>  [1759258402.2992] device (tapcee64377-b0): carrier: link connected
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.308 268403 DEBUG oslo.privsep.daemon [-] privsep: reply[7fe65a0e-b7cd-4391-8bbd-f7c1edf84497]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.327 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[2978f2c1-c24f-4674-ab2a-84163874e78c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcee64377-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:48:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1624388, 'reachable_time': 43986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308246, 'error': None, 'target': 'ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:22.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.343 2 DEBUG nova.compute.manager [req-48454eb1-dcae-467b-b5a0-c0c488db27f2 req-0db0d8e8-095d-41bd-862b-2fd2c17e377b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Received event network-vif-plugged-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.344 2 DEBUG oslo_concurrency.lockutils [req-48454eb1-dcae-467b-b5a0-c0c488db27f2 req-0db0d8e8-095d-41bd-862b-2fd2c17e377b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.344 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6c9ca7fa-0926-4b48-86ff-e85758e129a6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:48c5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1624388, 'tstamp': 1624388}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308247, 'error': None, 'target': 'ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.344 2 DEBUG oslo_concurrency.lockutils [req-48454eb1-dcae-467b-b5a0-c0c488db27f2 req-0db0d8e8-095d-41bd-862b-2fd2c17e377b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.345 2 DEBUG oslo_concurrency.lockutils [req-48454eb1-dcae-467b-b5a0-c0c488db27f2 req-0db0d8e8-095d-41bd-862b-2fd2c17e377b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.345 2 DEBUG nova.compute.manager [req-48454eb1-dcae-467b-b5a0-c0c488db27f2 req-0db0d8e8-095d-41bd-862b-2fd2c17e377b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Processing event network-vif-plugged-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.360 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[8b640d8c-9374-423d-a970-e419a69efbd2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcee64377-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:48:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1624388, 'reachable_time': 43986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308248, 'error': None, 'target': 'ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.390 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3d6f8f1c-cf62-4417-9f96-f1ec4a3d3929]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.462 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[7efba229-cfdc-4592-a925-8af40671a153]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.463 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcee64377-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.463 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.464 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcee64377-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:53:22 compute-1 NetworkManager[45549]: <info>  [1759258402.4667] manager: (tapcee64377-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Sep 30 18:53:22 compute-1 kernel: tapcee64377-b0: entered promiscuous mode
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.470 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcee64377-b0, col_values=(('external_ids', {'iface-id': 'b27ac1cf-5e47-45c4-b2a7-c18dab16f3aa'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:22 compute-1 ovn_controller[135204]: 2025-09-30T18:53:22Z|00329|binding|INFO|Releasing lport b27ac1cf-5e47-45c4-b2a7-c18dab16f3aa from this chassis (sb_readonly=0)
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:22 compute-1 nova_compute[238822]: 2025-09-30 18:53:22.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.503 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[f9353e68-6b5d-4cdc-9871-e1e7bc87d63c]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.504 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.505 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.505 144543 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for cee64377-b6b9-46f2-8d77-c7978d4cc7a0 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.505 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.506 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ff915678-f69f-4f50-bc54-ccedbb6a837b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.507 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.508 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[0024a7dd-cc31-4484-ba4b-a82dcf10d4a6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.509 144543 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: global
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     log         /dev/log local0 debug
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     log-tag     haproxy-metadata-proxy-cee64377-b6b9-46f2-8d77-c7978d4cc7a0
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     user        root
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     group       root
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     maxconn     1024
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     pidfile     /var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     daemon
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: defaults
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     log global
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     mode http
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     option httplog
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     option dontlognull
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     option http-server-close
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     option forwardfor
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     retries                 3
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     timeout http-request    30s
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     timeout connect         30s
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     timeout client          32s
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     timeout server          32s
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     timeout http-keep-alive 30s
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: listen listener
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     bind 169.254.169.254:80
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:     http-request add-header X-OVN-Network-ID cee64377-b6b9-46f2-8d77-c7978d4cc7a0
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 18:53:22 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:22.511 144543 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'env', 'PROCESS_TAG=haproxy-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 18:53:22 compute-1 ceph-mon[75484]: pgmap v2256: 353 pgs: 353 active+clean; 88 MiB data, 443 MiB used, 40 GiB / 40 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Sep 30 18:53:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:53:23 compute-1 podman[308322]: 2025-09-30 18:53:23.008927457 +0000 UTC m=+0.090724126 container create 8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 18:53:23 compute-1 systemd[1]: Started libpod-conmon-8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9.scope.
Sep 30 18:53:23 compute-1 podman[308322]: 2025-09-30 18:53:22.967192047 +0000 UTC m=+0.048988776 image pull 8925e336c8a9c8e5b40d98e4715ad60992d6f21ad0a398170a686c34f922c024 38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 18:53:23 compute-1 systemd[1]: Started libcrun container.
Sep 30 18:53:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c5e62a4517a75e3f2e289792d31a2fcf5e0600b9ea6c7517c613110e9515175/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 18:53:23 compute-1 nova_compute[238822]: 2025-09-30 18:53:23.104 2 DEBUG nova.compute.manager [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 18:53:23 compute-1 nova_compute[238822]: 2025-09-30 18:53:23.109 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 18:53:23 compute-1 nova_compute[238822]: 2025-09-30 18:53:23.114 2 INFO nova.virt.libvirt.driver [-] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Instance spawned successfully.
Sep 30 18:53:23 compute-1 nova_compute[238822]: 2025-09-30 18:53:23.114 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 18:53:23 compute-1 podman[308322]: 2025-09-30 18:53:23.120276376 +0000 UTC m=+0.202073085 container init 8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:53:23 compute-1 podman[308322]: 2025-09-30 18:53:23.128809855 +0000 UTC m=+0.210606504 container start 8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Sep 30 18:53:23 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[308338]: [NOTICE]   (308343) : New worker (308345) forked
Sep 30 18:53:23 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[308338]: [NOTICE]   (308343) : Loading success.
Sep 30 18:53:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:23 compute-1 nova_compute[238822]: 2025-09-30 18:53:23.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:23 compute-1 nova_compute[238822]: 2025-09-30 18:53:23.631 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:53:23 compute-1 nova_compute[238822]: 2025-09-30 18:53:23.632 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:53:23 compute-1 nova_compute[238822]: 2025-09-30 18:53:23.632 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:53:23 compute-1 nova_compute[238822]: 2025-09-30 18:53:23.633 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:53:23 compute-1 nova_compute[238822]: 2025-09-30 18:53:23.633 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:53:23 compute-1 nova_compute[238822]: 2025-09-30 18:53:23.633 2 DEBUG nova.virt.libvirt.driver [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 18:53:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:23.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:23 compute-1 sudo[308354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:53:23 compute-1 sudo[308354]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:53:23 compute-1 sudo[308354]: pam_unix(sudo:session): session closed for user root
Sep 30 18:53:24 compute-1 nova_compute[238822]: 2025-09-30 18:53:24.143 2 INFO nova.compute.manager [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Took 12.10 seconds to spawn the instance on the hypervisor.
Sep 30 18:53:24 compute-1 nova_compute[238822]: 2025-09-30 18:53:24.145 2 DEBUG nova.compute.manager [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 18:53:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:53:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:24.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:24 compute-1 nova_compute[238822]: 2025-09-30 18:53:24.431 2 DEBUG nova.compute.manager [req-200ba1f1-68e2-4975-b744-8ebf998ee678 req-d11b0680-3528-43aa-a33d-feafe26870d5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Received event network-vif-plugged-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:53:24 compute-1 nova_compute[238822]: 2025-09-30 18:53:24.431 2 DEBUG oslo_concurrency.lockutils [req-200ba1f1-68e2-4975-b744-8ebf998ee678 req-d11b0680-3528-43aa-a33d-feafe26870d5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:53:24 compute-1 nova_compute[238822]: 2025-09-30 18:53:24.432 2 DEBUG oslo_concurrency.lockutils [req-200ba1f1-68e2-4975-b744-8ebf998ee678 req-d11b0680-3528-43aa-a33d-feafe26870d5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:53:24 compute-1 nova_compute[238822]: 2025-09-30 18:53:24.432 2 DEBUG oslo_concurrency.lockutils [req-200ba1f1-68e2-4975-b744-8ebf998ee678 req-d11b0680-3528-43aa-a33d-feafe26870d5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:53:24 compute-1 nova_compute[238822]: 2025-09-30 18:53:24.432 2 DEBUG nova.compute.manager [req-200ba1f1-68e2-4975-b744-8ebf998ee678 req-d11b0680-3528-43aa-a33d-feafe26870d5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] No waiting events found dispatching network-vif-plugged-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:53:24 compute-1 nova_compute[238822]: 2025-09-30 18:53:24.432 2 WARNING nova.compute.manager [req-200ba1f1-68e2-4975-b744-8ebf998ee678 req-d11b0680-3528-43aa-a33d-feafe26870d5 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Received unexpected event network-vif-plugged-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c for instance with vm_state active and task_state None.
Sep 30 18:53:24 compute-1 nova_compute[238822]: 2025-09-30 18:53:24.680 2 INFO nova.compute.manager [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Took 21.05 seconds to build instance.
Sep 30 18:53:25 compute-1 ceph-mon[75484]: pgmap v2257: 353 pgs: 353 active+clean; 88 MiB data, 443 MiB used, 40 GiB / 40 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Sep 30 18:53:25 compute-1 nova_compute[238822]: 2025-09-30 18:53:25.184 2 DEBUG oslo_concurrency.lockutils [None req-334adb0c-a27e-4923-a57e-5f1ed2edc101 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.590s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:53:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:25 compute-1 podman[308382]: 2025-09-30 18:53:25.551803149 +0000 UTC m=+0.090232663 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 18:53:25 compute-1 podman[308381]: 2025-09-30 18:53:25.629166945 +0000 UTC m=+0.163761676 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250930)
Sep 30 18:53:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:25.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:26 compute-1 nova_compute[238822]: 2025-09-30 18:53:26.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:26.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:27 compute-1 ceph-mon[75484]: pgmap v2258: 353 pgs: 353 active+clean; 88 MiB data, 443 MiB used, 40 GiB / 40 GiB avail; 7.7 KiB/s rd, 13 KiB/s wr, 10 op/s
Sep 30 18:53:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:27 compute-1 podman[308433]: 2025-09-30 18:53:27.561240091 +0000 UTC m=+0.089824491 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 18:53:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:27.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:28 compute-1 sshd-session[308439]: Invalid user sales1 from 161.132.50.17 port 36610
Sep 30 18:53:28 compute-1 sshd-session[308439]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:53:28 compute-1 sshd-session[308439]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 18:53:28 compute-1 nova_compute[238822]: 2025-09-30 18:53:28.092 2 DEBUG oslo_concurrency.lockutils [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "3daba7c9-ccac-4d03-a63b-2f978730a440" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:53:28 compute-1 nova_compute[238822]: 2025-09-30 18:53:28.093 2 DEBUG oslo_concurrency.lockutils [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:53:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:28.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:28 compute-1 nova_compute[238822]: 2025-09-30 18:53:28.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:28 compute-1 nova_compute[238822]: 2025-09-30 18:53:28.606 2 DEBUG nova.objects.instance [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lazy-loading 'flavor' on Instance uuid 3daba7c9-ccac-4d03-a63b-2f978730a440 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:53:29 compute-1 ceph-mon[75484]: pgmap v2259: 353 pgs: 353 active+clean; 88 MiB data, 443 MiB used, 40 GiB / 40 GiB avail; 7.7 KiB/s rd, 13 KiB/s wr, 10 op/s
Sep 30 18:53:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:53:29 compute-1 nova_compute[238822]: 2025-09-30 18:53:29.622 2 DEBUG oslo_concurrency.lockutils [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 1.529s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:53:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:29.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:30.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:30 compute-1 sshd-session[308439]: Failed password for invalid user sales1 from 161.132.50.17 port 36610 ssh2
Sep 30 18:53:30 compute-1 nova_compute[238822]: 2025-09-30 18:53:30.808 2 DEBUG oslo_concurrency.lockutils [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "3daba7c9-ccac-4d03-a63b-2f978730a440" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:53:30 compute-1 nova_compute[238822]: 2025-09-30 18:53:30.808 2 DEBUG oslo_concurrency.lockutils [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:53:30 compute-1 nova_compute[238822]: 2025-09-30 18:53:30.809 2 INFO nova.compute.manager [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Attaching volume 67563176-066b-4509-9bae-1c4683a93b12 to /dev/vdb
Sep 30 18:53:30 compute-1 nova_compute[238822]: 2025-09-30 18:53:30.810 2 DEBUG nova.objects.instance [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lazy-loading 'flavor' on Instance uuid 3daba7c9-ccac-4d03-a63b-2f978730a440 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:53:31 compute-1 ceph-mon[75484]: pgmap v2260: 353 pgs: 353 active+clean; 88 MiB data, 443 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 75 op/s
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:31 compute-1 sshd-session[308439]: Received disconnect from 161.132.50.17 port 36610:11: Bye Bye [preauth]
Sep 30 18:53:31 compute-1 sshd-session[308439]: Disconnected from invalid user sales1 161.132.50.17 port 36610 [preauth]
Sep 30 18:53:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.724 2 DEBUG os_brick.utils [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.12/site-packages/os_brick/utils.py:177
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.727 8181 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.746 8181 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.747 8181 DEBUG oslo.privsep.daemon [-] privsep: reply[d9158231-3ecb-4c04-9017-433624b4ca62]: (4, ('InitiatorName=iqn.1994-05.com.redhat:d7bbbc2a579e', '')) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.749 8181 DEBUG oslo.privsep.daemon [-] privsep: Exception during request[b05fcb8b-4e81-480c-9388-578008563537]: [Errno 2] No such file or directory: '/dev/scini' _process_cmd /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:492
Sep 30 18:53:31 compute-1 nova_compute[238822]: Traceback (most recent call last):
Sep 30 18:53:31 compute-1 nova_compute[238822]:   File "/usr/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 489, in _process_cmd
Sep 30 18:53:31 compute-1 nova_compute[238822]:     ret = func(*f_args, **f_kwargs)
Sep 30 18:53:31 compute-1 nova_compute[238822]:           ^^^^^^^^^^^^^^^^^^^^^^^^^
Sep 30 18:53:31 compute-1 nova_compute[238822]:   File "/usr/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 270, in _wrap
Sep 30 18:53:31 compute-1 nova_compute[238822]:     return func(*args, **kwargs)
Sep 30 18:53:31 compute-1 nova_compute[238822]:            ^^^^^^^^^^^^^^^^^^^^^
Sep 30 18:53:31 compute-1 nova_compute[238822]:   File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 57, in get_guid
Sep 30 18:53:31 compute-1 nova_compute[238822]:     with open_scini_device() as fd:
Sep 30 18:53:31 compute-1 nova_compute[238822]:          ^^^^^^^^^^^^^^^^^^^
Sep 30 18:53:31 compute-1 nova_compute[238822]:   File "/usr/lib64/python3.12/contextlib.py", line 137, in __enter__
Sep 30 18:53:31 compute-1 nova_compute[238822]:     return next(self.gen)
Sep 30 18:53:31 compute-1 nova_compute[238822]:            ^^^^^^^^^^^^^^
Sep 30 18:53:31 compute-1 nova_compute[238822]:   File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 40, in open_scini_device
Sep 30 18:53:31 compute-1 nova_compute[238822]:     fd = os.open(SCINI_DEVICE_PATH, os.O_RDWR)
Sep 30 18:53:31 compute-1 nova_compute[238822]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Sep 30 18:53:31 compute-1 nova_compute[238822]: FileNotFoundError: [Errno 2] No such file or directory: '/dev/scini'
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.752 8181 DEBUG oslo.privsep.daemon [-] privsep: reply[b05fcb8b-4e81-480c-9388-578008563537]: (5, 'builtins.FileNotFoundError', (2, 'No such file or directory'), 'Traceback (most recent call last):\n  File "/usr/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 489, in _process_cmd\n    ret = func(*f_args, **f_kwargs)\n          ^^^^^^^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 270, in _wrap\n    return func(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 57, in get_guid\n    with open_scini_device() as fd:\n         ^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib64/python3.12/contextlib.py", line 137, in __enter__\n    return next(self.gen)\n           ^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 40, in open_scini_device\n    fd = os.open(SCINI_DEVICE_PATH, os.O_RDWR)\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\nFileNotFoundError: [Errno 2] No such file or directory: \'/dev/scini\'\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.754 2 ERROR os_brick.initiator.connectors.scaleio [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Error querying sdc guid: [Errno 2] No such file or directory: FileNotFoundError: [Errno 2] No such file or directory
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.756 2 INFO os_brick.initiator.connectors.scaleio [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Unable to find SDC guid: Error querying sdc guid: [Errno 2] No such file or directory
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.756 8181 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.776 8181 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.777 8181 DEBUG oslo.privsep.daemon [-] privsep: reply[ccb7ab12-5447-4e29-b076-1c8fa2afd254]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.778 8181 DEBUG oslo.privsep.daemon [-] privsep: reply[53747f4d-05e6-4132-a3ee-e001dae7dcd3]: (4, '12ce99da-db91-4763-aecd-1e4b4dea5907') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.779 2 DEBUG oslo_concurrency.processutils [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:53:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:31.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.820 2 DEBUG oslo_concurrency.processutils [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "nvme version" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.825 2 DEBUG os_brick.initiator.connectors.lightos [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:132
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.829 2 INFO os_brick.initiator.connectors.lightos [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Current host hostNQN nqn.2014-08.org.nvmexpress:uuid:abc6dbd1-bb80-4444-a621-a0ff0df4b0b1 and IP(s) are ['38.102.83.102', 'fe80::f816:3eff:feac:ccc9', '192.168.122.101', 'fe80::f816:3eff:fe94:c4d1', '172.19.0.101', 'fe80::3422:55ff:fe9c:82fe', '172.20.0.101', 'fe80::207f:eeff:fe24:3a42', '172.17.0.101', 'fe80::802d:e1ff:fe9f:c0b8', '172.18.0.101', 'fe80::6826:b6ff:fe2d:8355', 'fe80::b4df:6dff:fea0:9ac', 'fe80::fc16:3eff:fe6c:f256', 'fe80::d815:4ff:fe9e:89a4'] 
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.829 2 DEBUG os_brick.initiator.connectors.lightos [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:109
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.830 2 DEBUG os_brick.initiator.connectors.lightos [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:abc6dbd1-bb80-4444-a621-a0ff0df4b0b1 dsc:  get_connector_properties /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:112
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.830 2 DEBUG os_brick.utils [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] <== get_connector_properties: return (106ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'enforce_multipath': True, 'initiator': 'iqn.1994-05.com.redhat:d7bbbc2a579e', 'do_local_attach': False, 'nvme_hostid': 'abc6dbd1-bb80-4444-a621-a0ff0df4b0b1', 'system uuid': '12ce99da-db91-4763-aecd-1e4b4dea5907', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:abc6dbd1-bb80-4444-a621-a0ff0df4b0b1', 'nvme_native_multipath': True, 'found_dsc': '', 'host_ips': ['38.102.83.102', 'fe80::f816:3eff:feac:ccc9', '192.168.122.101', 'fe80::f816:3eff:fe94:c4d1', '172.19.0.101', 'fe80::3422:55ff:fe9c:82fe', '172.20.0.101', 'fe80::207f:eeff:fe24:3a42', '172.17.0.101', 'fe80::802d:e1ff:fe9f:c0b8', '172.18.0.101', 'fe80::6826:b6ff:fe2d:8355', 'fe80::b4df:6dff:fea0:9ac', 'fe80::fc16:3eff:fe6c:f256', 'fe80::d815:4ff:fe9e:89a4']} trace_logging_wrapper /usr/lib/python3.12/site-packages/os_brick/utils.py:204
Sep 30 18:53:31 compute-1 nova_compute[238822]: 2025-09-30 18:53:31.831 2 DEBUG nova.virt.block_device [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Updating existing volume attachment record: 37d5ee81-31cd-4683-9bea-88c9bfaf3999 _volume_attach /usr/lib/python3.12/site-packages/nova/virt/block_device.py:666
Sep 30 18:53:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:32.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:32 compute-1 nova_compute[238822]: 2025-09-30 18:53:32.962 2 DEBUG nova.virt.libvirt.driver [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Attempting to attach volume 67563176-066b-4509-9bae-1c4683a93b12 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2261
Sep 30 18:53:32 compute-1 nova_compute[238822]: 2025-09-30 18:53:32.965 2 DEBUG nova.virt.libvirt.guest [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] attach device xml: <disk type="network" device="disk">
Sep 30 18:53:32 compute-1 nova_compute[238822]:   <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Sep 30 18:53:32 compute-1 nova_compute[238822]:   <alias name="ua-67563176-066b-4509-9bae-1c4683a93b12"/>
Sep 30 18:53:32 compute-1 nova_compute[238822]:   <source protocol="rbd" name="volumes/volume-67563176-066b-4509-9bae-1c4683a93b12">
Sep 30 18:53:32 compute-1 nova_compute[238822]:     <host name="192.168.122.100" port="6789"/>
Sep 30 18:53:32 compute-1 nova_compute[238822]:     <host name="192.168.122.101" port="6789"/>
Sep 30 18:53:32 compute-1 nova_compute[238822]:   </source>
Sep 30 18:53:32 compute-1 nova_compute[238822]:   <auth username="openstack">
Sep 30 18:53:32 compute-1 nova_compute[238822]:     <secret type="ceph" uuid="63d32c6a-fa18-54ed-8711-9a3915cc367b"/>
Sep 30 18:53:32 compute-1 nova_compute[238822]:   </auth>
Sep 30 18:53:32 compute-1 nova_compute[238822]:   <target dev="vdb" bus="virtio"/>
Sep 30 18:53:32 compute-1 nova_compute[238822]:   <serial>67563176-066b-4509-9bae-1c4683a93b12</serial>
Sep 30 18:53:32 compute-1 nova_compute[238822]: </disk>
Sep 30 18:53:32 compute-1 nova_compute[238822]:  attach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:336
Sep 30 18:53:33 compute-1 ceph-mon[75484]: pgmap v2261: 353 pgs: 353 active+clean; 88 MiB data, 443 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 75 op/s
Sep 30 18:53:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3151049741' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Sep 30 18:53:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:33 compute-1 nova_compute[238822]: 2025-09-30 18:53:33.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:33 compute-1 podman[308488]: 2025-09-30 18:53:33.574346993 +0000 UTC m=+0.103247863 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Sep 30 18:53:33 compute-1 podman[308489]: 2025-09-30 18:53:33.584848485 +0000 UTC m=+0.101315551 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:53:33 compute-1 podman[308487]: 2025-09-30 18:53:33.59736221 +0000 UTC m=+0.127005379 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 18:53:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:33.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:33 compute-1 sshd-session[308464]: Invalid user template from 49.49.32.245 port 60542
Sep 30 18:53:33 compute-1 sshd-session[308464]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:53:33 compute-1 sshd-session[308464]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245
Sep 30 18:53:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:53:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:34.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:34 compute-1 nova_compute[238822]: 2025-09-30 18:53:34.628 2 DEBUG nova.virt.libvirt.driver [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:53:34 compute-1 nova_compute[238822]: 2025-09-30 18:53:34.629 2 DEBUG nova.virt.libvirt.driver [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:53:34 compute-1 nova_compute[238822]: 2025-09-30 18:53:34.630 2 DEBUG nova.virt.libvirt.driver [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 18:53:34 compute-1 nova_compute[238822]: 2025-09-30 18:53:34.630 2 DEBUG nova.virt.libvirt.driver [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] No VIF found with MAC fa:16:3e:6c:f2:56, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 18:53:35 compute-1 ceph-mon[75484]: pgmap v2262: 353 pgs: 353 active+clean; 88 MiB data, 443 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 77 op/s
Sep 30 18:53:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:35 compute-1 sshd-session[308546]: Invalid user test from 8.243.64.201 port 44120
Sep 30 18:53:35 compute-1 sshd-session[308546]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:53:35 compute-1 sshd-session[308546]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:53:35 compute-1 podman[249638]: time="2025-09-30T18:53:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:53:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:53:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:53:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:53:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8832 "" "Go-http-client/1.1"
Sep 30 18:53:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:35.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:36 compute-1 nova_compute[238822]: 2025-09-30 18:53:36.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:36 compute-1 ovn_controller[135204]: 2025-09-30T18:53:36Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6c:f2:56 10.100.0.12
Sep 30 18:53:36 compute-1 ovn_controller[135204]: 2025-09-30T18:53:36Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6c:f2:56 10.100.0.12
Sep 30 18:53:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:36.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:36 compute-1 nova_compute[238822]: 2025-09-30 18:53:36.376 2 DEBUG oslo_concurrency.lockutils [None req-d78ebfb9-c50a-432b-a043-0a0e4dda9346 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 5.567s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:53:36 compute-1 sshd-session[308464]: Failed password for invalid user template from 49.49.32.245 port 60542 ssh2
Sep 30 18:53:37 compute-1 ceph-mon[75484]: pgmap v2263: 353 pgs: 353 active+clean; 88 MiB data, 443 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 341 B/s wr, 67 op/s
Sep 30 18:53:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2235796451' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:53:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2235796451' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:53:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:37.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:37 compute-1 sshd-session[308546]: Failed password for invalid user test from 8.243.64.201 port 44120 ssh2
Sep 30 18:53:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:53:38 compute-1 sshd-session[308464]: Received disconnect from 49.49.32.245 port 60542:11: Bye Bye [preauth]
Sep 30 18:53:38 compute-1 sshd-session[308464]: Disconnected from invalid user template 49.49.32.245 port 60542 [preauth]
Sep 30 18:53:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:38.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:38 compute-1 nova_compute[238822]: 2025-09-30 18:53:38.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:39 compute-1 ceph-mon[75484]: pgmap v2264: 353 pgs: 353 active+clean; 88 MiB data, 443 MiB used, 40 GiB / 40 GiB avail; 1.9 MiB/s rd, 341 B/s wr, 67 op/s
Sep 30 18:53:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:53:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:39.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:40 compute-1 sshd-session[308546]: Received disconnect from 8.243.64.201 port 44120:11: Bye Bye [preauth]
Sep 30 18:53:40 compute-1 sshd-session[308546]: Disconnected from invalid user test 8.243.64.201 port 44120 [preauth]
Sep 30 18:53:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000053s ======
Sep 30 18:53:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:40.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Sep 30 18:53:41 compute-1 ceph-mon[75484]: pgmap v2265: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 131 op/s
Sep 30 18:53:41 compute-1 nova_compute[238822]: 2025-09-30 18:53:41.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:41.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:42.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:43 compute-1 ceph-mon[75484]: pgmap v2266: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 309 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Sep 30 18:53:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3891315952' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:53:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3891315952' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:53:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2854564858' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:53:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2854564858' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:53:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:43 compute-1 nova_compute[238822]: 2025-09-30 18:53:43.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:43.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:43 compute-1 sudo[308557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:53:43 compute-1 sudo[308557]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:53:43 compute-1 sudo[308557]: pam_unix(sudo:session): session closed for user root
Sep 30 18:53:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:53:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:44.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:45 compute-1 ceph-mon[75484]: pgmap v2267: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 309 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Sep 30 18:53:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:45.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:46 compute-1 nova_compute[238822]: 2025-09-30 18:53:46.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:46.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:46 compute-1 unix_chkpwd[308587]: password check failed for user (root)
Sep 30 18:53:46 compute-1 sshd-session[308583]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:53:47 compute-1 ceph-mon[75484]: pgmap v2268: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 307 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:53:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:47.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:48.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:48 compute-1 nova_compute[238822]: 2025-09-30 18:53:48.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:49 compute-1 sshd-session[308583]: Failed password for root from 192.210.160.141 port 39368 ssh2
Sep 30 18:53:49 compute-1 ceph-mon[75484]: pgmap v2269: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 307 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:53:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:53:49 compute-1 openstack_network_exporter[251957]: ERROR   18:53:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:53:49 compute-1 openstack_network_exporter[251957]: ERROR   18:53:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:53:49 compute-1 openstack_network_exporter[251957]: ERROR   18:53:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:53:49 compute-1 openstack_network_exporter[251957]: ERROR   18:53:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:53:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:53:49 compute-1 openstack_network_exporter[251957]: ERROR   18:53:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:53:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:53:49 compute-1 sshd-session[308583]: Connection closed by authenticating user root 192.210.160.141 port 39368 [preauth]
Sep 30 18:53:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:49.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:50.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:51 compute-1 nova_compute[238822]: 2025-09-30 18:53:51.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:51 compute-1 ceph-mon[75484]: pgmap v2270: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 307 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Sep 30 18:53:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:51.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:52 compute-1 ovn_controller[135204]: 2025-09-30T18:53:52Z|00330|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Sep 30 18:53:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:52.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:53 compute-1 ceph-mon[75484]: pgmap v2271: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 12 KiB/s wr, 0 op/s
Sep 30 18:53:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:53:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:53 compute-1 nova_compute[238822]: 2025-09-30 18:53:53.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:53.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:53:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:54.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:54.442 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:53:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:54.442 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:53:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:53:54.443 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:53:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:55 compute-1 ceph-mon[75484]: pgmap v2272: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 12 KiB/s wr, 1 op/s
Sep 30 18:53:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:55 compute-1 nova_compute[238822]: 2025-09-30 18:53:55.490 2 DEBUG oslo_concurrency.lockutils [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "3daba7c9-ccac-4d03-a63b-2f978730a440" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:53:55 compute-1 nova_compute[238822]: 2025-09-30 18:53:55.492 2 DEBUG oslo_concurrency.lockutils [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:53:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:55.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:56 compute-1 nova_compute[238822]: 2025-09-30 18:53:56.000 2 DEBUG nova.objects.instance [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lazy-loading 'flavor' on Instance uuid 3daba7c9-ccac-4d03-a63b-2f978730a440 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:53:56 compute-1 nova_compute[238822]: 2025-09-30 18:53:56.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:53:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:56.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:53:56 compute-1 nova_compute[238822]: 2025-09-30 18:53:56.520 2 INFO nova.compute.manager [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Detaching volume 67563176-066b-4509-9bae-1c4683a93b12
Sep 30 18:53:56 compute-1 podman[308600]: 2025-09-30 18:53:56.566864053 +0000 UTC m=+0.095278629 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:53:56 compute-1 podman[308599]: 2025-09-30 18:53:56.621108249 +0000 UTC m=+0.156432580 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 18:53:56 compute-1 nova_compute[238822]: 2025-09-30 18:53:56.629 2 INFO nova.virt.block_device [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Attempting to driver detach volume 67563176-066b-4509-9bae-1c4683a93b12 from mountpoint /dev/vdb
Sep 30 18:53:56 compute-1 nova_compute[238822]: 2025-09-30 18:53:56.644 2 DEBUG nova.virt.libvirt.driver [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Found disk vdb by alias ua-67563176-066b-4509-9bae-1c4683a93b12 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Sep 30 18:53:56 compute-1 nova_compute[238822]: 2025-09-30 18:53:56.648 2 DEBUG nova.virt.libvirt.driver [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Found disk vdb by alias ua-67563176-066b-4509-9bae-1c4683a93b12 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Sep 30 18:53:56 compute-1 nova_compute[238822]: 2025-09-30 18:53:56.648 2 DEBUG nova.virt.libvirt.driver [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Attempting to detach device vdb from instance 3daba7c9-ccac-4d03-a63b-2f978730a440 from the persistent domain config. _detach_from_persistent /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2576
Sep 30 18:53:56 compute-1 nova_compute[238822]: 2025-09-30 18:53:56.649 2 DEBUG nova.virt.libvirt.guest [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] detach device xml: <disk type="network" device="disk">
Sep 30 18:53:56 compute-1 nova_compute[238822]:   <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Sep 30 18:53:56 compute-1 nova_compute[238822]:   <alias name="ua-67563176-066b-4509-9bae-1c4683a93b12"/>
Sep 30 18:53:56 compute-1 nova_compute[238822]:   <source protocol="rbd" name="volumes/volume-67563176-066b-4509-9bae-1c4683a93b12">
Sep 30 18:53:56 compute-1 nova_compute[238822]:     <host name="192.168.122.100" port="6789"/>
Sep 30 18:53:56 compute-1 nova_compute[238822]:     <host name="192.168.122.101" port="6789"/>
Sep 30 18:53:56 compute-1 nova_compute[238822]:   </source>
Sep 30 18:53:56 compute-1 nova_compute[238822]:   <target dev="vdb" bus="virtio"/>
Sep 30 18:53:56 compute-1 nova_compute[238822]:   <serial>67563176-066b-4509-9bae-1c4683a93b12</serial>
Sep 30 18:53:56 compute-1 nova_compute[238822]:   <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Sep 30 18:53:56 compute-1 nova_compute[238822]: </disk>
Sep 30 18:53:56 compute-1 nova_compute[238822]:  detach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:466
Sep 30 18:53:56 compute-1 nova_compute[238822]: 2025-09-30 18:53:56.662 2 DEBUG nova.virt.libvirt.driver [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Found disk vdb by alias ua-67563176-066b-4509-9bae-1c4683a93b12 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Sep 30 18:53:56 compute-1 nova_compute[238822]: 2025-09-30 18:53:56.662 2 WARNING nova.virt.libvirt.driver [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Failed to detach device vdb from instance 3daba7c9-ccac-4d03-a63b-2f978730a440 from the persistent domain config. Libvirt did not report any error but the device is still in the config.
Sep 30 18:53:56 compute-1 nova_compute[238822]: 2025-09-30 18:53:56.662 2 DEBUG nova.virt.libvirt.driver [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] (1/8): Attempting to detach device vdb with device alias ua-67563176-066b-4509-9bae-1c4683a93b12 from instance 3daba7c9-ccac-4d03-a63b-2f978730a440 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2612
Sep 30 18:53:56 compute-1 nova_compute[238822]: 2025-09-30 18:53:56.663 2 DEBUG nova.virt.libvirt.guest [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] detach device xml: <disk type="network" device="disk">
Sep 30 18:53:56 compute-1 nova_compute[238822]:   <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Sep 30 18:53:56 compute-1 nova_compute[238822]:   <alias name="ua-67563176-066b-4509-9bae-1c4683a93b12"/>
Sep 30 18:53:56 compute-1 nova_compute[238822]:   <source protocol="rbd" name="volumes/volume-67563176-066b-4509-9bae-1c4683a93b12">
Sep 30 18:53:56 compute-1 nova_compute[238822]:     <host name="192.168.122.100" port="6789"/>
Sep 30 18:53:56 compute-1 nova_compute[238822]:     <host name="192.168.122.101" port="6789"/>
Sep 30 18:53:56 compute-1 nova_compute[238822]:   </source>
Sep 30 18:53:56 compute-1 nova_compute[238822]:   <target dev="vdb" bus="virtio"/>
Sep 30 18:53:56 compute-1 nova_compute[238822]:   <serial>67563176-066b-4509-9bae-1c4683a93b12</serial>
Sep 30 18:53:56 compute-1 nova_compute[238822]:   <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Sep 30 18:53:56 compute-1 nova_compute[238822]: </disk>
Sep 30 18:53:56 compute-1 nova_compute[238822]:  detach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:466
Sep 30 18:53:56 compute-1 nova_compute[238822]: 2025-09-30 18:53:56.804 2 DEBUG nova.virt.libvirt.driver [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Start waiting for the detach event from libvirt for device vdb with device alias ua-67563176-066b-4509-9bae-1c4683a93b12 for instance 3daba7c9-ccac-4d03-a63b-2f978730a440 _detach_from_live_and_wait_for_event /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2688
Sep 30 18:53:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:57 compute-1 ceph-mon[75484]: pgmap v2273: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:53:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:57.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2282182362' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:53:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2282182362' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:53:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:53:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:53:58.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:53:58 compute-1 podman[308657]: 2025-09-30 18:53:58.560000408 +0000 UTC m=+0.093434618 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 18:53:58 compute-1 nova_compute[238822]: 2025-09-30 18:53:58.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:53:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:53:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:53:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:53:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:53:59 compute-1 ceph-mon[75484]: pgmap v2274: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:53:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:53:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:53:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:53:59.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:54:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:00.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:01 compute-1 nova_compute[238822]: 2025-09-30 18:54:01.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:01 compute-1 ceph-mon[75484]: pgmap v2275: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:54:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:01.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:02 compute-1 nova_compute[238822]: 2025-09-30 18:54:02.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:54:02 compute-1 nova_compute[238822]: 2025-09-30 18:54:02.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:54:02 compute-1 nova_compute[238822]: 2025-09-30 18:54:02.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:54:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:02.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:02 compute-1 nova_compute[238822]: 2025-09-30 18:54:02.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:54:02 compute-1 nova_compute[238822]: 2025-09-30 18:54:02.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:54:02 compute-1 nova_compute[238822]: 2025-09-30 18:54:02.575 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:54:02 compute-1 nova_compute[238822]: 2025-09-30 18:54:02.575 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:54:02 compute-1 nova_compute[238822]: 2025-09-30 18:54:02.575 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:54:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:54:03 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/335007788' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:54:03 compute-1 nova_compute[238822]: 2025-09-30 18:54:03.052 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:54:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:03 compute-1 ceph-mon[75484]: pgmap v2276: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:54:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/335007788' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:54:03 compute-1 nova_compute[238822]: 2025-09-30 18:54:03.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:03.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:04 compute-1 sudo[308704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:54:04 compute-1 nova_compute[238822]: 2025-09-30 18:54:04.119 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000029 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:54:04 compute-1 nova_compute[238822]: 2025-09-30 18:54:04.121 2 DEBUG nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] skipping disk for instance-00000029 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12136
Sep 30 18:54:04 compute-1 sudo[308704]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:54:04 compute-1 sudo[308704]: pam_unix(sudo:session): session closed for user root
Sep 30 18:54:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:04 compute-1 podman[308730]: 2025-09-30 18:54:04.284563207 +0000 UTC m=+0.124335409 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 18:54:04 compute-1 podman[308729]: 2025-09-30 18:54:04.287880406 +0000 UTC m=+0.136772292 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Sep 30 18:54:04 compute-1 podman[308728]: 2025-09-30 18:54:04.307734929 +0000 UTC m=+0.161610229 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 18:54:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:54:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:04.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:04 compute-1 nova_compute[238822]: 2025-09-30 18:54:04.420 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:54:04 compute-1 nova_compute[238822]: 2025-09-30 18:54:04.422 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:54:04 compute-1 nova_compute[238822]: 2025-09-30 18:54:04.453 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:54:04 compute-1 nova_compute[238822]: 2025-09-30 18:54:04.454 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4439MB free_disk=39.94666290283203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:54:04 compute-1 nova_compute[238822]: 2025-09-30 18:54:04.455 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:54:04 compute-1 nova_compute[238822]: 2025-09-30 18:54:04.455 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:54:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:05 compute-1 ceph-mon[75484]: pgmap v2277: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:54:05 compute-1 nova_compute[238822]: 2025-09-30 18:54:05.537 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Instance 3daba7c9-ccac-4d03-a63b-2f978730a440 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 18:54:05 compute-1 nova_compute[238822]: 2025-09-30 18:54:05.538 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:54:05 compute-1 nova_compute[238822]: 2025-09-30 18:54:05.538 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=39GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:54:04 up  4:31,  0 user,  load average: 0.79, 0.48, 0.48\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_127ca83529de45efa0a76aa8ceefcd3d': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:54:05 compute-1 nova_compute[238822]: 2025-09-30 18:54:05.604 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:54:05 compute-1 podman[249638]: time="2025-09-30T18:54:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:54:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:54:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 37993 "" "Go-http-client/1.1"
Sep 30 18:54:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:54:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8829 "" "Go-http-client/1.1"
Sep 30 18:54:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:05.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:06 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:54:06 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/509061425' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:54:06 compute-1 nova_compute[238822]: 2025-09-30 18:54:06.085 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:54:06 compute-1 nova_compute[238822]: 2025-09-30 18:54:06.094 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:54:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:06 compute-1 nova_compute[238822]: 2025-09-30 18:54:06.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:06.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1895805963' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:54:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/509061425' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:54:06 compute-1 nova_compute[238822]: 2025-09-30 18:54:06.602 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:54:07 compute-1 nova_compute[238822]: 2025-09-30 18:54:07.121 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:54:07 compute-1 nova_compute[238822]: 2025-09-30 18:54:07.122 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.667s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:54:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:07 compute-1 ceph-mon[75484]: pgmap v2278: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:54:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/20938895' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:54:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:54:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:07.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:08 compute-1 nova_compute[238822]: 2025-09-30 18:54:08.119 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:54:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:08.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:08 compute-1 nova_compute[238822]: 2025-09-30 18:54:08.634 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:54:08 compute-1 nova_compute[238822]: 2025-09-30 18:54:08.635 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:54:08 compute-1 nova_compute[238822]: 2025-09-30 18:54:08.635 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:54:08 compute-1 nova_compute[238822]: 2025-09-30 18:54:08.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:54:09 compute-1 ceph-mon[75484]: pgmap v2279: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 1023 B/s wr, 0 op/s
Sep 30 18:54:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:09.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:10.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:11 compute-1 sudo[308821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:54:11 compute-1 nova_compute[238822]: 2025-09-30 18:54:11.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:11 compute-1 sudo[308821]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:54:11 compute-1 sudo[308821]: pam_unix(sudo:session): session closed for user root
Sep 30 18:54:11 compute-1 ceph-mon[75484]: pgmap v2280: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 3.3 KiB/s wr, 1 op/s
Sep 30 18:54:11 compute-1 sudo[308846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:54:11 compute-1 sudo[308846]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:54:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:11.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:12 compute-1 sudo[308846]: pam_unix(sudo:session): session closed for user root
Sep 30 18:54:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:54:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:12.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:54:12 compute-1 nova_compute[238822]: 2025-09-30 18:54:12.568 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:13 compute-1 ceph-mon[75484]: pgmap v2281: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 2.3 KiB/s wr, 0 op/s
Sep 30 18:54:13 compute-1 nova_compute[238822]: 2025-09-30 18:54:13.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:13.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:14 compute-1 nova_compute[238822]: 2025-09-30 18:54:14.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:54:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:54:14 compute-1 unix_chkpwd[308905]: password check failed for user (root)
Sep 30 18:54:14 compute-1 sshd-session[308901]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:54:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:14.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:15 compute-1 ceph-mon[75484]: pgmap v2282: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 2.3 KiB/s wr, 1 op/s
Sep 30 18:54:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:54:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:54:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:15.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:16 compute-1 sshd-session[308901]: Failed password for root from 192.210.160.141 port 48336 ssh2
Sep 30 18:54:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:16 compute-1 nova_compute[238822]: 2025-09-30 18:54:16.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:16.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:54:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:54:16 compute-1 ceph-mon[75484]: pgmap v2283: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 515 B/s rd, 2.3 KiB/s wr, 1 op/s
Sep 30 18:54:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:54:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:54:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:54:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:54:16 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:54:16 compute-1 nova_compute[238822]: 2025-09-30 18:54:16.807 2 WARNING nova.virt.libvirt.driver [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Waiting for libvirt event about the detach of device vdb with device alias ua-67563176-066b-4509-9bae-1c4683a93b12 from instance 3daba7c9-ccac-4d03-a63b-2f978730a440 is timed out.
Sep 30 18:54:16 compute-1 nova_compute[238822]: 2025-09-30 18:54:16.819 2 INFO nova.virt.libvirt.driver [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Successfully detached device vdb from instance 3daba7c9-ccac-4d03-a63b-2f978730a440 from the live domain config.
Sep 30 18:54:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:17 compute-1 sshd-session[308901]: Connection closed by authenticating user root 192.210.160.141 port 48336 [preauth]
Sep 30 18:54:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:17.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:17 compute-1 ceph-mon[75484]: pgmap v2284: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 515 B/s rd, 2.3 KiB/s wr, 1 op/s
Sep 30 18:54:18 compute-1 nova_compute[238822]: 2025-09-30 18:54:18.052 2 DEBUG oslo_concurrency.lockutils [None req-aa8e7404-e9ec-4d3f-8d09-53ca149b3d17 e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 22.560s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:54:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:18.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:18 compute-1 nova_compute[238822]: 2025-09-30 18:54:18.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:19 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1344533578' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:54:19 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1344533578' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:54:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:54:19 compute-1 openstack_network_exporter[251957]: ERROR   18:54:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:54:19 compute-1 openstack_network_exporter[251957]: ERROR   18:54:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:54:19 compute-1 openstack_network_exporter[251957]: ERROR   18:54:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:54:19 compute-1 openstack_network_exporter[251957]: ERROR   18:54:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:54:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:54:19 compute-1 openstack_network_exporter[251957]: ERROR   18:54:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:54:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:54:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:19.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:20 compute-1 ceph-mon[75484]: pgmap v2285: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 2.4 KiB/s rd, 2.3 KiB/s wr, 3 op/s
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.178 2 DEBUG oslo_concurrency.lockutils [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "3daba7c9-ccac-4d03-a63b-2f978730a440" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.178 2 DEBUG oslo_concurrency.lockutils [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.179 2 DEBUG oslo_concurrency.lockutils [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.179 2 DEBUG oslo_concurrency.lockutils [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.179 2 DEBUG oslo_concurrency.lockutils [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.194 2 INFO nova.compute.manager [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Terminating instance
Sep 30 18:54:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:20.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.714 2 DEBUG nova.compute.manager [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 18:54:20 compute-1 kernel: tapd41448fe-ba (unregistering): left promiscuous mode
Sep 30 18:54:20 compute-1 NetworkManager[45549]: <info>  [1759258460.7773] device (tapd41448fe-ba): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 18:54:20 compute-1 ovn_controller[135204]: 2025-09-30T18:54:20Z|00331|binding|INFO|Releasing lport d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c from this chassis (sb_readonly=0)
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:20 compute-1 ovn_controller[135204]: 2025-09-30T18:54:20Z|00332|binding|INFO|Setting lport d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c down in Southbound
Sep 30 18:54:20 compute-1 ovn_controller[135204]: 2025-09-30T18:54:20Z|00333|binding|INFO|Removing iface tapd41448fe-ba ovn-installed in OVS
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:20.805 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:f2:56 10.100.0.12'], port_security=['fa:16:3e:6c:f2:56 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3daba7c9-ccac-4d03-a63b-2f978730a440', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '127ca83529de45efa0a76aa8ceefcd3d', 'neutron:revision_number': '5', 'neutron:security_group_ids': '126a2d65-c072-4128-836f-db6080f798dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2a3bc33b-b1e3-4a2f-8784-2e8238744730, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>], logical_port=d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9c16f77350>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:54:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:20.806 144543 INFO neutron.agent.ovn.metadata.agent [-] Port d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c in datapath cee64377-b6b9-46f2-8d77-c7978d4cc7a0 unbound from our chassis
Sep 30 18:54:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:20.808 144543 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cee64377-b6b9-46f2-8d77-c7978d4cc7a0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 18:54:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:20.811 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[6f9f6f04-cae4-40fd-92d3-d0ea21e580e7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:54:20 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:20.812 144543 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0 namespace which is not needed anymore
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:20 compute-1 sudo[308912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:54:20 compute-1 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000029.scope: Deactivated successfully.
Sep 30 18:54:20 compute-1 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000029.scope: Consumed 16.279s CPU time.
Sep 30 18:54:20 compute-1 sudo[308912]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:54:20 compute-1 systemd-machined[195911]: Machine qemu-31-instance-00000029 terminated.
Sep 30 18:54:20 compute-1 sudo[308912]: pam_unix(sudo:session): session closed for user root
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.967 2 INFO nova.virt.libvirt.driver [-] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Instance destroyed successfully.
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.968 2 DEBUG nova.objects.instance [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lazy-loading 'resources' on Instance uuid 3daba7c9-ccac-4d03-a63b-2f978730a440 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.998 2 DEBUG nova.compute.manager [req-7dca92e4-aab7-4b19-a498-8ccf2e24bd10 req-b3ab247f-f325-4650-be36-31f1a43ad1d4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Received event network-vif-unplugged-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:54:20 compute-1 nova_compute[238822]: 2025-09-30 18:54:20.999 2 DEBUG oslo_concurrency.lockutils [req-7dca92e4-aab7-4b19-a498-8ccf2e24bd10 req-b3ab247f-f325-4650-be36-31f1a43ad1d4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.000 2 DEBUG oslo_concurrency.lockutils [req-7dca92e4-aab7-4b19-a498-8ccf2e24bd10 req-b3ab247f-f325-4650-be36-31f1a43ad1d4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.000 2 DEBUG oslo_concurrency.lockutils [req-7dca92e4-aab7-4b19-a498-8ccf2e24bd10 req-b3ab247f-f325-4650-be36-31f1a43ad1d4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.001 2 DEBUG nova.compute.manager [req-7dca92e4-aab7-4b19-a498-8ccf2e24bd10 req-b3ab247f-f325-4650-be36-31f1a43ad1d4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] No waiting events found dispatching network-vif-unplugged-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.002 2 DEBUG nova.compute.manager [req-7dca92e4-aab7-4b19-a498-8ccf2e24bd10 req-b3ab247f-f325-4650-be36-31f1a43ad1d4 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Received event network-vif-unplugged-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:54:21 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[308338]: [NOTICE]   (308343) : haproxy version is 3.0.5-8e879a5
Sep 30 18:54:21 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[308338]: [NOTICE]   (308343) : path to executable is /usr/sbin/haproxy
Sep 30 18:54:21 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[308338]: [WARNING]  (308343) : Exiting Master process...
Sep 30 18:54:21 compute-1 podman[308964]: 2025-09-30 18:54:21.030201728 +0000 UTC m=+0.066257829 container kill 8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:54:21 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[308338]: [ALERT]    (308343) : Current worker (308345) exited with code 143 (Terminated)
Sep 30 18:54:21 compute-1 neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0[308338]: [WARNING]  (308343) : All workers exited. Exiting... (0)
Sep 30 18:54:21 compute-1 systemd[1]: libpod-8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9.scope: Deactivated successfully.
Sep 30 18:54:21 compute-1 podman[308986]: 2025-09-30 18:54:21.110533145 +0000 UTC m=+0.047774184 container died 8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 18:54:21 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9-userdata-shm.mount: Deactivated successfully.
Sep 30 18:54:21 compute-1 systemd[1]: var-lib-containers-storage-overlay-7c5e62a4517a75e3f2e289792d31a2fcf5e0600b9ea6c7517c613110e9515175-merged.mount: Deactivated successfully.
Sep 30 18:54:21 compute-1 podman[308986]: 2025-09-30 18:54:21.165953182 +0000 UTC m=+0.103194201 container cleanup 8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:54:21 compute-1 systemd[1]: libpod-conmon-8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9.scope: Deactivated successfully.
Sep 30 18:54:21 compute-1 podman[308988]: 2025-09-30 18:54:21.19381165 +0000 UTC m=+0.117716951 container remove 8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:54:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:21.203 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[ce1dbadf-19ac-4f02-aae9-b5576fb21541]: (4, ("Tue Sep 30 06:54:20 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0 (8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9)\n8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9\nTue Sep 30 06:54:21 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0 (8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9)\n8da25400538e5ba4690a445dc41d3cbcc6c477a43fda175a334d03db6b3efea9\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:54:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:21.205 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[87fd0ff1-6380-47dc-823f-6b856465c8d5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:54:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:21.206 144543 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cee64377-b6b9-46f2-8d77-c7978d4cc7a0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 18:54:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:21.207 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[3e943709-2d7d-43e7-a72c-487ad0869c53]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:54:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:21.208 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcee64377-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:21 compute-1 kernel: tapcee64377-b0: left promiscuous mode
Sep 30 18:54:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:21.282 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[a11ea3d2-9d8e-4353-a713-94e40c5e1749]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:54:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:21.313 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[258b8721-f31a-4065-9d85-78ec8a2d5e1f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:54:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:21.314 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[53001335-b77e-4930-b5bd-4c5029275418]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:21.331 262759 DEBUG oslo.privsep.daemon [-] privsep: reply[eee4d0e4-b048-47c5-a812-2e78c11f07fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1624380, 'reachable_time': 41361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309023, 'error': None, 'target': 'ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:54:21 compute-1 systemd[1]: run-netns-ovnmeta\x2dcee64377\x2db6b9\x2d46f2\x2d8d77\x2dc7978d4cc7a0.mount: Deactivated successfully.
Sep 30 18:54:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:21.340 144666 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cee64377-b6b9-46f2-8d77-c7978d4cc7a0 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 18:54:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:21.342 144666 DEBUG oslo.privsep.daemon [-] privsep: reply[66c6a964-34a0-4e96-8b84-e2f3a72ebe50]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 18:54:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:21.388 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:21 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:21.389 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.476 2 DEBUG nova.virt.libvirt.vif [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T18:53:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1848850950',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-184885095',id=41,image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T18:53:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='127ca83529de45efa0a76aa8ceefcd3d',ramdisk_id='',reservation_id='r-tckvj671',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='5b99cbca-b655-4be5-8343-cf504005c42e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540',owner_user_name='tempest-TestExecuteZoneMigrationStrategyVolume-1619382540-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T18:53:24Z,user_data=None,user_id='e80b7fccb5a34c13b356857340eff1ee',uuid=3daba7c9-ccac-4d03-a63b-2f978730a440,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "address": "fa:16:3e:6c:f2:56", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd41448fe-ba", "ovs_interfaceid": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.477 2 DEBUG nova.network.os_vif_util [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Converting VIF {"id": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "address": "fa:16:3e:6c:f2:56", "network": {"id": "cee64377-b6b9-46f2-8d77-c7978d4cc7a0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-270339558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2790c8e9fb6a48debd443ac79e2e12ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd41448fe-ba", "ovs_interfaceid": "d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.478 2 DEBUG nova.network.os_vif_util [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:f2:56,bridge_name='br-int',has_traffic_filtering=True,id=d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd41448fe-ba') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.478 2 DEBUG os_vif [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:f2:56,bridge_name='br-int',has_traffic_filtering=True,id=d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd41448fe-ba') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.481 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd41448fe-ba, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.488 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4aa33ddf-ea57-447b-857d-1403831ce0f0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.494 2 INFO os_vif [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:f2:56,bridge_name='br-int',has_traffic_filtering=True,id=d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c,network=Network(cee64377-b6b9-46f2-8d77-c7978d4cc7a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd41448fe-ba')
Sep 30 18:54:21 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:54:21 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:54:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:21.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.967 2 INFO nova.virt.libvirt.driver [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Deleting instance files /var/lib/nova/instances/3daba7c9-ccac-4d03-a63b-2f978730a440_del
Sep 30 18:54:21 compute-1 nova_compute[238822]: 2025-09-30 18:54:21.968 2 INFO nova.virt.libvirt.driver [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Deletion of /var/lib/nova/instances/3daba7c9-ccac-4d03-a63b-2f978730a440_del complete
Sep 30 18:54:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:22.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:22 compute-1 nova_compute[238822]: 2025-09-30 18:54:22.485 2 INFO nova.compute.manager [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Took 1.77 seconds to destroy the instance on the hypervisor.
Sep 30 18:54:22 compute-1 nova_compute[238822]: 2025-09-30 18:54:22.485 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 18:54:22 compute-1 nova_compute[238822]: 2025-09-30 18:54:22.486 2 DEBUG nova.compute.manager [-] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 18:54:22 compute-1 nova_compute[238822]: 2025-09-30 18:54:22.486 2 DEBUG nova.network.neutron [-] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 18:54:22 compute-1 nova_compute[238822]: 2025-09-30 18:54:22.487 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:54:22 compute-1 ceph-mon[75484]: pgmap v2286: 353 pgs: 353 active+clean; 121 MiB data, 452 MiB used, 40 GiB / 40 GiB avail; 2.2 KiB/s rd, 2 op/s
Sep 30 18:54:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:54:23 compute-1 nova_compute[238822]: 2025-09-30 18:54:23.064 2 DEBUG nova.compute.manager [req-7b8a230e-4a2b-4ac4-975d-7cfbe0dd27f2 req-d67432b5-da8f-4663-bb80-693c3f32669b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Received event network-vif-unplugged-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:54:23 compute-1 nova_compute[238822]: 2025-09-30 18:54:23.065 2 DEBUG oslo_concurrency.lockutils [req-7b8a230e-4a2b-4ac4-975d-7cfbe0dd27f2 req-d67432b5-da8f-4663-bb80-693c3f32669b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Acquiring lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:54:23 compute-1 nova_compute[238822]: 2025-09-30 18:54:23.065 2 DEBUG oslo_concurrency.lockutils [req-7b8a230e-4a2b-4ac4-975d-7cfbe0dd27f2 req-d67432b5-da8f-4663-bb80-693c3f32669b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:54:23 compute-1 nova_compute[238822]: 2025-09-30 18:54:23.065 2 DEBUG oslo_concurrency.lockutils [req-7b8a230e-4a2b-4ac4-975d-7cfbe0dd27f2 req-d67432b5-da8f-4663-bb80-693c3f32669b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:54:23 compute-1 nova_compute[238822]: 2025-09-30 18:54:23.066 2 DEBUG nova.compute.manager [req-7b8a230e-4a2b-4ac4-975d-7cfbe0dd27f2 req-d67432b5-da8f-4663-bb80-693c3f32669b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] No waiting events found dispatching network-vif-unplugged-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 18:54:23 compute-1 nova_compute[238822]: 2025-09-30 18:54:23.066 2 DEBUG nova.compute.manager [req-7b8a230e-4a2b-4ac4-975d-7cfbe0dd27f2 req-d67432b5-da8f-4663-bb80-693c3f32669b 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Received event network-vif-unplugged-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 18:54:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:23 compute-1 sshd-session[309045]: Invalid user sol from 45.148.10.240 port 33936
Sep 30 18:54:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:23 compute-1 nova_compute[238822]: 2025-09-30 18:54:23.318 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 18:54:23 compute-1 sshd-session[309045]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:54:23 compute-1 sshd-session[309045]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.148.10.240
Sep 30 18:54:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:23.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:23 compute-1 ceph-mon[75484]: pgmap v2287: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 30 KiB/s rd, 1.4 KiB/s wr, 42 op/s
Sep 30 18:54:24 compute-1 nova_compute[238822]: 2025-09-30 18:54:24.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:54:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:24 compute-1 sudo[309048]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:54:24 compute-1 sudo[309048]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:54:24 compute-1 sudo[309048]: pam_unix(sudo:session): session closed for user root
Sep 30 18:54:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:54:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:24.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:24 compute-1 nova_compute[238822]: 2025-09-30 18:54:24.865 2 DEBUG nova.network.neutron [-] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 18:54:25 compute-1 nova_compute[238822]: 2025-09-30 18:54:25.128 2 DEBUG nova.compute.manager [req-c889b623-b659-4c8a-bd00-ceaefa8271ce req-8138ddf7-e5bc-4591-94be-9edb839d4970 44d5b77b33de420994797a8a6d5d109d faf843bff6f4482db6a9e1e7c4cd2a24 - - default default] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Received event network-vif-deleted-d41448fe-bacd-4dda-bdd5-1ea7f20c9c7c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 18:54:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:25 compute-1 nova_compute[238822]: 2025-09-30 18:54:25.374 2 INFO nova.compute.manager [-] [instance: 3daba7c9-ccac-4d03-a63b-2f978730a440] Took 2.89 seconds to deallocate network for instance.
Sep 30 18:54:25 compute-1 sshd-session[309045]: Failed password for invalid user sol from 45.148.10.240 port 33936 ssh2
Sep 30 18:54:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:25.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:25 compute-1 nova_compute[238822]: 2025-09-30 18:54:25.900 2 DEBUG oslo_concurrency.lockutils [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:54:25 compute-1 nova_compute[238822]: 2025-09-30 18:54:25.901 2 DEBUG oslo_concurrency.lockutils [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:54:25 compute-1 ceph-mon[75484]: pgmap v2288: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 30 KiB/s rd, 1.4 KiB/s wr, 41 op/s
Sep 30 18:54:25 compute-1 nova_compute[238822]: 2025-09-30 18:54:25.962 2 DEBUG oslo_concurrency.processutils [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:54:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:26 compute-1 nova_compute[238822]: 2025-09-30 18:54:26.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:26.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:26 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:54:26 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1484437556' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:54:26 compute-1 nova_compute[238822]: 2025-09-30 18:54:26.474 2 DEBUG oslo_concurrency.processutils [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:54:26 compute-1 nova_compute[238822]: 2025-09-30 18:54:26.481 2 DEBUG nova.compute.provider_tree [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:54:26 compute-1 nova_compute[238822]: 2025-09-30 18:54:26.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:26 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1484437556' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:54:26 compute-1 nova_compute[238822]: 2025-09-30 18:54:26.989 2 DEBUG nova.scheduler.client.report [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:54:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:27 compute-1 nova_compute[238822]: 2025-09-30 18:54:27.502 2 DEBUG oslo_concurrency.lockutils [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.601s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:54:27 compute-1 nova_compute[238822]: 2025-09-30 18:54:27.558 2 INFO nova.scheduler.client.report [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Deleted allocations for instance 3daba7c9-ccac-4d03-a63b-2f978730a440
Sep 30 18:54:27 compute-1 podman[309100]: 2025-09-30 18:54:27.57270084 +0000 UTC m=+0.100879779 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:54:27 compute-1 podman[309099]: 2025-09-30 18:54:27.640999603 +0000 UTC m=+0.168846903 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 18:54:27 compute-1 sshd-session[309045]: Connection closed by invalid user sol 45.148.10.240 port 33936 [preauth]
Sep 30 18:54:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:27.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:27 compute-1 ceph-mon[75484]: pgmap v2289: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 29 KiB/s rd, 1.4 KiB/s wr, 41 op/s
Sep 30 18:54:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:28.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:28 compute-1 nova_compute[238822]: 2025-09-30 18:54:28.760 2 DEBUG oslo_concurrency.lockutils [None req-6c5e0bf6-e026-42cc-9dd0-21fb95ac332c e80b7fccb5a34c13b356857340eff1ee 127ca83529de45efa0a76aa8ceefcd3d - - default default] Lock "3daba7c9-ccac-4d03-a63b-2f978730a440" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.582s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:54:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:54:29 compute-1 podman[309148]: 2025-09-30 18:54:29.549049226 +0000 UTC m=+0.085330402 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:54:29 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3446365784' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:54:29 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3446365784' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:54:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:29.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:30 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:30.391 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 18:54:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:30.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:30 compute-1 ceph-mon[75484]: pgmap v2290: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 30 KiB/s rd, 1.4 KiB/s wr, 41 op/s
Sep 30 18:54:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:31 compute-1 nova_compute[238822]: 2025-09-30 18:54:31.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:31 compute-1 nova_compute[238822]: 2025-09-30 18:54:31.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:31.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:31 compute-1 ceph-mon[75484]: pgmap v2291: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 28 KiB/s rd, 1.4 KiB/s wr, 39 op/s
Sep 30 18:54:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:32.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:33 compute-1 nova_compute[238822]: 2025-09-30 18:54:33.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:33.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:33 compute-1 ceph-mon[75484]: pgmap v2292: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 38 KiB/s rd, 1.7 KiB/s wr, 53 op/s
Sep 30 18:54:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:54:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:34.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:34 compute-1 podman[309172]: 2025-09-30 18:54:34.547146035 +0000 UTC m=+0.085749793 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=iscsid, org.label-schema.schema-version=1.0)
Sep 30 18:54:34 compute-1 podman[309174]: 2025-09-30 18:54:34.568183579 +0000 UTC m=+0.093292775 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=watcher_latest, config_id=multipathd)
Sep 30 18:54:34 compute-1 podman[309173]: 2025-09-30 18:54:34.575567337 +0000 UTC m=+0.105147453 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 18:54:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:35 compute-1 podman[249638]: time="2025-09-30T18:54:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:54:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:54:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:54:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:54:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8368 "" "Go-http-client/1.1"
Sep 30 18:54:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:35.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:35 compute-1 ceph-mon[75484]: pgmap v2293: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 10 KiB/s rd, 255 B/s wr, 13 op/s
Sep 30 18:54:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:36 compute-1 nova_compute[238822]: 2025-09-30 18:54:36.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:36.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:36 compute-1 nova_compute[238822]: 2025-09-30 18:54:36.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3518998138' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:54:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3518998138' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:54:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:37 compute-1 sshd-session[309232]: Invalid user geoserver from 161.132.50.17 port 58722
Sep 30 18:54:37 compute-1 sshd-session[309232]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:54:37 compute-1 sshd-session[309232]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 18:54:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:37.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:54:37 compute-1 ceph-mon[75484]: pgmap v2294: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 10 KiB/s rd, 255 B/s wr, 13 op/s
Sep 30 18:54:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:54:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:38.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:54:38 compute-1 sshd-session[309232]: Failed password for invalid user geoserver from 161.132.50.17 port 58722 ssh2
Sep 30 18:54:39 compute-1 sshd-session[309232]: Received disconnect from 161.132.50.17 port 58722:11: Bye Bye [preauth]
Sep 30 18:54:39 compute-1 sshd-session[309232]: Disconnected from invalid user geoserver 161.132.50.17 port 58722 [preauth]
Sep 30 18:54:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:54:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:39.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:39 compute-1 ceph-mon[75484]: pgmap v2295: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 11 KiB/s rd, 255 B/s wr, 14 op/s
Sep 30 18:54:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:40.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:41 compute-1 nova_compute[238822]: 2025-09-30 18:54:41.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:41 compute-1 nova_compute[238822]: 2025-09-30 18:54:41.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:41 compute-1 sshd-session[309237]: Invalid user ahmed from 192.210.160.141 port 46664
Sep 30 18:54:41 compute-1 sshd-session[309237]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:54:41 compute-1 sshd-session[309237]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:54:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:41.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:41 compute-1 ceph-mon[75484]: pgmap v2296: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 10 KiB/s rd, 255 B/s wr, 13 op/s
Sep 30 18:54:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:42.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:42 compute-1 sshd-session[309241]: Invalid user myuser from 49.49.32.245 port 55736
Sep 30 18:54:42 compute-1 sshd-session[309241]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:54:42 compute-1 sshd-session[309241]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245
Sep 30 18:54:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:43 compute-1 sshd[170789]: Timeout before authentication for connection from 14.103.105.56 to 38.102.83.102, pid = 307380
Sep 30 18:54:43 compute-1 sshd-session[309237]: Failed password for invalid user ahmed from 192.210.160.141 port 46664 ssh2
Sep 30 18:54:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:43.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:43 compute-1 ceph-mon[75484]: pgmap v2297: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 11 KiB/s rd, 255 B/s wr, 14 op/s
Sep 30 18:54:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:54:44 compute-1 sudo[309246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:54:44 compute-1 sudo[309246]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:54:44 compute-1 sudo[309246]: pam_unix(sudo:session): session closed for user root
Sep 30 18:54:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:44.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:44 compute-1 sshd-session[309241]: Failed password for invalid user myuser from 49.49.32.245 port 55736 ssh2
Sep 30 18:54:45 compute-1 sshd-session[309271]: Invalid user hehe from 8.243.64.201 port 33452
Sep 30 18:54:45 compute-1 sshd-session[309271]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:54:45 compute-1 sshd-session[309271]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:54:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:45 compute-1 sshd-session[309241]: Received disconnect from 49.49.32.245 port 55736:11: Bye Bye [preauth]
Sep 30 18:54:45 compute-1 sshd-session[309241]: Disconnected from invalid user myuser 49.49.32.245 port 55736 [preauth]
Sep 30 18:54:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:45.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:45 compute-1 ceph-mon[75484]: pgmap v2298: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:54:46 compute-1 sshd-session[309237]: Connection closed by invalid user ahmed 192.210.160.141 port 46664 [preauth]
Sep 30 18:54:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:54:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:46.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:54:46 compute-1 nova_compute[238822]: 2025-09-30 18:54:46.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:46 compute-1 nova_compute[238822]: 2025-09-30 18:54:46.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:47 compute-1 sshd-session[309271]: Failed password for invalid user hehe from 8.243.64.201 port 33452 ssh2
Sep 30 18:54:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:47.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:47 compute-1 ceph-mon[75484]: pgmap v2299: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #127. Immutable memtables: 0.
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:47.945395) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 79] Flushing memtable with next log file: 127
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258487945503, "job": 79, "event": "flush_started", "num_memtables": 1, "num_entries": 1206, "num_deletes": 251, "total_data_size": 2740373, "memory_usage": 2778560, "flush_reason": "Manual Compaction"}
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 79] Level-0 flush table #128: started
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258487961462, "cf_name": "default", "job": 79, "event": "table_file_creation", "file_number": 128, "file_size": 1776786, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 61340, "largest_seqno": 62541, "table_properties": {"data_size": 1771557, "index_size": 2688, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11675, "raw_average_key_size": 19, "raw_value_size": 1760943, "raw_average_value_size": 3015, "num_data_blocks": 119, "num_entries": 584, "num_filter_entries": 584, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759258391, "oldest_key_time": 1759258391, "file_creation_time": 1759258487, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 128, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 79] Flush lasted 16424 microseconds, and 9895 cpu microseconds.
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:47.961828) [db/flush_job.cc:967] [default] [JOB 79] Level-0 flush table #128: 1776786 bytes OK
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:47.961919) [db/memtable_list.cc:519] [default] Level-0 commit table #128 started
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:47.963865) [db/memtable_list.cc:722] [default] Level-0 commit table #128: memtable #1 done
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:47.963886) EVENT_LOG_v1 {"time_micros": 1759258487963879, "job": 79, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:47.963912) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 79] Try to delete WAL files size 2734536, prev total WAL file size 2734536, number of live WAL files 2.
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000124.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:47.965772) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035323731' seq:72057594037927935, type:22 .. '7061786F730035353233' seq:0, type:0; will stop at (end)
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 80] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 79 Base level 0, inputs: [128(1735KB)], [126(10MB)]
Sep 30 18:54:47 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258487965822, "job": 80, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [128], "files_L6": [126], "score": -1, "input_data_size": 13274145, "oldest_snapshot_seqno": -1}
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 80] Generated table #129: 7856 keys, 11261240 bytes, temperature: kUnknown
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258488033273, "cf_name": "default", "job": 80, "event": "table_file_creation", "file_number": 129, "file_size": 11261240, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11215414, "index_size": 25105, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19653, "raw_key_size": 208644, "raw_average_key_size": 26, "raw_value_size": 11081559, "raw_average_value_size": 1410, "num_data_blocks": 964, "num_entries": 7856, "num_filter_entries": 7856, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759258487, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 129, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:48.034088) [db/compaction/compaction_job.cc:1663] [default] [JOB 80] Compacted 1@0 + 1@6 files to L6 => 11261240 bytes
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:48.035773) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 195.2 rd, 165.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 11.0 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(13.8) write-amplify(6.3) OK, records in: 8372, records dropped: 516 output_compression: NoCompression
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:48.035805) EVENT_LOG_v1 {"time_micros": 1759258488035790, "job": 80, "event": "compaction_finished", "compaction_time_micros": 67988, "compaction_time_cpu_micros": 50477, "output_level": 6, "num_output_files": 1, "total_output_size": 11261240, "num_input_records": 8372, "num_output_records": 7856, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000128.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258488036848, "job": 80, "event": "table_file_deletion", "file_number": 128}
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000126.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258488041106, "job": 80, "event": "table_file_deletion", "file_number": 126}
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:47.965585) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:48.041226) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:48.041237) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:48.041240) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:48.041244) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:54:48 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:54:48.041249) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:54:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:48 compute-1 sshd-session[309276]: Invalid user hacluster from 185.156.73.233 port 24826
Sep 30 18:54:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:48.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:48 compute-1 sshd-session[309271]: Received disconnect from 8.243.64.201 port 33452:11: Bye Bye [preauth]
Sep 30 18:54:48 compute-1 sshd-session[309271]: Disconnected from invalid user hehe 8.243.64.201 port 33452 [preauth]
Sep 30 18:54:48 compute-1 sshd-session[309276]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:54:48 compute-1 sshd-session[309276]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233
Sep 30 18:54:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:49 compute-1 unix_chkpwd[309282]: password check failed for user (root)
Sep 30 18:54:49 compute-1 sshd-session[309278]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105  user=root
Sep 30 18:54:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:54:49 compute-1 openstack_network_exporter[251957]: ERROR   18:54:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:54:49 compute-1 openstack_network_exporter[251957]: ERROR   18:54:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:54:49 compute-1 openstack_network_exporter[251957]: ERROR   18:54:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:54:49 compute-1 openstack_network_exporter[251957]: ERROR   18:54:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:54:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:54:49 compute-1 openstack_network_exporter[251957]: ERROR   18:54:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:54:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:54:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:49.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:49 compute-1 ceph-mon[75484]: pgmap v2300: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:54:50 compute-1 sshd-session[309276]: Failed password for invalid user hacluster from 185.156.73.233 port 24826 ssh2
Sep 30 18:54:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:50.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:50 compute-1 sshd-session[309276]: Connection closed by invalid user hacluster 185.156.73.233 port 24826 [preauth]
Sep 30 18:54:50 compute-1 sshd-session[309278]: Failed password for root from 103.153.190.105 port 57480 ssh2
Sep 30 18:54:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:51 compute-1 nova_compute[238822]: 2025-09-30 18:54:51.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:51 compute-1 nova_compute[238822]: 2025-09-30 18:54:51.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:51.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:52 compute-1 ceph-mon[75484]: pgmap v2301: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:54:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:52 compute-1 sshd-session[309278]: Received disconnect from 103.153.190.105 port 57480:11: Bye Bye [preauth]
Sep 30 18:54:52 compute-1 sshd-session[309278]: Disconnected from authenticating user root 103.153.190.105 port 57480 [preauth]
Sep 30 18:54:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:54:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:52.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:54:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:54:53 compute-1 sshd[170789]: drop connection #0 from [14.103.105.56]:64610 on [38.102.83.102]:22 penalty: exceeded LoginGraceTime
Sep 30 18:54:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:53.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:54 compute-1 ceph-mon[75484]: pgmap v2302: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:54:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:54:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000053s ======
Sep 30 18:54:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:54.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Sep 30 18:54:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:54.444 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:54:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:54.445 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:54:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:54:54.445 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:54:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:55 compute-1 ceph-mon[75484]: pgmap v2303: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:54:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:55.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:54:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:56.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:54:56 compute-1 nova_compute[238822]: 2025-09-30 18:54:56.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:56 compute-1 nova_compute[238822]: 2025-09-30 18:54:56.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:54:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3136854427' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:54:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3136854427' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:54:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:57.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:54:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:54:58.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:54:58 compute-1 podman[309295]: 2025-09-30 18:54:58.553105973 +0000 UTC m=+0.087413988 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:54:58 compute-1 podman[309294]: 2025-09-30 18:54:58.591809231 +0000 UTC m=+0.130580465 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Sep 30 18:54:58 compute-1 ceph-mon[75484]: pgmap v2304: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:54:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:54:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:54:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:54:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:54:59 compute-1 ceph-mon[75484]: pgmap v2305: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:54:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:54:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:54:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:54:59.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:55:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:00.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:00 compute-1 podman[309348]: 2025-09-30 18:55:00.546014071 +0000 UTC m=+0.086108532 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:55:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:01 compute-1 nova_compute[238822]: 2025-09-30 18:55:01.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:01 compute-1 nova_compute[238822]: 2025-09-30 18:55:01.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:01 compute-1 ceph-mon[75484]: pgmap v2306: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:01.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:02 compute-1 nova_compute[238822]: 2025-09-30 18:55:02.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:55:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:02.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:02 compute-1 nova_compute[238822]: 2025-09-30 18:55:02.571 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:55:02 compute-1 nova_compute[238822]: 2025-09-30 18:55:02.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:55:02 compute-1 nova_compute[238822]: 2025-09-30 18:55:02.572 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:55:02 compute-1 nova_compute[238822]: 2025-09-30 18:55:02.572 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:55:02 compute-1 nova_compute[238822]: 2025-09-30 18:55:02.572 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:55:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:55:03 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2940139615' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:55:03 compute-1 nova_compute[238822]: 2025-09-30 18:55:03.054 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:55:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2940139615' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:55:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:03 compute-1 nova_compute[238822]: 2025-09-30 18:55:03.304 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:55:03 compute-1 nova_compute[238822]: 2025-09-30 18:55:03.306 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:55:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:03 compute-1 nova_compute[238822]: 2025-09-30 18:55:03.347 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:55:03 compute-1 nova_compute[238822]: 2025-09-30 18:55:03.348 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4604MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:55:03 compute-1 nova_compute[238822]: 2025-09-30 18:55:03.349 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:55:03 compute-1 nova_compute[238822]: 2025-09-30 18:55:03.349 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:55:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:03.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:04 compute-1 ceph-mon[75484]: pgmap v2307: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:55:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:55:04 compute-1 nova_compute[238822]: 2025-09-30 18:55:04.421 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:55:04 compute-1 nova_compute[238822]: 2025-09-30 18:55:04.421 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:55:03 up  4:32,  0 user,  load average: 0.38, 0.42, 0.46\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:55:04 compute-1 nova_compute[238822]: 2025-09-30 18:55:04.439 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:55:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:04.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:04 compute-1 sudo[309396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:55:04 compute-1 sudo[309396]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:55:04 compute-1 sudo[309396]: pam_unix(sudo:session): session closed for user root
Sep 30 18:55:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:55:04 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2993550184' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:55:04 compute-1 nova_compute[238822]: 2025-09-30 18:55:04.883 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:55:04 compute-1 nova_compute[238822]: 2025-09-30 18:55:04.894 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:55:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2993550184' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:55:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:05 compute-1 nova_compute[238822]: 2025-09-30 18:55:05.405 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:55:05 compute-1 podman[309445]: 2025-09-30 18:55:05.557154331 +0000 UTC m=+0.088270181 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7)
Sep 30 18:55:05 compute-1 podman[309444]: 2025-09-30 18:55:05.557369546 +0000 UTC m=+0.093518041 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:55:05 compute-1 podman[309446]: 2025-09-30 18:55:05.588001398 +0000 UTC m=+0.118701446 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 18:55:05 compute-1 podman[249638]: time="2025-09-30T18:55:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:55:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:55:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:55:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:55:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8374 "" "Go-http-client/1.1"
Sep 30 18:55:05 compute-1 nova_compute[238822]: 2025-09-30 18:55:05.917 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:55:05 compute-1 nova_compute[238822]: 2025-09-30 18:55:05.918 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.569s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:55:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:05.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:06 compute-1 ceph-mon[75484]: pgmap v2308: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:06.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:06 compute-1 nova_compute[238822]: 2025-09-30 18:55:06.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:06 compute-1 nova_compute[238822]: 2025-09-30 18:55:06.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3536301855' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:55:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:07 compute-1 nova_compute[238822]: 2025-09-30 18:55:07.917 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:55:07 compute-1 nova_compute[238822]: 2025-09-30 18:55:07.918 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:55:07 compute-1 nova_compute[238822]: 2025-09-30 18:55:07.918 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:55:07 compute-1 nova_compute[238822]: 2025-09-30 18:55:07.918 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:55:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:55:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:07.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:55:07 compute-1 ovn_controller[135204]: 2025-09-30T18:55:07Z|00334|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Sep 30 18:55:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:55:08 compute-1 ceph-mon[75484]: pgmap v2309: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:08.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1414025965' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:55:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:55:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:09.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:10 compute-1 nova_compute[238822]: 2025-09-30 18:55:10.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:55:10 compute-1 ceph-mon[75484]: pgmap v2310: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:55:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:10.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:11 compute-1 unix_chkpwd[309507]: password check failed for user (root)
Sep 30 18:55:11 compute-1 sshd-session[309503]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:55:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:11 compute-1 nova_compute[238822]: 2025-09-30 18:55:11.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:11 compute-1 nova_compute[238822]: 2025-09-30 18:55:11.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:11 compute-1 ceph-mon[75484]: pgmap v2311: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:11.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:12.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:13 compute-1 sshd-session[309503]: Failed password for root from 192.210.160.141 port 54358 ssh2
Sep 30 18:55:13 compute-1 ceph-mon[75484]: pgmap v2312: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:55:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:13.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:14 compute-1 nova_compute[238822]: 2025-09-30 18:55:14.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:55:14 compute-1 sshd-session[309503]: Connection closed by authenticating user root 192.210.160.141 port 54358 [preauth]
Sep 30 18:55:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:55:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:14.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:15 compute-1 ceph-mon[75484]: pgmap v2313: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:15.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:16 compute-1 nova_compute[238822]: 2025-09-30 18:55:16.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:55:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:16 compute-1 nova_compute[238822]: 2025-09-30 18:55:16.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:16.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:16 compute-1 nova_compute[238822]: 2025-09-30 18:55:16.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:17 compute-1 ceph-mon[75484]: pgmap v2314: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:17.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:18.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:55:19 compute-1 openstack_network_exporter[251957]: ERROR   18:55:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:55:19 compute-1 openstack_network_exporter[251957]: ERROR   18:55:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:55:19 compute-1 openstack_network_exporter[251957]: ERROR   18:55:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:55:19 compute-1 openstack_network_exporter[251957]: ERROR   18:55:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:55:19 compute-1 openstack_network_exporter[251957]: ERROR   18:55:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:55:19 compute-1 ceph-mon[75484]: pgmap v2315: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:55:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:19.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:20.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:21 compute-1 sudo[309517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:55:21 compute-1 sudo[309517]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:55:21 compute-1 sudo[309517]: pam_unix(sudo:session): session closed for user root
Sep 30 18:55:21 compute-1 nova_compute[238822]: 2025-09-30 18:55:21.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:55:21 compute-1 sudo[309543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:55:21 compute-1 sudo[309543]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:55:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:21 compute-1 nova_compute[238822]: 2025-09-30 18:55:21.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:21 compute-1 nova_compute[238822]: 2025-09-30 18:55:21.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:21 compute-1 sudo[309543]: pam_unix(sudo:session): session closed for user root
Sep 30 18:55:21 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:55:21 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:55:21 compute-1 ceph-mon[75484]: pgmap v2316: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 512 B/s rd, 0 op/s
Sep 30 18:55:21 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:55:21 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:55:21 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:55:21 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:55:21 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:55:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:21.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:22.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:55:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:23 compute-1 ceph-mon[75484]: pgmap v2317: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 768 B/s rd, 0 op/s
Sep 30 18:55:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:23.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:55:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:55:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:24.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:55:24 compute-1 sudo[309601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:55:24 compute-1 sudo[309601]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:55:24 compute-1 sudo[309601]: pam_unix(sudo:session): session closed for user root
Sep 30 18:55:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:25 compute-1 ceph-mon[75484]: pgmap v2318: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 512 B/s rd, 0 op/s
Sep 30 18:55:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:55:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:25.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:55:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:26 compute-1 nova_compute[238822]: 2025-09-30 18:55:26.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:26.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:26 compute-1 nova_compute[238822]: 2025-09-30 18:55:26.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:26 compute-1 nova_compute[238822]: 2025-09-30 18:55:26.565 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:55:26 compute-1 sudo[309628]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:55:26 compute-1 sudo[309628]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:55:26 compute-1 sudo[309628]: pam_unix(sudo:session): session closed for user root
Sep 30 18:55:26 compute-1 sshd-session[309651]: Accepted publickey for zuul from 192.168.122.10 port 54348 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 18:55:26 compute-1 systemd-logind[789]: New session 61 of user zuul.
Sep 30 18:55:26 compute-1 systemd[1]: Started Session 61 of User zuul.
Sep 30 18:55:26 compute-1 sshd-session[309651]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 18:55:27 compute-1 sudo[309658]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Sep 30 18:55:27 compute-1 sudo[309658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:55:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:27 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:55:27 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:55:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:27.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:28.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:28 compute-1 ceph-mon[75484]: pgmap v2319: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 512 B/s rd, 0 op/s
Sep 30 18:55:29 compute-1 nova_compute[238822]: 2025-09-30 18:55:29.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:55:29 compute-1 nova_compute[238822]: 2025-09-30 18:55:29.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 18:55:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:55:29 compute-1 podman[309813]: 2025-09-30 18:55:29.581712419 +0000 UTC m=+0.104739012 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 18:55:29 compute-1 podman[309812]: 2025-09-30 18:55:29.630588721 +0000 UTC m=+0.154204320 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 18:55:29 compute-1 ceph-mon[75484]: from='client.18956 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:29 compute-1 ceph-mon[75484]: pgmap v2320: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 768 B/s rd, 0 op/s
Sep 30 18:55:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:29.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:30.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:30 compute-1 ceph-mon[75484]: from='client.27345 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:30 compute-1 ceph-mon[75484]: from='client.18960 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:30 compute-1 ceph-mon[75484]: from='client.27349 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:30 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/21381944' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Sep 30 18:55:31 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "status"} v 0)
Sep 30 18:55:31 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2986207997' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Sep 30 18:55:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:31 compute-1 nova_compute[238822]: 2025-09-30 18:55:31.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:31 compute-1 nova_compute[238822]: 2025-09-30 18:55:31.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:31 compute-1 podman[309963]: 2025-09-30 18:55:31.53704503 +0000 UTC m=+0.076328430 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Sep 30 18:55:31 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2986207997' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Sep 30 18:55:31 compute-1 ceph-mon[75484]: pgmap v2321: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 512 B/s rd, 0 op/s
Sep 30 18:55:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:31.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:32.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:33 compute-1 ceph-mon[75484]: pgmap v2322: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:55:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:33.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:55:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:34.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:34 compute-1 ovs-vsctl[310063]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Sep 30 18:55:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:35 compute-1 podman[249638]: time="2025-09-30T18:55:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:55:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:55:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:55:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:55:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8371 "" "Go-http-client/1.1"
Sep 30 18:55:35 compute-1 virtqemud[239124]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Sep 30 18:55:35 compute-1 ceph-mon[75484]: pgmap v2323: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:35 compute-1 virtqemud[239124]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Sep 30 18:55:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:35.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:36 compute-1 virtqemud[239124]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 18:55:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:36 compute-1 nova_compute[238822]: 2025-09-30 18:55:36.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:36.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:36 compute-1 nova_compute[238822]: 2025-09-30 18:55:36.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:36 compute-1 podman[310256]: 2025-09-30 18:55:36.592425286 +0000 UTC m=+0.127831312 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 18:55:36 compute-1 podman[310262]: 2025-09-30 18:55:36.60187101 +0000 UTC m=+0.127898644 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Sep 30 18:55:36 compute-1 podman[310263]: 2025-09-30 18:55:36.613352988 +0000 UTC m=+0.129657291 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=multipathd, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:55:36 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: cache status {prefix=cache status} (starting...)
Sep 30 18:55:36 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 18:55:36 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: client ls {prefix=client ls} (starting...)
Sep 30 18:55:36 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 18:55:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1109390619' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:55:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1109390619' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:55:37 compute-1 lvm[310462]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Sep 30 18:55:37 compute-1 lvm[310462]: VG ceph_vg0 finished
Sep 30 18:55:37 compute-1 kernel: block dm-0: the capability attribute has been deprecated.
Sep 30 18:55:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:37 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: damage ls {prefix=damage ls} (starting...)
Sep 30 18:55:37 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 18:55:37 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: dump loads {prefix=dump loads} (starting...)
Sep 30 18:55:37 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 18:55:37 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Sep 30 18:55:37 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 18:55:37 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "report"} v 0)
Sep 30 18:55:37 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3782359562' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Sep 30 18:55:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:55:37 compute-1 ceph-mon[75484]: from='client.27359 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:37 compute-1 ceph-mon[75484]: pgmap v2324: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3782359562' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Sep 30 18:55:37 compute-1 ceph-mon[75484]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Sep 30 18:55:37 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Sep 30 18:55:37 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 18:55:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:37.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:38 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Sep 30 18:55:38 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 18:55:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:38 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Sep 30 18:55:38 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 18:55:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Sep 30 18:55:38 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3577071448' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:55:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:55:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:38.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:55:38 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Sep 30 18:55:38 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 18:55:38 compute-1 unix_chkpwd[310712]: password check failed for user (root)
Sep 30 18:55:38 compute-1 sshd-session[310265]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:55:38 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "config log"} v 0)
Sep 30 18:55:38 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4270488756' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Sep 30 18:55:38 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: get subtrees {prefix=get subtrees} (starting...)
Sep 30 18:55:38 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 18:55:38 compute-1 ceph-mon[75484]: from='client.27367 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:38 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3577071448' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:55:38 compute-1 ceph-mon[75484]: from='client.27375 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:38 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4270488756' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Sep 30 18:55:39 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: ops {prefix=ops} (starting...)
Sep 30 18:55:39 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 18:55:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "config-key dump"} v 0)
Sep 30 18:55:39 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1463308040' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Sep 30 18:55:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:55:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Sep 30 18:55:39 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/664134983' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Sep 30 18:55:39 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: session ls {prefix=session ls} (starting...)
Sep 30 18:55:39 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 18:55:39 compute-1 ceph-mon[75484]: from='client.27383 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:39 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1463308040' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Sep 30 18:55:39 compute-1 ceph-mon[75484]: from='client.27395 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:39 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/664134983' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Sep 30 18:55:39 compute-1 ceph-mon[75484]: pgmap v2325: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:55:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:40.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:40 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: status {prefix=status} (starting...)
Sep 30 18:55:40 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr dump"} v 0)
Sep 30 18:55:40 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3321485735' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Sep 30 18:55:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:40 compute-1 sshd-session[310265]: Failed password for root from 192.210.160.141 port 39564 ssh2
Sep 30 18:55:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:40.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:40 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "features"} v 0)
Sep 30 18:55:40 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/196665390' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Sep 30 18:55:40 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Sep 30 18:55:40 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/844676723' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Sep 30 18:55:40 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Sep 30 18:55:40 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1698590819' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Sep 30 18:55:41 compute-1 ceph-mon[75484]: from='client.27399 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3321485735' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Sep 30 18:55:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/196665390' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Sep 30 18:55:41 compute-1 ceph-mon[75484]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Sep 30 18:55:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/844676723' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Sep 30 18:55:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1698590819' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Sep 30 18:55:41 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Sep 30 18:55:41 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1737613640' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Sep 30 18:55:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:41 compute-1 nova_compute[238822]: 2025-09-30 18:55:41.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:41 compute-1 nova_compute[238822]: 2025-09-30 18:55:41.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:41 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr services"} v 0)
Sep 30 18:55:41 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3597903295' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Sep 30 18:55:41 compute-1 sshd-session[310265]: Connection closed by authenticating user root 192.210.160.141 port 39564 [preauth]
Sep 30 18:55:41 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Sep 30 18:55:41 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2751400103' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Sep 30 18:55:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:42.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:42 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr stat"} v 0)
Sep 30 18:55:42 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3416236125' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Sep 30 18:55:42 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1737613640' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Sep 30 18:55:42 compute-1 ceph-mon[75484]: from='client.27423 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:42 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3597903295' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Sep 30 18:55:42 compute-1 ceph-mon[75484]: pgmap v2326: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:42 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2751400103' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Sep 30 18:55:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:42.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:42 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Sep 30 18:55:42 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/601627967' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Sep 30 18:55:42 compute-1 nova_compute[238822]: 2025-09-30 18:55:42.570 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:55:42 compute-1 nova_compute[238822]: 2025-09-30 18:55:42.570 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 18:55:42 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr dump"} v 0)
Sep 30 18:55:42 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1231668273' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Sep 30 18:55:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3416236125' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Sep 30 18:55:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4247926951' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Sep 30 18:55:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/601627967' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Sep 30 18:55:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1231668273' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Sep 30 18:55:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Sep 30 18:55:43 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2571273096' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Sep 30 18:55:43 compute-1 nova_compute[238822]: 2025-09-30 18:55:43.610 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 18:55:43 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Sep 30 18:55:43 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2007948442' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Sep 30 18:55:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:44.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:15.632293+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 150478848 unmapped: 75489280 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:16.632454+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 151232512 unmapped: 74735616 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:17.632737+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 151232512 unmapped: 74735616 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6ed1000/0x0/0x4ffc00000, data 0x2b8a4e3/0x2c5b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:18.632909+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 151232512 unmapped: 74735616 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:19.633151+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 151232512 unmapped: 74735616 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6ed1000/0x0/0x4ffc00000, data 0x2b8a4e3/0x2c5b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1795549 data_alloc: 218103808 data_used: 11223040
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6ed1000/0x0/0x4ffc00000, data 0x2b8a4e3/0x2c5b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:20.633340+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 151232512 unmapped: 74735616 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:21.633544+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 151232512 unmapped: 74735616 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:22.633740+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 151240704 unmapped: 74727424 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:23.633903+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 151240704 unmapped: 74727424 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6ed1000/0x0/0x4ffc00000, data 0x2b8a4e3/0x2c5b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6ed1000/0x0/0x4ffc00000, data 0x2b8a4e3/0x2c5b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:24.634125+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 151248896 unmapped: 74719232 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1795549 data_alloc: 218103808 data_used: 11223040
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:25.634297+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 151248896 unmapped: 74719232 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.211411476s of 11.215602875s, submitted: 1
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6ed1000/0x0/0x4ffc00000, data 0x2b8a4e3/0x2c5b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [0,0,0,1,1])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:26.634464+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 158072832 unmapped: 67895296 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:27.634753+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 158187520 unmapped: 67780608 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f650d000/0x0/0x4ffc00000, data 0x354d4e3/0x361e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3bd45000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3bd45000 session 0x556f361fb860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:28.634938+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 158908416 unmapped: 67059712 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:29.635118+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 158908416 unmapped: 67059712 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1996313 data_alloc: 234881024 data_used: 12161024
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:30.635301+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 158908416 unmapped: 67059712 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:31.635433+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 158908416 unmapped: 67059712 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35103800 session 0x556f35c34960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:32.635602+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f357fd400 session 0x556f350710e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 158908416 unmapped: 67059712 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f55fd000/0x0/0x4ffc00000, data 0x445d4e3/0x452e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:33.635835+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159047680 unmapped: 66920448 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33efa800 session 0x556f3588ba40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489c00 session 0x556f32f4fc20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:34.636008+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159047680 unmapped: 66920448 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1992753 data_alloc: 234881024 data_used: 12161024
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:35.636185+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159047680 unmapped: 66920448 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:36.636349+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159047680 unmapped: 66920448 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34000 session 0x556f35b57e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:37.636672+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159072256 unmapped: 66895872 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f32604780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:38.636824+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a488c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.467722893s of 12.817793846s, submitted: 127
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a488c00 session 0x556f326041e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159399936 unmapped: 66568192 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f55dd000/0x0/0x4ffc00000, data 0x447e4e3/0x454f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3bd45000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:39.637089+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159408128 unmapped: 66560000 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1998916 data_alloc: 234881024 data_used: 12165120
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:40.637295+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159391744 unmapped: 66576384 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:41.637440+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 165994496 unmapped: 59973632 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:42.637591+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166354944 unmapped: 59613184 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f55b5000/0x0/0x4ffc00000, data 0x44a54f3/0x4577000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:43.637803+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166354944 unmapped: 59613184 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:44.638058+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166354944 unmapped: 59613184 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2093612 data_alloc: 234881024 data_used: 26202112
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:45.638235+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166354944 unmapped: 59613184 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f55b2000/0x0/0x4ffc00000, data 0x44a84f3/0x457a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:46.638405+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166354944 unmapped: 59613184 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:47.638651+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166354944 unmapped: 59613184 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:48.638821+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166354944 unmapped: 59613184 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f55b2000/0x0/0x4ffc00000, data 0x44a84f3/0x457a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:49.639007+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166354944 unmapped: 59613184 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2094292 data_alloc: 234881024 data_used: 26202112
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:50.639144+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166354944 unmapped: 59613184 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.535052299s of 12.577404976s, submitted: 10
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:51.639286+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170172416 unmapped: 55795712 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:52.639430+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5013000/0x0/0x4ffc00000, data 0x4a394f3/0x4b0b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170459136 unmapped: 55508992 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:53.639545+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170704896 unmapped: 55263232 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:54.639701+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170704896 unmapped: 55263232 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2161538 data_alloc: 234881024 data_used: 27078656
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:55.639827+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170713088 unmapped: 55255040 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:56.639971+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170713088 unmapped: 55255040 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4ff1000/0x0/0x4ffc00000, data 0x4a524f3/0x4b24000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:57.640167+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170729472 unmapped: 55238656 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:58.640390+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:24:59.640587+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2153762 data_alloc: 234881024 data_used: 27078656
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5006000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:00.640949+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5006000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:01.641138+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:02.641360+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:03.641561+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5006000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:04.641773+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2153762 data_alloc: 234881024 data_used: 27078656
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:05.641942+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:06.642129+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:07.642327+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:08.642483+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:09.642694+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5006000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2153762 data_alloc: 234881024 data_used: 27078656
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:10.642901+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:11.643086+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5006000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:12.643303+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169205760 unmapped: 56762368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:13.643690+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 22.112754822s of 22.364448547s, submitted: 78
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5006000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169213952 unmapped: 56754176 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:14.643856+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169213952 unmapped: 56754176 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2153762 data_alloc: 234881024 data_used: 27078656
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5006000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:15.644090+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169213952 unmapped: 56754176 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:16.644342+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169222144 unmapped: 56745984 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:17.644583+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169222144 unmapped: 56745984 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5006000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:18.644743+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169238528 unmapped: 56729600 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:19.644827+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169238528 unmapped: 56729600 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2153762 data_alloc: 234881024 data_used: 27078656
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:20.644963+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169238528 unmapped: 56729600 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:21.645113+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169246720 unmapped: 56721408 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:22.645304+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169246720 unmapped: 56721408 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5006000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:23.645468+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169246720 unmapped: 56721408 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:24.645612+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169246720 unmapped: 56721408 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2153762 data_alloc: 234881024 data_used: 27078656
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:25.645852+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169246720 unmapped: 56721408 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:26.646014+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169263104 unmapped: 56705024 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5006000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:27.646177+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169263104 unmapped: 56705024 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:28.646341+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5006000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.235325813s of 15.240333557s, submitted: 1
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169263104 unmapped: 56705024 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:29.646520+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489c00 session 0x556f32f483c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3bd45000 session 0x556f35074000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169271296 unmapped: 56696832 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2153058 data_alloc: 234881024 data_used: 27086848
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5007000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:30.646696+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169271296 unmapped: 56696832 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:31.646865+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5007000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169271296 unmapped: 56696832 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:32.647003+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169279488 unmapped: 56688640 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:33.647191+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169287680 unmapped: 56680448 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5007000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:34.647405+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169287680 unmapped: 56680448 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2153058 data_alloc: 234881024 data_used: 27086848
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:35.647599+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169295872 unmapped: 56672256 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:36.647819+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169295872 unmapped: 56672256 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5007000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:37.648028+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169295872 unmapped: 56672256 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:38.648187+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169295872 unmapped: 56672256 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.199773788s of 10.208640099s, submitted: 2
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:39.648395+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170360832 unmapped: 55607296 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2155562 data_alloc: 234881024 data_used: 27074560
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:40.648560+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170360832 unmapped: 55607296 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:41.648759+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170369024 unmapped: 55599104 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5007000/0x0/0x4ffc00000, data 0x4a534f3/0x4b25000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34400 session 0x556f350741e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f343905a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:42.648925+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170385408 unmapped: 55582720 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:43.649094+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f3595bc20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 161087488 unmapped: 64880640 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:44.649259+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6468000/0x0/0x4ffc00000, data 0x35f34e3/0x36c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 161087488 unmapped: 64880640 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1895377 data_alloc: 234881024 data_used: 12161024
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:45.649438+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 161087488 unmapped: 64880640 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:46.649664+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6468000/0x0/0x4ffc00000, data 0x35f34e3/0x36c4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 161087488 unmapped: 64880640 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:47.649872+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 161087488 unmapped: 64880640 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:48.650055+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 161087488 unmapped: 64880640 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.033616066s of 10.298776627s, submitted: 43
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34342400 session 0x556f350743c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f35ef0b40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:49.650274+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 161087488 unmapped: 64880640 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a488c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1687659 data_alloc: 218103808 data_used: 2777088
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:50.650442+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a488c00 session 0x556f35c37860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:51.650612+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:52.650847+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7892000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:53.651060+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:54.651285+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1679966 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:55.651475+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:56.651699+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7892000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:57.651900+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:58.652057+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:25:59.652236+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1679966 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:00.652432+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:01.652603+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:02.652821+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7892000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:03.653051+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7892000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:04.653230+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7892000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1679966 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:05.653442+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7892000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:06.653701+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:07.653893+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:08.654055+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:09.654223+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1679966 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:10.654383+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7892000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:11.654580+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7892000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:12.654729+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:13.654938+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:14.655053+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1679966 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:15.655258+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:16.655462+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:17.655707+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7892000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:18.655869+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f34390780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34342400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34342400 session 0x556f35b56b40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f3580c000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f35071680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 155607040 unmapped: 70361088 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 29.767107010s of 29.863809586s, submitted: 31
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489c00 session 0x556f35c36f00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f35989c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34342400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34342400 session 0x556f361fb860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:19.656012+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f35609a40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f33c18960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f732b000/0x0/0x4ffc00000, data 0x272f50c/0x2801000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 154296320 unmapped: 71671808 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1772727 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:20.656148+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 154296320 unmapped: 71671808 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:21.659268+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 154296320 unmapped: 71671808 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:22.661940+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c35000 session 0x556f3595a3c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 154296320 unmapped: 71671808 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6ce7000/0x0/0x4ffc00000, data 0x2d73545/0x2e45000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:23.662281+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 154296320 unmapped: 71671808 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:24.663558+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 154296320 unmapped: 71671808 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f3594b860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1772727 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:25.664563+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 154296320 unmapped: 71671808 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34342400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34342400 session 0x556f35969c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:26.665316+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f35968f00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 154370048 unmapped: 71598080 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:27.666020+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6cbe000/0x0/0x4ffc00000, data 0x2d9a578/0x2e6e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 154370048 unmapped: 71598080 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:28.666258+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6cbe000/0x0/0x4ffc00000, data 0x2d9a578/0x2e6e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156205056 unmapped: 69763072 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:29.666899+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156205056 unmapped: 69763072 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:30.667467+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1852135 data_alloc: 234881024 data_used: 13553664
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6cbe000/0x0/0x4ffc00000, data 0x2d9a578/0x2e6e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156205056 unmapped: 69763072 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:31.667977+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156205056 unmapped: 69763072 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:32.668424+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156205056 unmapped: 69763072 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:33.668813+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156205056 unmapped: 69763072 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:34.669245+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156205056 unmapped: 69763072 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:35.669541+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1852135 data_alloc: 234881024 data_used: 13553664
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6cbe000/0x0/0x4ffc00000, data 0x2d9a578/0x2e6e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156205056 unmapped: 69763072 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:36.669809+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156205056 unmapped: 69763072 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:37.670167+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.004871368s of 19.175771713s, submitted: 52
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156213248 unmapped: 69754880 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:38.670404+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6cbe000/0x0/0x4ffc00000, data 0x2d9a578/0x2e6e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 162734080 unmapped: 63234048 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:39.670767+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f610a000/0x0/0x4ffc00000, data 0x394e578/0x3a22000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 162398208 unmapped: 63569920 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:40.671079+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1960615 data_alloc: 234881024 data_used: 13783040
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c35c00 session 0x556f3595a960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c35800 session 0x556f3662a3c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f35609680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34342400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34342400 session 0x556f35608f00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f3662b680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 162766848 unmapped: 63201280 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c35c00 session 0x556f35c37a40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34126400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34126400 session 0x556f35cba1e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:41.671211+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f35cd0b40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34342400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34342400 session 0x556f34391e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 162783232 unmapped: 63184896 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5726000/0x0/0x4ffc00000, data 0x4330588/0x4405000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:42.671564+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 162783232 unmapped: 63184896 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:43.671911+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 162783232 unmapped: 63184896 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:44.672173+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 162783232 unmapped: 63184896 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:45.672526+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2036054 data_alloc: 234881024 data_used: 13783040
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 162783232 unmapped: 63184896 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:46.672849+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5703000/0x0/0x4ffc00000, data 0x4354588/0x4429000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 162783232 unmapped: 63184896 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:47.673053+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f36632960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 162783232 unmapped: 63184896 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:48.673323+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 162783232 unmapped: 63184896 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:49.673553+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5703000/0x0/0x4ffc00000, data 0x4354588/0x4429000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c35c00 session 0x556f35988780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 162783232 unmapped: 63184896 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:50.673920+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2033638 data_alloc: 234881024 data_used: 13787136
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 162783232 unmapped: 63184896 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34127000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34127000 session 0x556f35969a40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:51.674181+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.811079979s of 13.252976418s, submitted: 134
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f3579d860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 163110912 unmapped: 62857216 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:52.674349+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34342400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 163119104 unmapped: 62849024 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:53.674805+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 163127296 unmapped: 62840832 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:54.674962+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f56db000/0x0/0x4ffc00000, data 0x437b598/0x4451000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166518784 unmapped: 59449344 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:55.675353+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2103328 data_alloc: 234881024 data_used: 22757376
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166518784 unmapped: 59449344 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:56.675950+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166518784 unmapped: 59449344 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:57.676367+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166518784 unmapped: 59449344 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:58.676755+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166518784 unmapped: 59449344 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f56d8000/0x0/0x4ffc00000, data 0x437e598/0x4454000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:26:59.677184+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166518784 unmapped: 59449344 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:00.677579+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f56d8000/0x0/0x4ffc00000, data 0x437e598/0x4454000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2104008 data_alloc: 234881024 data_used: 22757376
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166518784 unmapped: 59449344 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:01.678198+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f56d8000/0x0/0x4ffc00000, data 0x437e598/0x4454000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166518784 unmapped: 59449344 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:02.678356+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166518784 unmapped: 59449344 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:03.678538+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.587929726s of 12.613903046s, submitted: 5
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170336256 unmapped: 55631872 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:04.678708+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 172679168 unmapped: 53288960 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:05.678863+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2199082 data_alloc: 234881024 data_used: 23810048
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 172720128 unmapped: 53248000 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:06.679059+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 172720128 unmapped: 53248000 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:07.679323+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c05000/0x0/0x4ffc00000, data 0x4e49598/0x4f1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 172941312 unmapped: 53026816 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:08.679468+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 172941312 unmapped: 53026816 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:09.679635+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:10.679758+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 172949504 unmapped: 53018624 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2191538 data_alloc: 234881024 data_used: 23887872
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c05000/0x0/0x4ffc00000, data 0x4e49598/0x4f1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:11.679971+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:12.680161+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:13.680502+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:14.680802+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c0d000/0x0/0x4ffc00000, data 0x4e49598/0x4f1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:15.681187+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2191706 data_alloc: 234881024 data_used: 23887872
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:16.681407+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c0d000/0x0/0x4ffc00000, data 0x4e49598/0x4f1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:17.681708+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:18.681885+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c0d000/0x0/0x4ffc00000, data 0x4e49598/0x4f1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:19.682088+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c0d000/0x0/0x4ffc00000, data 0x4e49598/0x4f1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:20.682253+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2191706 data_alloc: 234881024 data_used: 23887872
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:21.682448+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:22.682691+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c0d000/0x0/0x4ffc00000, data 0x4e49598/0x4f1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:23.683001+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c0d000/0x0/0x4ffc00000, data 0x4e49598/0x4f1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:24.683226+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:25.684131+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2191706 data_alloc: 234881024 data_used: 23887872
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:26.684281+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34126000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 22.194272995s of 22.473934174s, submitted: 91
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:27.684767+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171245568 unmapped: 54722560 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f33c401e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c35400 session 0x556f339cc3c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:28.685016+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171253760 unmapped: 54714368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:29.685231+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171253760 unmapped: 54714368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c0d000/0x0/0x4ffc00000, data 0x4e49598/0x4f1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:30.685513+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171253760 unmapped: 54714368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2191266 data_alloc: 234881024 data_used: 23896064
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:31.685670+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171253760 unmapped: 54714368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:32.686003+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171253760 unmapped: 54714368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:33.686241+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171253760 unmapped: 54714368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c0d000/0x0/0x4ffc00000, data 0x4e49598/0x4f1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:34.686448+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171253760 unmapped: 54714368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c0d000/0x0/0x4ffc00000, data 0x4e49598/0x4f1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:35.686657+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171253760 unmapped: 54714368 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2191266 data_alloc: 234881024 data_used: 23896064
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c0d000/0x0/0x4ffc00000, data 0x4e49598/0x4f1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:36.686820+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171278336 unmapped: 54689792 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:37.687032+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171278336 unmapped: 54689792 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:38.687190+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171278336 unmapped: 54689792 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:39.687334+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171278336 unmapped: 54689792 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:40.687526+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c0b000/0x0/0x4ffc00000, data 0x4e4a598/0x4f20000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171294720 unmapped: 54673408 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2191906 data_alloc: 234881024 data_used: 23896064
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:41.687815+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171294720 unmapped: 54673408 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c0b000/0x0/0x4ffc00000, data 0x4e4a598/0x4f20000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:42.688035+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171294720 unmapped: 54673408 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35b805a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c0b000/0x0/0x4ffc00000, data 0x4e4a598/0x4f20000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:43.688225+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171294720 unmapped: 54673408 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.032264709s of 17.111976624s, submitted: 6
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34342400 session 0x556f34c8e780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f35b49860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:44.688487+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171302912 unmapped: 54665216 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f32b1fe00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:45.688672+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 167976960 unmapped: 57991168 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1972475 data_alloc: 234881024 data_used: 13045760
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:46.688874+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 167976960 unmapped: 57991168 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6040000/0x0/0x4ffc00000, data 0x3a18578/0x3aec000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:47.689160+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 167976960 unmapped: 57991168 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:48.689324+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 167976960 unmapped: 57991168 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:49.689564+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 167976960 unmapped: 57991168 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:50.689875+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 167976960 unmapped: 57991168 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1972475 data_alloc: 234881024 data_used: 13045760
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f6040000/0x0/0x4ffc00000, data 0x3a18578/0x3aec000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x60cf9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:51.690103+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 167976960 unmapped: 57991168 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c35c00 session 0x556f35b81c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34126000 session 0x556f366332c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:52.690320+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 3000.1 total, 600.0 interval
                                           Cumulative writes: 16K writes, 62K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.02 MB/s
                                           Cumulative WAL: 16K writes, 4761 syncs, 3.37 writes per sync, written: 0.05 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 3022 writes, 11K keys, 3022 commit groups, 1.0 writes per commit group, ingest: 12.61 MB, 0.02 MB/s
                                           Interval WAL: 3022 writes, 1192 syncs, 2.54 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 167985152 unmapped: 57982976 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:53.690467+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34342400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157564928 unmapped: 68403200 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f3594a960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:54.690776+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157589504 unmapped: 68378624 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:55.690985+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157589504 unmapped: 68378624 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1713900 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:56.691213+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.299837112s of 12.922321320s, submitted: 85
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157605888 unmapped: 68362240 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7459000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:57.691448+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157605888 unmapped: 68362240 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:58.691669+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157605888 unmapped: 68362240 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:27:59.691805+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157605888 unmapped: 68362240 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7482000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:00.692009+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157605888 unmapped: 68362240 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1714844 data_alloc: 218103808 data_used: 2654208
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:01.692242+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157605888 unmapped: 68362240 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:02.692445+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157605888 unmapped: 68362240 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:03.692687+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157605888 unmapped: 68362240 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:04.693033+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157605888 unmapped: 68362240 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:05.693238+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7482000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157605888 unmapped: 68362240 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1714844 data_alloc: 218103808 data_used: 2654208
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:06.693504+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157605888 unmapped: 68362240 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7482000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:07.693818+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157605888 unmapped: 68362240 heap: 225968128 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.741424561s of 11.750540733s, submitted: 2
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:08.694004+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f35cba000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f358885a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34126000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34126000 session 0x556f35cbad20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156663808 unmapped: 72982528 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f32fc41e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c35c00 session 0x556f32f53680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7482000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:09.694327+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156663808 unmapped: 72982528 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:10.694608+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156663808 unmapped: 72982528 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f66c8000/0x0/0x4ffc00000, data 0x2f844d3/0x3054000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f339cdc20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1823243 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:11.694859+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156663808 unmapped: 72982528 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:12.695055+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156663808 unmapped: 72982528 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f66c8000/0x0/0x4ffc00000, data 0x2f844d3/0x3054000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f36632780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:13.695272+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156663808 unmapped: 72982528 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34126000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34126000 session 0x556f35075680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:14.695493+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f350752c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156672000 unmapped: 72974336 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:15.695726+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 156680192 unmapped: 72966144 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1825669 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:16.695875+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 157138944 unmapped: 72507392 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:17.696086+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f66c7000/0x0/0x4ffc00000, data 0x2f844f6/0x3055000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159039488 unmapped: 70606848 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:18.696388+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159039488 unmapped: 70606848 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f66c7000/0x0/0x4ffc00000, data 0x2f844f6/0x3055000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f66c7000/0x0/0x4ffc00000, data 0x2f844f6/0x3055000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:19.696606+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159039488 unmapped: 70606848 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:20.696896+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159039488 unmapped: 70606848 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1922493 data_alloc: 234881024 data_used: 16953344
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:21.697127+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159039488 unmapped: 70606848 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:22.697329+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159039488 unmapped: 70606848 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:23.697480+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159039488 unmapped: 70606848 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f66c7000/0x0/0x4ffc00000, data 0x2f844f6/0x3055000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:24.697672+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159039488 unmapped: 70606848 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:25.697818+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 159039488 unmapped: 70606848 heap: 229646336 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1922949 data_alloc: 234881024 data_used: 16965632
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.422401428s of 17.641061783s, submitted: 42
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f37874c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:26.698053+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f37874c00 session 0x556f32f463c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 168681472 unmapped: 64634880 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35075c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f33c9f680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34126000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34126000 session 0x556f359894a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f32f4ed20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:27.698334+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 168714240 unmapped: 64602112 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:28.698531+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170115072 unmapped: 63201280 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:29.698743+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170115072 unmapped: 63201280 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f48bc000/0x0/0x4ffc00000, data 0x4d86558/0x4e58000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:30.698953+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170115072 unmapped: 63201280 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2147551 data_alloc: 234881024 data_used: 17416192
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:31.699162+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170115072 unmapped: 63201280 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c400 session 0x556f32f545a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:32.699307+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170115072 unmapped: 63201280 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f48bc000/0x0/0x4ffc00000, data 0x4d86558/0x4e58000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:33.699489+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170115072 unmapped: 63201280 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:34.699801+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170115072 unmapped: 63201280 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f359694a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:35.700005+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170115072 unmapped: 63201280 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2147551 data_alloc: 234881024 data_used: 17416192
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:36.700276+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f366321e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34126000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.847031593s of 10.647036552s, submitted: 171
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34126000 session 0x556f35ef05a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170459136 unmapped: 62857216 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:37.700522+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170483712 unmapped: 62832640 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:38.700690+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f489d000/0x0/0x4ffc00000, data 0x4dad558/0x4e7f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 171311104 unmapped: 62005248 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:39.700821+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180428800 unmapped: 52887552 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:40.700966+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180428800 unmapped: 52887552 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2242040 data_alloc: 251658240 data_used: 31428608
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:41.701103+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180428800 unmapped: 52887552 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:42.701249+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180428800 unmapped: 52887552 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f489d000/0x0/0x4ffc00000, data 0x4dad558/0x4e7f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:43.701392+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180428800 unmapped: 52887552 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:44.701577+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180428800 unmapped: 52887552 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:45.701662+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180428800 unmapped: 52887552 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2242040 data_alloc: 251658240 data_used: 31428608
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:46.701805+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180428800 unmapped: 52887552 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:47.701981+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180428800 unmapped: 52887552 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:48.702204+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.961108208s of 12.002876282s, submitted: 8
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180527104 unmapped: 52789248 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b48000/0x0/0x4ffc00000, data 0x5b02558/0x5bd4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:49.702486+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 181821440 unmapped: 51494912 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:50.702684+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183230464 unmapped: 50085888 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2359798 data_alloc: 251658240 data_used: 31944704
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:51.702865+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183230464 unmapped: 50085888 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:52.703009+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b1c000/0x0/0x4ffc00000, data 0x5b2e558/0x5c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183230464 unmapped: 50085888 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:53.703243+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183263232 unmapped: 50053120 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:54.703518+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b1c000/0x0/0x4ffc00000, data 0x5b2e558/0x5c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183263232 unmapped: 50053120 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:55.703678+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b1c000/0x0/0x4ffc00000, data 0x5b2e558/0x5c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183328768 unmapped: 49987584 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2354486 data_alloc: 251658240 data_used: 31948800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:56.703932+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183328768 unmapped: 49987584 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:57.704184+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183328768 unmapped: 49987584 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:58.704432+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183328768 unmapped: 49987584 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:28:59.704858+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183328768 unmapped: 49987584 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b19000/0x0/0x4ffc00000, data 0x5b31558/0x5c03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:00.705147+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b19000/0x0/0x4ffc00000, data 0x5b31558/0x5c03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183328768 unmapped: 49987584 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2354334 data_alloc: 251658240 data_used: 31944704
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:01.705405+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183328768 unmapped: 49987584 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:02.705606+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183328768 unmapped: 49987584 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:03.705859+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b19000/0x0/0x4ffc00000, data 0x5b31558/0x5c03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183328768 unmapped: 49987584 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:04.706128+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183328768 unmapped: 49987584 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:05.706314+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183328768 unmapped: 49987584 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2354334 data_alloc: 251658240 data_used: 31944704
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:06.706516+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3ec00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.582679749s of 17.874309540s, submitted: 112
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd3400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183336960 unmapped: 49979392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:07.706766+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183336960 unmapped: 49979392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f32f49860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c35400 session 0x556f3662b4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b19000/0x0/0x4ffc00000, data 0x5b31558/0x5c03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:08.707038+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183345152 unmapped: 49971200 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:09.707188+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183345152 unmapped: 49971200 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:10.707312+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183361536 unmapped: 49954816 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2353906 data_alloc: 251658240 data_used: 31944704
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:11.707476+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b19000/0x0/0x4ffc00000, data 0x5b31558/0x5c03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183361536 unmapped: 49954816 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:12.707672+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183369728 unmapped: 49946624 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:13.707827+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183369728 unmapped: 49946624 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:14.708007+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183386112 unmapped: 49930240 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:15.708187+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183386112 unmapped: 49930240 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2356502 data_alloc: 251658240 data_used: 31932416
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:16.708357+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183394304 unmapped: 49922048 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b19000/0x0/0x4ffc00000, data 0x5b31558/0x5c03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:17.708564+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b19000/0x0/0x4ffc00000, data 0x5b31558/0x5c03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183427072 unmapped: 49889280 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:18.708726+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183427072 unmapped: 49889280 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b19000/0x0/0x4ffc00000, data 0x5b31558/0x5c03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:19.709485+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b19000/0x0/0x4ffc00000, data 0x5b31558/0x5c03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183468032 unmapped: 49848320 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:20.709688+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183468032 unmapped: 49848320 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b19000/0x0/0x4ffc00000, data 0x5b31558/0x5c03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2356502 data_alloc: 251658240 data_used: 31932416
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.647117615s of 14.680272102s, submitted: 20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:21.709904+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3b19000/0x0/0x4ffc00000, data 0x5b31558/0x5c03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f35e394a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c800 session 0x556f32f483c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183508992 unmapped: 49807360 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:22.710121+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175226880 unmapped: 58089472 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f32f550e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:23.710288+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175226880 unmapped: 58089472 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:24.710518+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175226880 unmapped: 58089472 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:25.710761+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175226880 unmapped: 58089472 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2061990 data_alloc: 234881024 data_used: 17399808
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:26.710922+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f53f2000/0x0/0x4ffc00000, data 0x402a4f6/0x40fb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175226880 unmapped: 58089472 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f53f2000/0x0/0x4ffc00000, data 0x402a4f6/0x40fb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:27.711181+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175226880 unmapped: 58089472 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:28.711442+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175226880 unmapped: 58089472 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3ec00 session 0x556f35609a40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd3400 session 0x556f35988f00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:29.711636+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175226880 unmapped: 58089472 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:30.711772+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35074b40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5621000/0x0/0x4ffc00000, data 0x402a4f6/0x40fb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 164691968 unmapped: 68624384 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1747698 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:31.712052+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 164691968 unmapped: 68624384 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:32.712272+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 164691968 unmapped: 68624384 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:33.712474+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 164691968 unmapped: 68624384 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:34.712729+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 164691968 unmapped: 68624384 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:35.712918+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 164691968 unmapped: 68624384 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1747698 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:36.713110+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 164691968 unmapped: 68624384 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:37.713309+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 164691968 unmapped: 68624384 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7481000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:38.713477+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 164691968 unmapped: 68624384 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:39.713674+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 164691968 unmapped: 68624384 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:40.713825+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.216674805s of 19.657211304s, submitted: 113
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 165765120 unmapped: 67551232 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1747406 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:41.713984+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 165765120 unmapped: 67551232 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:42.714134+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f7482000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 67502080 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:43.714303+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 165912576 unmapped: 67403776 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:44.714531+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 165912576 unmapped: 67403776 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:45.714702+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 165912576 unmapped: 67403776 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1747406 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:46.714882+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f3518de00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3ec00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3ec00 session 0x556f3518cf00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c800 session 0x556f3518d4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32618800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32618800 session 0x556f33c19860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f326052c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166289408 unmapped: 67026944 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f3579dc20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3ec00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3ec00 session 0x556f3579d680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c800 session 0x556f35cbb2c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34126000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34126000 session 0x556f35cbad20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:47.715102+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166297600 unmapped: 67018752 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:48.715293+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f69d1000/0x0/0x4ffc00000, data 0x2c79545/0x2d4b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166297600 unmapped: 67018752 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:49.715462+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166297600 unmapped: 67018752 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:50.715710+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166297600 unmapped: 67018752 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1834043 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:51.715902+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35cba3c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f69d1000/0x0/0x4ffc00000, data 0x2c79545/0x2d4b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166297600 unmapped: 67018752 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:52.716129+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166297600 unmapped: 67018752 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:53.716356+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166297600 unmapped: 67018752 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f34390000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:54.716565+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166297600 unmapped: 67018752 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:55.716758+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3ec00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3ec00 session 0x556f34391e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.666287422s of 14.798958778s, submitted: 305
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c800 session 0x556f34390b40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166559744 unmapped: 66756608 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1835627 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f69aa000/0x0/0x4ffc00000, data 0x2ca0545/0x2d72000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:56.716899+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 166559744 unmapped: 66756608 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:57.717092+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 167698432 unmapped: 65617920 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:58.717327+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 168681472 unmapped: 64634880 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:29:59.717549+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 168681472 unmapped: 64634880 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:00.717736+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f69aa000/0x0/0x4ffc00000, data 0x2ca0545/0x2d72000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 168681472 unmapped: 64634880 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1910411 data_alloc: 234881024 data_used: 13582336
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:01.717990+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 168689664 unmapped: 64626688 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:02.718177+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f69aa000/0x0/0x4ffc00000, data 0x2ca0545/0x2d72000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 168689664 unmapped: 64626688 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:03.718407+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 168689664 unmapped: 64626688 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:04.718672+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f69aa000/0x0/0x4ffc00000, data 0x2ca0545/0x2d72000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x64df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 168689664 unmapped: 64626688 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:05.718819+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 168689664 unmapped: 64626688 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1910411 data_alloc: 234881024 data_used: 13582336
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:06.718992+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 168689664 unmapped: 64626688 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.639043808s of 11.646484375s, submitted: 1
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:07.719138+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175382528 unmapped: 57933824 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5c00 session 0x556f3595ad20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f3662b860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f35968000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3ec00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3ec00 session 0x556f35968f00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:08.719279+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5c00 session 0x556f35969c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c800 session 0x556f35969680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35cd1680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f3579cd20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3ec00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3ec00 session 0x556f35e385a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174538752 unmapped: 58777600 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:09.719406+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3fda000/0x0/0x4ffc00000, data 0x44cf555/0x45a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174538752 unmapped: 58777600 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3fda000/0x0/0x4ffc00000, data 0x44cf555/0x45a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:10.719593+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174538752 unmapped: 58777600 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2111877 data_alloc: 234881024 data_used: 15482880
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:11.719810+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3fda000/0x0/0x4ffc00000, data 0x44cf555/0x45a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174538752 unmapped: 58777600 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:12.719975+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174538752 unmapped: 58777600 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:13.720155+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3fda000/0x0/0x4ffc00000, data 0x44cf555/0x45a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174497792 unmapped: 58818560 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:14.720313+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174505984 unmapped: 58810368 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:15.720515+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174505984 unmapped: 58810368 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2111877 data_alloc: 234881024 data_used: 15482880
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:16.720711+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3fbb000/0x0/0x4ffc00000, data 0x44ee555/0x45c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174505984 unmapped: 58810368 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:17.720924+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5c00 session 0x556f35969860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174505984 unmapped: 58810368 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:18.721114+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174505984 unmapped: 58810368 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:19.721315+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3fbb000/0x0/0x4ffc00000, data 0x44ee555/0x45c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.256416321s of 12.646794319s, submitted: 143
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe4800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe4800 session 0x556f35b48f00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174505984 unmapped: 58810368 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:20.721453+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3fb1000/0x0/0x4ffc00000, data 0x44f8555/0x45cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174505984 unmapped: 58810368 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f32f523c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2112645 data_alloc: 234881024 data_used: 15482880
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:21.721612+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f342f83c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174505984 unmapped: 58810368 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3ec00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:22.721798+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174522368 unmapped: 58793984 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:23.721987+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174129152 unmapped: 59187200 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:24.722149+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3fb1000/0x0/0x4ffc00000, data 0x44f8555/0x45cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179109888 unmapped: 54206464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:25.722301+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179109888 unmapped: 54206464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2202082 data_alloc: 234881024 data_used: 28168192
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:26.722499+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179109888 unmapped: 54206464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:27.722683+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179109888 unmapped: 54206464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:28.722828+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179109888 unmapped: 54206464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:29.722949+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179109888 unmapped: 54206464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:30.723081+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3fae000/0x0/0x4ffc00000, data 0x44fb555/0x45ce000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179109888 unmapped: 54206464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:31.723212+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2202474 data_alloc: 234881024 data_used: 28168192
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3fae000/0x0/0x4ffc00000, data 0x44fb555/0x45ce000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3fae000/0x0/0x4ffc00000, data 0x44fb555/0x45ce000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179109888 unmapped: 54206464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:32.723341+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179109888 unmapped: 54206464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:33.723516+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.873428345s of 13.934272766s, submitted: 10
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183443456 unmapped: 49872896 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:34.723736+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183672832 unmapped: 49643520 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:35.723939+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184901632 unmapped: 48414720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:36.724128+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2282092 data_alloc: 234881024 data_used: 28364800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f367c000/0x0/0x4ffc00000, data 0x4e27555/0x4efa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184909824 unmapped: 48406528 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:37.724381+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f367c000/0x0/0x4ffc00000, data 0x4e27555/0x4efa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184909824 unmapped: 48406528 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:38.724772+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184909824 unmapped: 48406528 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:39.725011+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f367c000/0x0/0x4ffc00000, data 0x4e27555/0x4efa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184918016 unmapped: 48398336 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:40.725230+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184918016 unmapped: 48398336 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:41.725445+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2280940 data_alloc: 234881024 data_used: 28364800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184918016 unmapped: 48398336 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:42.725667+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184918016 unmapped: 48398336 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:43.725835+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184918016 unmapped: 48398336 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:44.726031+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365e000/0x0/0x4ffc00000, data 0x4e4b555/0x4f1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184926208 unmapped: 48390144 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:45.726187+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184926208 unmapped: 48390144 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:46.726334+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2280940 data_alloc: 234881024 data_used: 28364800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365e000/0x0/0x4ffc00000, data 0x4e4b555/0x4f1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184926208 unmapped: 48390144 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:47.726537+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184934400 unmapped: 48381952 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:48.726728+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184934400 unmapped: 48381952 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:49.726941+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365e000/0x0/0x4ffc00000, data 0x4e4b555/0x4f1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184942592 unmapped: 48373760 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:50.727188+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184942592 unmapped: 48373760 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:51.727366+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2280940 data_alloc: 234881024 data_used: 28364800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184942592 unmapped: 48373760 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:52.727571+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184942592 unmapped: 48373760 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:53.727819+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365e000/0x0/0x4ffc00000, data 0x4e4b555/0x4f1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.971578598s of 20.309820175s, submitted: 120
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184942592 unmapped: 48373760 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:54.728009+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184942592 unmapped: 48373760 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:55.728204+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184958976 unmapped: 48357376 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:56.728375+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2281116 data_alloc: 234881024 data_used: 28364800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365b000/0x0/0x4ffc00000, data 0x4e4e555/0x4f21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184958976 unmapped: 48357376 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:57.728662+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184967168 unmapped: 48349184 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:58.728852+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184967168 unmapped: 48349184 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:30:59.729028+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184967168 unmapped: 48349184 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:00.729148+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184967168 unmapped: 48349184 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:01.729359+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2281116 data_alloc: 234881024 data_used: 28364800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184967168 unmapped: 48349184 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:02.729598+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365b000/0x0/0x4ffc00000, data 0x4e4e555/0x4f21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184975360 unmapped: 48340992 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:03.729862+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184983552 unmapped: 48332800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:04.730022+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184983552 unmapped: 48332800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:05.730206+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.768951416s of 11.776761055s, submitted: 2
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365b000/0x0/0x4ffc00000, data 0x4e4e555/0x4f21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184983552 unmapped: 48332800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:06.730406+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2281116 data_alloc: 234881024 data_used: 28364800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184983552 unmapped: 48332800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:07.730655+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184991744 unmapped: 48324608 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:08.730813+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184991744 unmapped: 48324608 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:09.730988+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184991744 unmapped: 48324608 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:10.731130+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184991744 unmapped: 48324608 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:11.731315+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2281116 data_alloc: 234881024 data_used: 28364800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365b000/0x0/0x4ffc00000, data 0x4e4e555/0x4f21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185008128 unmapped: 48308224 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:12.731579+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe4400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185008128 unmapped: 48308224 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:13.731792+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365b000/0x0/0x4ffc00000, data 0x4e4e555/0x4f21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3ec00 session 0x556f33c9f4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5c00 session 0x556f36632000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185008128 unmapped: 48308224 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:14.733727+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185008128 unmapped: 48308224 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:15.733923+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185008128 unmapped: 48308224 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:16.734092+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2280612 data_alloc: 234881024 data_used: 28364800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185016320 unmapped: 48300032 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:17.734269+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365b000/0x0/0x4ffc00000, data 0x4e4e555/0x4f21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185016320 unmapped: 48300032 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:18.734391+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185016320 unmapped: 48300032 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:19.734528+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185024512 unmapped: 48291840 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:20.734680+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185024512 unmapped: 48291840 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:21.734818+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2280612 data_alloc: 234881024 data_used: 28364800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:22.735015+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185024512 unmapped: 48291840 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:23.735190+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185032704 unmapped: 48283648 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365b000/0x0/0x4ffc00000, data 0x4e4e555/0x4f21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:24.735322+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185032704 unmapped: 48283648 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365b000/0x0/0x4ffc00000, data 0x4e4e555/0x4f21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:25.735470+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185032704 unmapped: 48283648 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365b000/0x0/0x4ffc00000, data 0x4e4e555/0x4f21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:26.735649+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185040896 unmapped: 48275456 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2280612 data_alloc: 234881024 data_used: 28364800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:27.735973+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185040896 unmapped: 48275456 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:28.736143+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185049088 unmapped: 48267264 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365b000/0x0/0x4ffc00000, data 0x4e4e555/0x4f21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:29.736296+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185049088 unmapped: 48267264 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe4400 session 0x556f35b48780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.712896347s of 23.732158661s, submitted: 5
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5400 session 0x556f35b56960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f365b000/0x0/0x4ffc00000, data 0x4e4e555/0x4f21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:30.736517+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185049088 unmapped: 48267264 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f3518d860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:31.736719+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 54894592 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2028610 data_alloc: 234881024 data_used: 15478784
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:32.736899+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 54894592 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c4c000/0x0/0x4ffc00000, data 0x385e545/0x3930000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:33.737027+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 54894592 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4c4c000/0x0/0x4ffc00000, data 0x385e545/0x3930000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:34.737214+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 54894592 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:35.737373+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 54894592 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:36.737573+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 54894592 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f36632f00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5800 session 0x556f342f8000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2028538 data_alloc: 234881024 data_used: 15478784
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34343400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:37.737840+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169607168 unmapped: 63709184 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34343400 session 0x556f35b57c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:38.738022+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f62e1000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169607168 unmapped: 63709184 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:39.738230+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:40.738440+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:41.738675+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1779908 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:42.738911+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f62e1000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:43.739130+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:44.739366+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:45.739602+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f62e1000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:46.739885+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1779908 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f62e1000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:47.740133+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:48.740409+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f62e1000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:49.740735+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:50.740956+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:51.741156+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1779908 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:52.741365+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:53.741596+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f32f530e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f34c8e5a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5400 session 0x556f339ccf00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5800 session 0x556f35cd05a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3ec00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.986249924s of 24.147270203s, submitted: 60
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3ec00 session 0x556f35cd01e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f3594ba40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f35b48b40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5400 session 0x556f32e043c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5800 session 0x556f3595a1e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:54.741859+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5df1000/0x0/0x4ffc00000, data 0x26ba4e3/0x278b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:55.742055+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:56.742225+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1826835 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:57.742484+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5df1000/0x0/0x4ffc00000, data 0x26ba4e3/0x278b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:58.742692+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5c00 session 0x556f3660c1e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:59.742902+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:00.743077+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:01.743296+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1826835 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f3660c780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5df1000/0x0/0x4ffc00000, data 0x26ba4e3/0x278b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:02.743505+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f3660cb40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5400 session 0x556f3595ad20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:03.743702+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169877504 unmapped: 63438848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.096412659s of 10.222342491s, submitted: 26
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:04.743953+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169877504 unmapped: 63438848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:05.744208+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:06.744413+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1866076 data_alloc: 218103808 data_used: 7753728
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:07.744717+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5dca000/0x0/0x4ffc00000, data 0x26e14e3/0x27b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:08.744891+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:09.745101+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:10.745310+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5dca000/0x0/0x4ffc00000, data 0x26e14e3/0x27b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:11.745516+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1866076 data_alloc: 218103808 data_used: 7753728
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:12.745762+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:13.745972+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:14.746168+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.062009811s of 11.069645882s, submitted: 2
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:15.746338+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174768128 unmapped: 58548224 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:16.746542+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175046656 unmapped: 58269696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f359c5800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1995228 data_alloc: 218103808 data_used: 8646656
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f359c5800 session 0x556f35cd03c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3bfc000/0x0/0x4ffc00000, data 0x37074e3/0x37d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:17.746748+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175046656 unmapped: 58269696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:18.746915+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175046656 unmapped: 58269696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:19.747075+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175046656 unmapped: 58269696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:20.747261+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175046656 unmapped: 58269696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:21.747445+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175046656 unmapped: 58269696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2011778 data_alloc: 218103808 data_used: 8646656
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3bfc000/0x0/0x4ffc00000, data 0x37074e3/0x37d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:22.747658+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174923776 unmapped: 58392576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:23.747868+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174923776 unmapped: 58392576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:24.748030+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174923776 unmapped: 58392576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:25.748221+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174923776 unmapped: 58392576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:26.748385+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3c01000/0x0/0x4ffc00000, data 0x370a4e3/0x37db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f359c5000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.202202797s of 11.535426140s, submitted: 133
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f359c5000 session 0x556f359883c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175259648 unmapped: 58056704 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2009643 data_alloc: 218103808 data_used: 8646656
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f359c5800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:27.748599+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175267840 unmapped: 58048512 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3bda000/0x0/0x4ffc00000, data 0x37314e3/0x3802000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:28.748819+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175267840 unmapped: 58048512 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:29.749004+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175267840 unmapped: 58048512 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:30.749226+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175267840 unmapped: 58048512 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:31.749394+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175267840 unmapped: 58048512 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2013271 data_alloc: 218103808 data_used: 9179136
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3bda000/0x0/0x4ffc00000, data 0x37314e3/0x3802000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:32.749730+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3bda000/0x0/0x4ffc00000, data 0x37314e3/0x3802000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175267840 unmapped: 58048512 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:33.749949+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175276032 unmapped: 58040320 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:34.750142+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175276032 unmapped: 58040320 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:35.750303+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175276032 unmapped: 58040320 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:36.750506+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175276032 unmapped: 58040320 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2013271 data_alloc: 218103808 data_used: 9179136
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:37.750804+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175276032 unmapped: 58040320 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3bda000/0x0/0x4ffc00000, data 0x37314e3/0x3802000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.478668213s of 11.507327080s, submitted: 7
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:38.750958+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177995776 unmapped: 55320576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:39.751128+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178151424 unmapped: 55164928 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:40.751310+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178151424 unmapped: 55164928 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3553000/0x0/0x4ffc00000, data 0x3db74e3/0x3e88000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:41.751529+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178151424 unmapped: 55164928 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2079641 data_alloc: 218103808 data_used: 9338880
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:42.751718+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178151424 unmapped: 55164928 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:43.751936+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178151424 unmapped: 55164928 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:44.778856+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:45.779019+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3551000/0x0/0x4ffc00000, data 0x3dba4e3/0x3e8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3551000/0x0/0x4ffc00000, data 0x3dba4e3/0x3e8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:46.779170+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3551000/0x0/0x4ffc00000, data 0x3dba4e3/0x3e8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076729 data_alloc: 218103808 data_used: 9338880
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:47.779377+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:48.779539+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:49.779735+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:50.779956+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbb4e3/0x3e8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:51.780190+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076865 data_alloc: 218103808 data_used: 9338880
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:52.780463+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbb4e3/0x3e8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:53.780709+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:54.780889+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:55.781049+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:56.781216+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076865 data_alloc: 218103808 data_used: 9338880
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:57.781377+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbb4e3/0x3e8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:58.781549+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:59.781703+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:00.781836+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbb4e3/0x3e8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:01.782008+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076865 data_alloc: 218103808 data_used: 9338880
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:02.782175+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 24.340782166s of 24.475608826s, submitted: 50
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177979392 unmapped: 55336960 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:03.782332+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177979392 unmapped: 55336960 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3550000/0x0/0x4ffc00000, data 0x3dbb4e3/0x3e8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:04.782517+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:05.782672+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:06.782824+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2079033 data_alloc: 218103808 data_used: 9326592
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:07.783018+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:08.783130+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:09.783291+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3550000/0x0/0x4ffc00000, data 0x3dbb4e3/0x3e8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:10.783453+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:11.783713+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354e000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177995776 unmapped: 55320576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2077153 data_alloc: 218103808 data_used: 9326592
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:12.783919+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177995776 unmapped: 55320576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:13.784097+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177995776 unmapped: 55320576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:14.784289+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177995776 unmapped: 55320576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354e000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:15.784454+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:16.784678+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2077153 data_alloc: 218103808 data_used: 9326592
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:17.784956+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:18.785177+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354e000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:19.785361+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.368265152s of 17.390548706s, submitted: 16
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:20.785494+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5800 session 0x556f3660de00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5000 session 0x556f35cd14a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:21.785684+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076725 data_alloc: 218103808 data_used: 9326592
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:22.785880+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:23.786041+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:24.786264+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:25.786454+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:26.786828+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076725 data_alloc: 218103808 data_used: 9326592
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:27.787038+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:28.787268+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:29.787470+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:30.787680+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:31.787800+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076725 data_alloc: 218103808 data_used: 9326592
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:32.788179+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:33.788376+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:34.788592+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178053120 unmapped: 55263232 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:35.788843+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:36.788999+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076725 data_alloc: 218103808 data_used: 9326592
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:37.789207+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:38.789415+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:39.789766+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-mon[75484]: from='client.27445 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:44 compute-1 ceph-mon[75484]: from='client.27453 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2571273096' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Sep 30 18:55:44 compute-1 ceph-mon[75484]: from='client.19004 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:44 compute-1 ceph-mon[75484]: pgmap v2327: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:55:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2007948442' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:40.789962+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:41.790170+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076725 data_alloc: 218103808 data_used: 9326592
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:42.790348+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:43.790547+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:44.790751+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:45.790954+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:46.791179+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076725 data_alloc: 218103808 data_used: 9326592
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38303000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3f800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 27.392583847s of 27.433944702s, submitted: 7
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:47.791460+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:48.791740+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35ef1680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f359c5800 session 0x556f35c370e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:49.791945+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:50.792091+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:51.792234+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076877 data_alloc: 218103808 data_used: 9330688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:52.792469+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:53.792692+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:54.792912+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:55.793126+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:56.793303+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076877 data_alloc: 218103808 data_used: 9330688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:57.793499+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:58.793698+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:59.793834+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178085888 unmapped: 55230464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:00.794012+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178085888 unmapped: 55230464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:01.794176+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178085888 unmapped: 55230464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076877 data_alloc: 218103808 data_used: 9330688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:02.794342+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178085888 unmapped: 55230464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:03.794508+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f38303000 session 0x556f33e23e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.728868484s of 16.749923706s, submitted: 3
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3f800 session 0x556f32f4f4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178085888 unmapped: 55230464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:04.794766+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38303000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178094080 unmapped: 55222272 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:05.794920+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f38303000 session 0x556f35b56b40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178118656 unmapped: 55197696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:06.795142+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178118656 unmapped: 55197696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2000073 data_alloc: 218103808 data_used: 8634368
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:07.795371+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178118656 unmapped: 55197696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:08.795560+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3d97000/0x0/0x4ffc00000, data 0x35744e3/0x3645000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178118656 unmapped: 55197696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:09.795735+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178118656 unmapped: 55197696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:10.795907+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178118656 unmapped: 55197696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f35968780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5400 session 0x556f3579de00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:11.796068+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3d97000/0x0/0x4ffc00000, data 0x35744e3/0x3645000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178126848 unmapped: 55189504 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1814213 data_alloc: 218103808 data_used: 2777088
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35cba000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:12.796210+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3d98000/0x0/0x4ffc00000, data 0x35744d3/0x3644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:13.796353+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:14.796515+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5142000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:15.796710+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:16.796877+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1807005 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:17.797143+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5142000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5142000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:18.797308+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:19.797495+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:20.797712+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:21.797885+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1807005 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:22.798085+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:23.798257+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5142000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:24.798437+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:25.798577+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5142000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:26.798749+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5142000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1807005 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:27.798946+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:28.799105+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:29.799257+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:30.799414+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35ef1c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3f800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3f800 session 0x556f361fb680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f36632f00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f35ef05a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:31.799520+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35815400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 27.056009293s of 27.376161575s, submitted: 74
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35815400 session 0x556f35b56960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4610000/0x0/0x4ffc00000, data 0x2cfc4d3/0x2dcc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1890859 data_alloc: 218103808 data_used: 2658304
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:32.799672+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4610000/0x0/0x4ffc00000, data 0x2cfc4d3/0x2dcc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:33.799798+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:34.799954+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:35.800113+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4610000/0x0/0x4ffc00000, data 0x2cfc4d3/0x2dcc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:36.800307+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1891011 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:37.800569+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:38.800754+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:39.800927+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4610000/0x0/0x4ffc00000, data 0x2cfc4d3/0x2dcc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:40.801080+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:41.801255+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:42.801421+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1891011 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.036623001s of 11.114352226s, submitted: 13
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f32f550e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174587904 unmapped: 66600960 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:43.801647+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174587904 unmapped: 66600960 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f45e9000/0x0/0x4ffc00000, data 0x2d234d3/0x2df3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:44.801838+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3f800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174628864 unmapped: 66560000 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:45.801993+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:46.802164+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:47.802398+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1972299 data_alloc: 234881024 data_used: 14307328
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:48.802581+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:49.802771+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f45e9000/0x0/0x4ffc00000, data 0x2d234d3/0x2df3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:50.802918+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:51.803141+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:52.803290+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1972299 data_alloc: 234881024 data_used: 14307328
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:53.803515+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:54.803720+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.263117790s of 12.270321846s, submitted: 1
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 182788096 unmapped: 58400768 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:55.803876+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4122000/0x0/0x4ffc00000, data 0x31ea4d3/0x32ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [0,0,0,0,0,0,0,11])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178642944 unmapped: 62545920 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:56.804060+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f35cd01e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d36800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d36800 session 0x556f3594b4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c000 session 0x556f3595bc20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f35b49a40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 191184896 unmapped: 50003968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:57.804275+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2151535 data_alloc: 234881024 data_used: 15806464
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35969a40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:58.804493+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:59.804721+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:00.804923+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:01.805094+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e7a000/0x0/0x4ffc00000, data 0x44924d3/0x4562000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:02.805278+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2159251 data_alloc: 234881024 data_used: 15892480
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:03.805466+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:04.805699+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e7a000/0x0/0x4ffc00000, data 0x44924d3/0x4562000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:05.805857+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:06.806064+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:07.806296+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2159251 data_alloc: 234881024 data_used: 15892480
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e7a000/0x0/0x4ffc00000, data 0x44924d3/0x4562000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:08.806415+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d36800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.111789703s of 13.742190361s, submitted: 80
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d36800 session 0x556f35b48000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179208192 unmapped: 61980672 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:09.806568+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179208192 unmapped: 61980672 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:10.806772+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e52000/0x0/0x4ffc00000, data 0x44b94f6/0x458a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e52000/0x0/0x4ffc00000, data 0x44b94f6/0x458a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178962432 unmapped: 62226432 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:11.806955+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185188352 unmapped: 56000512 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:12.807131+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2246248 data_alloc: 234881024 data_used: 27807744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186089472 unmapped: 55099392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:13.807362+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186089472 unmapped: 55099392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:14.807516+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186089472 unmapped: 55099392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:15.807687+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186089472 unmapped: 55099392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:16.807836+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e52000/0x0/0x4ffc00000, data 0x44b94f6/0x458a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186105856 unmapped: 55083008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:17.808011+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2246248 data_alloc: 234881024 data_used: 27807744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186105856 unmapped: 55083008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:18.808229+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e52000/0x0/0x4ffc00000, data 0x44b94f6/0x458a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186105856 unmapped: 55083008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:19.808433+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186105856 unmapped: 55083008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:20.808663+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186105856 unmapped: 55083008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:21.808807+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.740138054s of 12.930875778s, submitted: 5
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 191135744 unmapped: 50053120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:22.808955+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2327738 data_alloc: 234881024 data_used: 29638656
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193118208 unmapped: 48070656 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:23.809097+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215b000/0x0/0x4ffc00000, data 0x4d984f6/0x4e69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:24.809299+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193118208 unmapped: 48070656 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:25.809468+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193118208 unmapped: 48070656 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:26.809659+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193118208 unmapped: 48070656 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:27.809844+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193118208 unmapped: 48070656 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2340190 data_alloc: 251658240 data_used: 29822976
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:28.810074+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193118208 unmapped: 48070656 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2159000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:29.810253+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:30.810382+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:31.810562+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:32.810757+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2334846 data_alloc: 251658240 data_used: 29822976
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:33.810912+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:34.811129+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:35.811337+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:36.811562+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:37.811896+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2334846 data_alloc: 251658240 data_used: 29822976
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:38.812164+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:39.812377+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:40.812692+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:41.812855+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:42.813079+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193150976 unmapped: 48037888 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2334846 data_alloc: 251658240 data_used: 29822976
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:43.813273+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193150976 unmapped: 48037888 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:44.813409+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193150976 unmapped: 48037888 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:45.813605+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193159168 unmapped: 48029696 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:46.813838+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193159168 unmapped: 48029696 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:47.814060+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193159168 unmapped: 48029696 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2334846 data_alloc: 251658240 data_used: 29822976
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:48.814253+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193159168 unmapped: 48029696 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:49.814458+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193159168 unmapped: 48029696 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:50.814694+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:51.814884+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:52.815082+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2334846 data_alloc: 251658240 data_used: 29822976
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:53.815310+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 31.951963425s of 32.214107513s, submitted: 106
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:54.815501+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215f000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:55.815661+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215f000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:56.815914+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:57.816227+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2335150 data_alloc: 251658240 data_used: 29822976
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:58.816377+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f33c9e5a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f35b574a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:59.816581+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215f000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:00.816952+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:01.817187+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:02.817728+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2335150 data_alloc: 251658240 data_used: 29822976
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:03.818290+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215f000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:04.818487+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215f000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215f000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:05.818829+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193200128 unmapped: 47988736 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:06.819144+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193200128 unmapped: 47988736 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:07.819343+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193200128 unmapped: 47988736 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2335150 data_alloc: 251658240 data_used: 29822976
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:08.819542+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193200128 unmapped: 47988736 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.455149651s of 15.468193054s, submitted: 3
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:09.819732+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193232896 unmapped: 47955968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:10.819937+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193232896 unmapped: 47955968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:11.820210+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193232896 unmapped: 47955968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:12.820429+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193232896 unmapped: 47955968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:13.820790+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193249280 unmapped: 47939584 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:14.821018+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193249280 unmapped: 47939584 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:15.821204+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193249280 unmapped: 47939584 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:16.821353+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193257472 unmapped: 47931392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:17.821656+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193257472 unmapped: 47931392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:18.821885+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193257472 unmapped: 47931392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:19.822107+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193257472 unmapped: 47931392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:20.822319+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193273856 unmapped: 47915008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:21.822529+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193273856 unmapped: 47915008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:22.822718+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193273856 unmapped: 47915008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:23.822875+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193273856 unmapped: 47915008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:24.823104+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193290240 unmapped: 47898624 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:25.823290+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193306624 unmapped: 47882240 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:26.823572+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35040000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193323008 unmapped: 47865856 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:27.823920+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193323008 unmapped: 47865856 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:28.824146+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3f800 session 0x556f35968d20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f35e39a40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193323008 unmapped: 47865856 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:29.824344+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193323008 unmapped: 47865856 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:30.824534+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193323008 unmapped: 47865856 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:31.824724+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:32.824888+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:33.825098+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:34.825559+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:35.825754+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:36.826716+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:37.827133+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:38.827718+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:39.828236+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193339392 unmapped: 47849472 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:40.828712+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193339392 unmapped: 47849472 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:41.829134+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193339392 unmapped: 47849472 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:42.829393+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f35cba5a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c000 session 0x556f32604f00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193339392 unmapped: 47849472 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:43.829571+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 34.486000061s of 34.496078491s, submitted: 13
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187080704 unmapped: 54108160 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f32f481e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:44.829819+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:45.830149+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f36a8000/0x0/0x4ffc00000, data 0x38534d3/0x3923000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:46.830327+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f36a8000/0x0/0x4ffc00000, data 0x38534d3/0x3923000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:47.830689+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2078550 data_alloc: 234881024 data_used: 15892480
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:48.830948+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:49.831156+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f36a8000/0x0/0x4ffc00000, data 0x38534d3/0x3923000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:50.831389+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:51.831553+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:52.831706+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35040000 session 0x556f356094a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2078550 data_alloc: 234881024 data_used: 15892480
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35040000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:53.832041+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.035298347s of 10.127894402s, submitted: 34
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35040000 session 0x556f343910e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:54.832335+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:55.832572+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:56.832898+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:57.833201+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:58.833530+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:59.833823+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:00.834070+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:01.834429+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:02.834790+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:03.835003+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:04.835222+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:05.835431+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:06.835716+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:07.835956+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:08.836106+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:09.836332+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:10.836495+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:11.836706+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:12.836865+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:13.836987+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:14.837178+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:15.837325+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:16.837492+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:17.837664+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:18.837815+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:19.838025+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:20.838251+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:21.838407+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:22.838645+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:23.838837+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:24.838985+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:25.839142+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:26.839373+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:27.839665+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:28.839858+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:29.840017+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:30.840152+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:31.840404+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:32.840572+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:33.840745+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175661056 unmapped: 65527808 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:34.840926+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175661056 unmapped: 65527808 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:35.841107+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175661056 unmapped: 65527808 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:36.841318+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175661056 unmapped: 65527808 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:37.841605+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175661056 unmapped: 65527808 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:38.841785+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35cd0780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f361fb4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f3594a960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175661056 unmapped: 65527808 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c000 session 0x556f35b48780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 45.478294373s of 45.484821320s, submitted: 2
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:39.841990+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35cbb4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f326050e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f36633a40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35040000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4136000/0x0/0x4ffc00000, data 0x2dc450c/0x2e96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [0,0,1])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35040000 session 0x556f33e23e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d36800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d36800 session 0x556f32f4f4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:40.842179+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4136000/0x0/0x4ffc00000, data 0x2dc4545/0x2e96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:41.842372+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:42.842562+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1926394 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:43.842796+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35b80d20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:44.843002+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:45.843169+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4136000/0x0/0x4ffc00000, data 0x2dc4545/0x2e96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:46.843448+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f35609680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:47.843706+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f35608d20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35040000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176930816 unmapped: 67936256 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35040000 session 0x556f35609a40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1928373 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:48.843871+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176939008 unmapped: 67928064 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:49.844035+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176939008 unmapped: 67928064 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:50.844180+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4134000/0x0/0x4ffc00000, data 0x2dc4578/0x2e98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:51.844378+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4134000/0x0/0x4ffc00000, data 0x2dc4578/0x2e98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 3600.1 total, 600.0 interval
                                           Cumulative writes: 18K writes, 73K keys, 18K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.02 MB/s
                                           Cumulative WAL: 18K writes, 5879 syncs, 3.22 writes per sync, written: 0.07 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2910 writes, 11K keys, 2910 commit groups, 1.0 writes per commit group, ingest: 12.65 MB, 0.02 MB/s
                                           Interval WAL: 2910 writes, 1118 syncs, 2.60 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread fragmentation_score=0.000897 took=0.000048s
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:52.844589+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2006025 data_alloc: 234881024 data_used: 14098432
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:53.844926+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4134000/0x0/0x4ffc00000, data 0x2dc4578/0x2e98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:54.845141+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4134000/0x0/0x4ffc00000, data 0x2dc4578/0x2e98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:55.845348+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:56.845570+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:57.845804+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2006025 data_alloc: 234881024 data_used: 14098432
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:58.846057+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:59.846277+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.143117905s of 20.674848557s, submitted: 50
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183222272 unmapped: 61644800 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:00.846464+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4134000/0x0/0x4ffc00000, data 0x2dc4578/0x2e98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [0,0,0,0,0,0,0,7,29,13])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186425344 unmapped: 58441728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:01.846677+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d01c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32d01c00 session 0x556f35e392c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35cd1e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d01c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32d01c00 session 0x556f32604000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186499072 unmapped: 58368000 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f34c8f0e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:02.846948+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f35968b40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35040000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35040000 session 0x556f366334a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f3580cd20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d01c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32d01c00 session 0x556f3660c000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186433536 unmapped: 58433536 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f36632d20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34ca000/0x0/0x4ffc00000, data 0x3a2c5ea/0x3b02000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2112184 data_alloc: 234881024 data_used: 14172160
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:03.847149+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186433536 unmapped: 58433536 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:04.847304+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186433536 unmapped: 58433536 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:05.847491+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f3594be00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186433536 unmapped: 58433536 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:06.847678+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186433536 unmapped: 58433536 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:07.847930+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34ca000/0x0/0x4ffc00000, data 0x3a2c5ea/0x3b02000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186433536 unmapped: 58433536 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2111048 data_alloc: 234881024 data_used: 14172160
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:08.848146+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c43c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c43c00 session 0x556f32b1f0e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186449920 unmapped: 58417152 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:09.848476+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f34391860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d01c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32d01c00 session 0x556f32b1ed20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:10.848686+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:11.848913+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34a1000/0x0/0x4ffc00000, data 0x3a555ea/0x3b2b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:12.849086+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:13.849273+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2118024 data_alloc: 234881024 data_used: 14798848
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.214136124s of 13.708144188s, submitted: 134
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:14.849463+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34a0000/0x0/0x4ffc00000, data 0x3a565ea/0x3b2c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:15.849632+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:16.849849+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:17.850079+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:18.850238+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2118200 data_alloc: 234881024 data_used: 14798848
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34a0000/0x0/0x4ffc00000, data 0x3a565ea/0x3b2c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:19.850424+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34a0000/0x0/0x4ffc00000, data 0x3a565ea/0x3b2c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:20.850550+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34a0000/0x0/0x4ffc00000, data 0x3a565ea/0x3b2c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:21.850673+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:22.850825+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187023360 unmapped: 57843712 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:23.851046+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2177924 data_alloc: 234881024 data_used: 15036416
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.751893997s of 10.002759933s, submitted: 63
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:24.851209+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:25.851398+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3f000/0x0/0x4ffc00000, data 0x41b75ea/0x428d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:26.851584+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:27.851892+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:28.852030+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2190656 data_alloc: 234881024 data_used: 15032320
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:29.852160+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3f000/0x0/0x4ffc00000, data 0x41b75ea/0x428d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:30.852313+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:31.852531+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d1b000/0x0/0x4ffc00000, data 0x41db5ea/0x42b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:32.852747+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:33.852966+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189432 data_alloc: 234881024 data_used: 15036416
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:34.853167+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:35.853379+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:36.853538+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:37.853774+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d1b000/0x0/0x4ffc00000, data 0x41db5ea/0x42b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:38.853955+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189432 data_alloc: 234881024 data_used: 15036416
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:39.854157+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:40.854306+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:41.854547+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.831987381s of 17.884922028s, submitted: 24
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:42.854708+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:43.854886+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189440 data_alloc: 234881024 data_used: 15036416
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:44.855045+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:45.855205+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:46.855382+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:47.855603+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:48.855995+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189440 data_alloc: 234881024 data_used: 15036416
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:49.856191+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:50.856362+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:51.856553+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:52.856725+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:53.856911+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189440 data_alloc: 234881024 data_used: 15036416
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.778059959s of 12.799883842s, submitted: 3
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:54.857069+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:55.857283+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:56.857520+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:57.857739+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:58.857879+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f3662be00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f34c8e780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189076 data_alloc: 234881024 data_used: 15044608
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:59.858078+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:00.858244+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:01.858389+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:02.858540+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:03.858748+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189076 data_alloc: 234881024 data_used: 15044608
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:04.858925+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:05.859137+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:06.859356+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:07.860796+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:08.861124+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187777024 unmapped: 57090048 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189076 data_alloc: 234881024 data_used: 15044608
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:09.861265+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187777024 unmapped: 57090048 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:10.861448+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187777024 unmapped: 57090048 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:11.861658+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.156997681s of 17.189851761s, submitted: 8
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f356092c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f3580cd20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187785216 unmapped: 57081856 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:12.861799+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35074b40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:13.861990+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3531000/0x0/0x4ffc00000, data 0x3964578/0x3a38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2108730 data_alloc: 234881024 data_used: 14180352
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:14.862164+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:15.862382+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:16.862593+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3531000/0x0/0x4ffc00000, data 0x3964578/0x3a38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:17.862917+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3531000/0x0/0x4ffc00000, data 0x3964578/0x3a38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:18.863085+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2108730 data_alloc: 234881024 data_used: 14180352
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:19.863260+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:20.863449+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3531000/0x0/0x4ffc00000, data 0x3964578/0x3a38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:21.863700+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:22.863868+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.129077911s of 11.197218895s, submitted: 23
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41c00 session 0x556f3595ad20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f3660dc20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:23.864112+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d01c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1864004 data_alloc: 218103808 data_used: 2670592
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32d01c00 session 0x556f35c363c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:24.864329+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:25.864511+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:26.864783+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:27.865135+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:28.865412+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1860199 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:29.865679+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:30.865941+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:31.866241+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489000 session 0x556f3662b0e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d01c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489400 session 0x556f33e234a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:32.866569+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:33.866713+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489800 session 0x556f33e23680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1860199 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:34.866919+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:35.867179+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:36.867367+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:37.867592+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:38.867793+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1860199 data_alloc: 218103808 data_used: 2662400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:39.868010+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.397924423s of 17.578964233s, submitted: 48
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:40.868160+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180068352 unmapped: 64798720 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:41.868411+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180092928 unmapped: 64774144 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:42.868755+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180092928 unmapped: 64774144 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:43.868929+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180199424 unmapped: 64667648 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:44.869114+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180264960 unmapped: 64602112 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:45.869370+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180264960 unmapped: 64602112 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:46.869572+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180264960 unmapped: 64602112 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:47.869843+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:48.870022+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:49.870221+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:50.870464+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:51.870642+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:52.870849+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:53.871046+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:54.871386+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:55.871592+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:56.871783+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:57.872033+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:58.872173+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:59.872366+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:00.872717+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:01.872964+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:02.873115+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:03.873345+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:04.873556+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:05.873785+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:06.874038+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:07.874339+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:08.874611+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:09.874908+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:10.875188+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:11.875520+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:12.875813+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:13.876018+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41c00 session 0x556f34390000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f34390780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f33e221e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:14.876227+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f3518c1e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35982c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 32.987571716s of 34.327899933s, submitted: 323
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35982c00 session 0x556f3660de00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41c00 session 0x556f34391c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179527680 unmapped: 65339392 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f35e38d20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f35609680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f32e043c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:15.876399+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4879000/0x0/0x4ffc00000, data 0x2681544/0x2753000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:16.876714+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:17.876996+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:18.877153+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1906251 data_alloc: 218103808 data_used: 2723840
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:19.877350+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4879000/0x0/0x4ffc00000, data 0x2681544/0x2753000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:20.877539+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:21.877770+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:22.877937+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063400 session 0x556f35e394a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4879000/0x0/0x4ffc00000, data 0x2681544/0x2753000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179748864 unmapped: 65118208 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:23.878173+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179765248 unmapped: 65101824 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1912553 data_alloc: 218103808 data_used: 2723840
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:24.878366+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:25.878572+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:26.878689+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:27.878877+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4851000/0x0/0x4ffc00000, data 0x26a8567/0x277b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:28.878988+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1944777 data_alloc: 218103808 data_used: 7385088
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:29.879198+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:30.879412+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4851000/0x0/0x4ffc00000, data 0x26a8567/0x277b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:31.879582+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:32.879773+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4851000/0x0/0x4ffc00000, data 0x26a8567/0x277b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:33.879925+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.128231049s of 19.337366104s, submitted: 54
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 181485568 unmapped: 63381504 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1964585 data_alloc: 218103808 data_used: 7409664
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:34.880072+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186572800 unmapped: 58294272 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:35.880256+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186621952 unmapped: 58245120 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:36.880491+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186982400 unmapped: 57884672 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f32f4e000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f35b56d20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f35e38780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f366321e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:37.880745+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f3579d4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: mgrc ms_handle_reset ms_handle_reset con 0x556f377dd000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/2285351161
Sep 30 18:55:44 compute-1 ceph-osd[78006]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/2285351161,v1:192.168.122.100:6801/2285351161]
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: get_auth_request con 0x556f35982c00 auth_method 0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: mgrc handle_mgr_configure stats_period=5
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f33c9f0e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f32fc5c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f32fc41e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f359690e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:38.880975+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2190182 data_alloc: 218103808 data_used: 9437184
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2c0b000/0x0/0x4ffc00000, data 0x42ed576/0x43c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:39.881189+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2c0b000/0x0/0x4ffc00000, data 0x42ed576/0x43c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:40.881379+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:41.881562+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:42.881750+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:43.881952+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2bea000/0x0/0x4ffc00000, data 0x430e576/0x43e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2186966 data_alloc: 218103808 data_used: 9441280
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:44.882161+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:45.882389+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2bea000/0x0/0x4ffc00000, data 0x430e576/0x43e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:46.882687+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:47.882919+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:48.883109+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.168335915s of 14.641261101s, submitted: 161
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f3595ba40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2bea000/0x0/0x4ffc00000, data 0x430e576/0x43e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186392576 unmapped: 62668800 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189702 data_alloc: 218103808 data_used: 9441280
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:49.883322+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2bc3000/0x0/0x4ffc00000, data 0x4335576/0x4409000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186400768 unmapped: 62660608 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:50.883528+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192626688 unmapped: 56434688 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:51.883750+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:52.883940+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:53.884108+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2290090 data_alloc: 234881024 data_used: 24068096
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:54.884241+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2bc0000/0x0/0x4ffc00000, data 0x4338576/0x440c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:55.884405+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:56.884685+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:57.884875+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:58.885059+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2290090 data_alloc: 234881024 data_used: 24068096
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:59.885198+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2bc0000/0x0/0x4ffc00000, data 0x4338576/0x440c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193404928 unmapped: 55656448 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.603348732s of 11.623636246s, submitted: 4
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:00.885314+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200278016 unmapped: 48783360 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:01.885502+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198844416 unmapped: 50216960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1de5000/0x0/0x4ffc00000, data 0x5113576/0x51e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:02.885674+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198844416 unmapped: 50216960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:03.885876+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198852608 unmapped: 50208768 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2405546 data_alloc: 234881024 data_used: 25587712
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:04.886084+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198852608 unmapped: 50208768 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:05.886252+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198860800 unmapped: 50200576 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:06.886476+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1de5000/0x0/0x4ffc00000, data 0x5113576/0x51e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198860800 unmapped: 50200576 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:07.886740+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198860800 unmapped: 50200576 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:08.886929+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198860800 unmapped: 50200576 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2408242 data_alloc: 234881024 data_used: 25587712
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:09.887075+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbf000/0x0/0x4ffc00000, data 0x5138576/0x520c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198860800 unmapped: 50200576 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:10.887288+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbf000/0x0/0x4ffc00000, data 0x5138576/0x520c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198860800 unmapped: 50200576 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:11.887490+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198868992 unmapped: 50192384 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:12.887705+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198868992 unmapped: 50192384 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:13.887967+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbf000/0x0/0x4ffc00000, data 0x5138576/0x520c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198877184 unmapped: 50184192 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2408242 data_alloc: 234881024 data_used: 25587712
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:14.889046+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198877184 unmapped: 50184192 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:15.889236+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198877184 unmapped: 50184192 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:16.889408+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198877184 unmapped: 50184192 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:17.889663+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbf000/0x0/0x4ffc00000, data 0x5138576/0x520c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198877184 unmapped: 50184192 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:18.889898+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198885376 unmapped: 50176000 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2408242 data_alloc: 234881024 data_used: 25587712
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:19.890116+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.138525009s of 19.493839264s, submitted: 153
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbf000/0x0/0x4ffc00000, data 0x5138576/0x520c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198885376 unmapped: 50176000 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:20.900548+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198885376 unmapped: 50176000 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:21.900691+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198901760 unmapped: 50159616 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:22.905022+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198901760 unmapped: 50159616 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063400 session 0x556f35c365a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41c00 session 0x556f359681e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:23.906220+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198909952 unmapped: 50151424 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2406742 data_alloc: 234881024 data_used: 25587712
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:24.906739+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198909952 unmapped: 50151424 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:25.906892+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198909952 unmapped: 50151424 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:26.907067+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198909952 unmapped: 50151424 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:27.907323+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198918144 unmapped: 50143232 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:28.907456+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198918144 unmapped: 50143232 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2406742 data_alloc: 234881024 data_used: 25587712
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:29.907675+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198926336 unmapped: 50135040 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:30.907816+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198926336 unmapped: 50135040 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:31.907951+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.510564804s of 12.544851303s, submitted: 10
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198934528 unmapped: 50126848 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:32.908128+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198934528 unmapped: 50126848 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:33.908293+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198942720 unmapped: 50118656 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2407582 data_alloc: 234881024 data_used: 25587712
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:34.908494+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198942720 unmapped: 50118656 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:35.908799+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198950912 unmapped: 50110464 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:36.908946+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198950912 unmapped: 50110464 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:37.909191+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198959104 unmapped: 50102272 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:38.909306+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198959104 unmapped: 50102272 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2407582 data_alloc: 234881024 data_used: 25587712
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:39.909475+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198959104 unmapped: 50102272 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:40.909709+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198959104 unmapped: 50102272 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:41.909890+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f35e39680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f35e394a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198967296 unmapped: 50094080 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:42.910054+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.669682503s of 10.687623978s, submitted: 7
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f35cd01e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:43.910209+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2089802 data_alloc: 218103808 data_used: 9441280
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:44.910361+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:45.910560+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3a37000/0x0/0x4ffc00000, data 0x34c2567/0x3595000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:46.910754+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:47.910991+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:48.911164+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2089802 data_alloc: 218103808 data_used: 9441280
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:49.911287+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3a37000/0x0/0x4ffc00000, data 0x34c2567/0x3595000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f32605860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f3660d2c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:50.911513+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184254464 unmapped: 64806912 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:51.911665+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f32e05860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:52.911827+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:53.911959+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1897626 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:54.913264+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4a00000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:55.913497+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4a00000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:56.913685+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:57.913893+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:58.914046+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4a00000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1897626 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:59.914259+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4a00000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:00.914758+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:01.914940+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:02.915129+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:03.915356+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1897626 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:04.915555+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4a00000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:05.915838+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f35c363c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41c00 session 0x556f356094a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f339ccf00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f35e38960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 22.721506119s of 23.028820038s, submitted: 97
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f3660de00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:06.915996+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:07.916216+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:08.916410+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:09.916681+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1995012 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:10.916899+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f413b000/0x0/0x4ffc00000, data 0x2dc14d3/0x2e91000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:11.917137+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:12.917295+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:13.917491+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:14.917714+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1995012 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f413b000/0x0/0x4ffc00000, data 0x2dc14d3/0x2e91000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:15.917932+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:16.918136+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.657385826s of 10.751065254s, submitted: 22
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f35b561e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:17.918360+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184647680 unmapped: 64413696 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:18.918497+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184811520 unmapped: 64249856 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:19.918701+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2072750 data_alloc: 234881024 data_used: 13926400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f413a000/0x0/0x4ffc00000, data 0x2dc14f6/0x2e92000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:20.918914+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f413a000/0x0/0x4ffc00000, data 0x2dc14f6/0x2e92000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:21.919115+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:22.919290+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:23.919487+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f413a000/0x0/0x4ffc00000, data 0x2dc14f6/0x2e92000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:24.919708+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2072750 data_alloc: 234881024 data_used: 13926400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:25.919952+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f413a000/0x0/0x4ffc00000, data 0x2dc14f6/0x2e92000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:26.920152+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:27.920399+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.540970802s of 11.580094337s, submitted: 10
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:28.920544+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 191766528 unmapped: 57294848 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:29.920684+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192118784 unmapped: 56942592 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2133880 data_alloc: 234881024 data_used: 14221312
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3958000/0x0/0x4ffc00000, data 0x359d4f6/0x366e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:30.920857+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192118784 unmapped: 56942592 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:31.921020+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192495616 unmapped: 56565760 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:32.921138+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192495616 unmapped: 56565760 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f38ca000/0x0/0x4ffc00000, data 0x36284f6/0x36f9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3509c800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3509c800 session 0x556f3660cb40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:33.921231+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f35cd1c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f33c9f4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192503808 unmapped: 56557568 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f3662ad20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f35b57680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2c00 session 0x556f33c9ed20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f358885a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f3580d2c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f34c8eb40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:34.921397+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192954368 unmapped: 56107008 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2238722 data_alloc: 234881024 data_used: 14036992
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:35.921659+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192954368 unmapped: 56107008 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d60000/0x0/0x4ffc00000, data 0x4199568/0x426c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:36.921849+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192954368 unmapped: 56107008 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:37.922081+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192954368 unmapped: 56107008 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:38.922174+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f3595a960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192962560 unmapped: 56098816 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:39.922380+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192962560 unmapped: 56098816 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2237130 data_alloc: 234881024 data_used: 14036992
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3f000/0x0/0x4ffc00000, data 0x41ba568/0x428d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:40.922564+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192962560 unmapped: 56098816 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35812c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35812c00 session 0x556f32f485a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:41.922715+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192970752 unmapped: 56090624 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:42.922898+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192970752 unmapped: 56090624 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f3579dc20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.918482780s of 14.369065285s, submitted: 147
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f3518cf00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:43.923072+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192970752 unmapped: 56090624 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:44.923220+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192962560 unmapped: 56098816 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2239212 data_alloc: 234881024 data_used: 14045184
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3e000/0x0/0x4ffc00000, data 0x41ba578/0x428e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3e000/0x0/0x4ffc00000, data 0x41ba578/0x428e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:45.923386+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197615616 unmapped: 51445760 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3e000/0x0/0x4ffc00000, data 0x41ba578/0x428e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3e000/0x0/0x4ffc00000, data 0x41ba578/0x428e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:46.923606+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198246400 unmapped: 50814976 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:47.923877+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198246400 unmapped: 50814976 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:48.924074+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198246400 unmapped: 50814976 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:49.924238+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197574656 unmapped: 51486720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2312092 data_alloc: 234881024 data_used: 24698880
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:50.924439+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197574656 unmapped: 51486720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3b000/0x0/0x4ffc00000, data 0x41bd578/0x4291000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:51.924611+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197574656 unmapped: 51486720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:52.924834+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197574656 unmapped: 51486720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:53.925041+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197574656 unmapped: 51486720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:54.925218+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197574656 unmapped: 51486720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2312548 data_alloc: 234881024 data_used: 24711168
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.180825233s of 12.196196556s, submitted: 4
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:55.925417+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206544896 unmapped: 42516480 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a26000/0x0/0x4ffc00000, data 0x54cc578/0x55a0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:56.925569+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205701120 unmapped: 43360256 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:57.925789+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205701120 unmapped: 43360256 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:58.925935+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f196d000/0x0/0x4ffc00000, data 0x557c578/0x5650000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:59.926142+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2470156 data_alloc: 234881024 data_used: 24805376
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f196d000/0x0/0x4ffc00000, data 0x557c578/0x5650000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:00.926338+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:01.926536+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:02.926748+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1958000/0x0/0x4ffc00000, data 0x55a0578/0x5674000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:03.926929+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:04.927123+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462620 data_alloc: 234881024 data_used: 24809472
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1958000/0x0/0x4ffc00000, data 0x55a0578/0x5674000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:05.927369+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1958000/0x0/0x4ffc00000, data 0x55a0578/0x5674000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:06.927559+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:07.927873+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1958000/0x0/0x4ffc00000, data 0x55a0578/0x5674000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:08.928064+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:09.928263+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462620 data_alloc: 234881024 data_used: 24809472
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:10.928508+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1958000/0x0/0x4ffc00000, data 0x55a0578/0x5674000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:11.928717+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.743667603s of 17.184471130s, submitted: 187
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:12.928938+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:13.929185+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1958000/0x0/0x4ffc00000, data 0x55a0578/0x5674000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:14.929349+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462780 data_alloc: 234881024 data_used: 24797184
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:15.929511+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:16.929698+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:17.929924+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:18.930093+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205840384 unmapped: 43220992 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:19.930262+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205840384 unmapped: 43220992 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462780 data_alloc: 234881024 data_used: 24797184
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3bd45800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:20.930420+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205856768 unmapped: 43204608 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:21.930662+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205864960 unmapped: 43196416 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063400 session 0x556f35889c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f35969860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:22.930829+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205864960 unmapped: 43196416 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:23.931021+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205864960 unmapped: 43196416 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:24.931219+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205864960 unmapped: 43196416 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462428 data_alloc: 234881024 data_used: 24801280
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:25.931452+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205864960 unmapped: 43196416 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:26.931601+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205873152 unmapped: 43188224 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.004239082s of 15.045738220s, submitted: 18
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:27.931817+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205881344 unmapped: 43180032 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:28.931990+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205881344 unmapped: 43180032 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:29.932135+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205881344 unmapped: 43180032 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462932 data_alloc: 234881024 data_used: 24801280
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:30.932296+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205889536 unmapped: 43171840 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:31.932478+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205889536 unmapped: 43171840 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:32.932673+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205905920 unmapped: 43155456 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:33.932844+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205905920 unmapped: 43155456 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:34.933065+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205905920 unmapped: 43155456 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462932 data_alloc: 234881024 data_used: 24801280
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:35.933231+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205914112 unmapped: 43147264 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:36.933374+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205914112 unmapped: 43147264 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.050238609s of 10.060816765s, submitted: 3
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f3579cd20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f32b1f680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:37.933551+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205922304 unmapped: 43139072 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:38.933720+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f32fc5e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:39.933873+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f349a000/0x0/0x4ffc00000, data 0x364c4f6/0x371d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2161623 data_alloc: 234881024 data_used: 14028800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f349a000/0x0/0x4ffc00000, data 0x364c4f6/0x371d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:40.934057+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:41.934221+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:42.934380+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:43.934503+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:44.934706+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2161623 data_alloc: 234881024 data_used: 14028800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f34390780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3bd45800 session 0x556f35075680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:45.935575+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f349a000/0x0/0x4ffc00000, data 0x364c4f6/0x371d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:46.935692+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f361fb4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:47.935894+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:48.936099+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:49.936251+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:50.936393+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:51.936537+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:52.936710+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:53.936924+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:54.937079+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:55.937222+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:56.937499+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:57.937796+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:58.938384+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:59.938736+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:00.938907+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:01.939114+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:02.939271+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:03.939434+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:04.939654+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:05.939857+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:06.940091+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:07.940289+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:08.940478+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:09.940666+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:10.940839+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:11.941020+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:12.941190+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:13.941355+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:14.941504+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:15.941695+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:16.941849+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:17.942066+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:18.942233+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:19.942467+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:20.942761+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:21.943012+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:22.943266+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:23.943468+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:24.943599+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:25.943814+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:26.943939+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:27.944124+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:28.944305+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:29.944527+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:30.944705+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:31.944921+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:32.945281+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:33.945418+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:34.945609+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f35c363c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f32e05860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f3660d2c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194371584 unmapped: 54689792 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3bd45800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3bd45800 session 0x556f35cd01e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 57.805438995s of 58.061119080s, submitted: 87
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063400 session 0x556f35e394a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063400 session 0x556f3595ba40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f32fc41e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:35.945828+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f32fc5c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f35969680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:36.945940+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:37.946132+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:38.946271+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46de000/0x0/0x4ffc00000, data 0x240d4e3/0x24de000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:39.946437+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46de000/0x0/0x4ffc00000, data 0x240d4e3/0x24de000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1965949 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46de000/0x0/0x4ffc00000, data 0x240d4e3/0x24de000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:40.946575+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46de000/0x0/0x4ffc00000, data 0x240d4e3/0x24de000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:41.946772+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:42.946912+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3bd45800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193806336 unmapped: 55255040 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3bd45800 session 0x556f3662a5a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:43.947075+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193814528 unmapped: 55246848 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:44.947208+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1985095 data_alloc: 218103808 data_used: 4546560
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46b6000/0x0/0x4ffc00000, data 0x2434506/0x2506000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:45.947333+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:46.947513+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:47.947710+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46b6000/0x0/0x4ffc00000, data 0x2434506/0x2506000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:48.947922+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46b6000/0x0/0x4ffc00000, data 0x2434506/0x2506000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:49.948154+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1986311 data_alloc: 218103808 data_used: 4722688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:50.948350+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:51.948558+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193863680 unmapped: 55197696 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:52.948715+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193863680 unmapped: 55197696 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:53.948890+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193863680 unmapped: 55197696 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:54.949047+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.013702393s of 19.293170929s, submitted: 27
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 195780608 unmapped: 53280768 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2070877 data_alloc: 218103808 data_used: 6144000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3c9f000/0x0/0x4ffc00000, data 0x2e3d506/0x2f0f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:55.949174+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f339cf800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f339cf800 session 0x556f32f55c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f35b81c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489400 session 0x556f35988960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35175800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35175800 session 0x556f35b57c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35814400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 203431936 unmapped: 45629440 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35814400 session 0x556f35e39c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f34390000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f339cf800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f339cf800 session 0x556f32e05c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35175800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35175800 session 0x556f32fc4d20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489400 session 0x556f33c9e5a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:56.949371+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197599232 unmapped: 55140352 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:57.949606+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197599232 unmapped: 55140352 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:58.949868+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197599232 unmapped: 55140352 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2260000/0x0/0x4ffc00000, data 0x36e3516/0x37b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:59.950081+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197599232 unmapped: 55140352 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2146787 data_alloc: 218103808 data_used: 6115328
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:00.950280+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2260000/0x0/0x4ffc00000, data 0x36e3516/0x37b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197599232 unmapped: 55140352 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:01.950684+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197607424 unmapped: 55132160 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:02.950920+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35811800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35811800 session 0x556f3580c3c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197607424 unmapped: 55132160 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:03.951136+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197607424 unmapped: 55132160 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:04.951326+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f32f53680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197607424 unmapped: 55132160 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2143907 data_alloc: 218103808 data_used: 6115328
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:05.951499+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2242000/0x0/0x4ffc00000, data 0x3707516/0x37da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197607424 unmapped: 55132160 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2242000/0x0/0x4ffc00000, data 0x3707516/0x37da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:06.951683+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f339cf800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f339cf800 session 0x556f32604000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35175800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.983986855s of 12.415661812s, submitted: 164
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35175800 session 0x556f32b1f4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197779456 unmapped: 54960128 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:07.951903+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197779456 unmapped: 54960128 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:08.952087+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2219000/0x0/0x4ffc00000, data 0x372e549/0x3803000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197779456 unmapped: 54960128 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:09.952232+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2205390 data_alloc: 234881024 data_used: 14196736
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:10.952382+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:11.952612+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:12.952878+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2219000/0x0/0x4ffc00000, data 0x372e549/0x3803000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:13.953077+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:14.953245+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2216000/0x0/0x4ffc00000, data 0x3731549/0x3806000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2206022 data_alloc: 234881024 data_used: 14200832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:15.953505+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:16.953698+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:17.953935+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2216000/0x0/0x4ffc00000, data 0x3731549/0x3806000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:18.954129+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.719206810s of 11.756697655s, submitted: 8
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202473472 unmapped: 50266112 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:19.954296+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202506240 unmapped: 50233344 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2311960 data_alloc: 234881024 data_used: 15716352
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:20.954464+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1516000/0x0/0x4ffc00000, data 0x4431549/0x4506000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 203333632 unmapped: 49405952 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1516000/0x0/0x4ffc00000, data 0x4431549/0x4506000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:21.954773+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489400 session 0x556f3662a1e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f35b48780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 203341824 unmapped: 49397760 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:22.954954+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f35609860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198361088 unmapped: 54378496 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:23.955112+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198361088 unmapped: 54378496 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:24.955294+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198361088 unmapped: 54378496 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2098061 data_alloc: 218103808 data_used: 6115328
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:25.955486+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f15a6000/0x0/0x4ffc00000, data 0x2f0e506/0x2fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198361088 unmapped: 54378496 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:26.955732+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198393856 unmapped: 54345728 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:27.955957+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2a3c000/0x0/0x4ffc00000, data 0x2f0e506/0x2fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198393856 unmapped: 54345728 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:28.956157+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198393856 unmapped: 54345728 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2a3c000/0x0/0x4ffc00000, data 0x2f0e506/0x2fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:29.956327+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198393856 unmapped: 54345728 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2088449 data_alloc: 218103808 data_used: 6115328
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:30.956516+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f361fad20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.614155769s of 12.000268936s, submitted: 163
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f3594a960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198393856 unmapped: 54345728 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:31.956740+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f34c8f4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:32.956891+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:33.957056+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:34.957246+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:35.957456+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:36.957603+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:37.957840+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:38.958009+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:39.958250+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:40.958474+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:41.958683+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:42.958910+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:43.959128+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196624384 unmapped: 56115200 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:44.959324+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196624384 unmapped: 56115200 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:45.959565+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196624384 unmapped: 56115200 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:46.959727+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196632576 unmapped: 56107008 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:47.959924+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196632576 unmapped: 56107008 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:48.960104+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196632576 unmapped: 56107008 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:49.960290+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196632576 unmapped: 56107008 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:50.960526+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196632576 unmapped: 56107008 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:51.960798+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196640768 unmapped: 56098816 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:52.961045+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196640768 unmapped: 56098816 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:53.961276+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196640768 unmapped: 56098816 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:54.961490+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196640768 unmapped: 56098816 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:55.961751+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196640768 unmapped: 56098816 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:56.961985+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196640768 unmapped: 56098816 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:57.962170+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:58.962358+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:59.962561+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:00.962766+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:01.962974+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:02.963161+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:03.963336+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:04.963534+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:05.963685+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:06.963856+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:07.964098+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:08.964359+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:09.964523+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:10.964704+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:11.964935+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:12.965176+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:13.965380+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:14.965611+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:15.965915+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:16.966130+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:17.966401+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:18.966605+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:19.966856+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:20.967142+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:21.967371+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:22.967677+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:23.967912+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:24.968112+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:25.968289+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:26.968452+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:27.968704+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:28.968869+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:29.969067+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:30.969241+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:31.969464+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:32.969754+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:33.969993+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:34.970173+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196673536 unmapped: 56066048 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:35.970409+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196673536 unmapped: 56066048 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f339cf800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 65.057006836s of 65.169685364s, submitted: 38
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f339cf800 session 0x556f3580cb40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:36.970710+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198975488 unmapped: 53764096 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:37.970958+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198975488 unmapped: 53764096 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1ebb000/0x0/0x4ffc00000, data 0x28f14d3/0x29c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:38.971125+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198975488 unmapped: 53764096 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:39.971328+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198975488 unmapped: 53764096 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f34390960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:40.971552+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198983680 unmapped: 53755904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2016021 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:41.971727+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198983680 unmapped: 53755904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1ebb000/0x0/0x4ffc00000, data 0x28f14d3/0x29c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f35b57e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:42.971909+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198983680 unmapped: 53755904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f35074780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f361fa000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:43.972219+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 199319552 unmapped: 53420032 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35175800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:44.972415+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 199319552 unmapped: 53420032 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:45.972599+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 199319552 unmapped: 53420032 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2046651 data_alloc: 218103808 data_used: 6647808
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:46.972900+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:47.973117+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1e93000/0x0/0x4ffc00000, data 0x29184e3/0x29e9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:48.973388+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:49.973551+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:50.973835+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2062763 data_alloc: 218103808 data_used: 9084928
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:51.974014+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:52.974273+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1e93000/0x0/0x4ffc00000, data 0x29184e3/0x29e9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:53.974487+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200015872 unmapped: 52723712 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:54.974685+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200015872 unmapped: 52723712 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.052993774s of 19.121305466s, submitted: 15
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:55.974927+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202653696 unmapped: 50085888 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2145111 data_alloc: 218103808 data_used: 9428992
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:56.975222+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202661888 unmapped: 50077696 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e08c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e08c00 session 0x556f3518c960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f34391c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f35b49e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f3595b680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f366325a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1411000/0x0/0x4ffc00000, data 0x339851c/0x346b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d00400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32d00400 session 0x556f3518d4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f33e23e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f35e38780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f3662b4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:57.976116+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:58.976407+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:59.976569+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c5b000/0x0/0x4ffc00000, data 0x3b4e555/0x3c21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:00.976766+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2216636 data_alloc: 218103808 data_used: 9961472
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:01.976987+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f34390780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:02.977223+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c5b000/0x0/0x4ffc00000, data 0x3b4e555/0x3c21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:03.977498+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c5b000/0x0/0x4ffc00000, data 0x3b4e555/0x3c21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:04.977764+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e0bc00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e0bc00 session 0x556f3660d680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:05.977999+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2216636 data_alloc: 218103808 data_used: 9961472
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f35969860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.743741989s of 11.034352303s, submitted: 93
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f361fb4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:06.978145+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 203079680 unmapped: 57008128 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:07.978343+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 203079680 unmapped: 57008128 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:08.978509+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 203931648 unmapped: 56156160 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c33000/0x0/0x4ffc00000, data 0x3b75565/0x3c49000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:09.978724+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206069760 unmapped: 54018048 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c33000/0x0/0x4ffc00000, data 0x3b75565/0x3c49000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:10.978891+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206069760 unmapped: 54018048 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2265890 data_alloc: 234881024 data_used: 16728064
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c33000/0x0/0x4ffc00000, data 0x3b75565/0x3c49000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:11.979096+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206077952 unmapped: 54009856 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:12.979311+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206077952 unmapped: 54009856 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:13.979513+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206077952 unmapped: 54009856 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:14.979725+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206077952 unmapped: 54009856 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c33000/0x0/0x4ffc00000, data 0x3b75565/0x3c49000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:15.979980+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206086144 unmapped: 54001664 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2265890 data_alloc: 234881024 data_used: 16728064
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:16.980166+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206086144 unmapped: 54001664 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c33000/0x0/0x4ffc00000, data 0x3b75565/0x3c49000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:17.980383+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206086144 unmapped: 54001664 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.923741341s of 11.933808327s, submitted: 2
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:18.980552+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 207986688 unmapped: 52101120 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:19.980689+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f01c8000/0x0/0x4ffc00000, data 0x45df565/0x46b3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:20.980843+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2360498 data_alloc: 234881024 data_used: 17391616
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0153000/0x0/0x4ffc00000, data 0x4654565/0x4728000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:21.981006+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:22.981145+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:23.981354+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:24.981511+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:25.981781+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2357026 data_alloc: 234881024 data_used: 17395712
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0135000/0x0/0x4ffc00000, data 0x4673565/0x4747000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:26.982002+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0135000/0x0/0x4ffc00000, data 0x4673565/0x4747000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:27.982257+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:28.982475+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:29.982754+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:30.982989+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.518098831s of 12.787870407s, submitted: 84
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0133000/0x0/0x4ffc00000, data 0x4674565/0x4748000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2357930 data_alloc: 234881024 data_used: 17395712
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:31.983138+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:32.983281+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:33.983503+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:34.983737+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:35.984012+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2357930 data_alloc: 234881024 data_used: 17395712
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:36.984171+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f012a000/0x0/0x4ffc00000, data 0x467e565/0x4752000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:37.984362+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f012a000/0x0/0x4ffc00000, data 0x467e565/0x4752000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209903616 unmapped: 50184192 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0127000/0x0/0x4ffc00000, data 0x4681565/0x4755000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:38.984544+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209911808 unmapped: 50176000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35175800 session 0x556f32f530e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489400 session 0x556f35cd01e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:39.984689+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209911808 unmapped: 50176000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:40.984851+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2358122 data_alloc: 234881024 data_used: 17403904
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209911808 unmapped: 50176000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0127000/0x0/0x4ffc00000, data 0x4681565/0x4755000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:41.985018+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209911808 unmapped: 50176000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0127000/0x0/0x4ffc00000, data 0x4681565/0x4755000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:42.985221+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209911808 unmapped: 50176000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:43.985401+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209920000 unmapped: 50167808 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:44.985582+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0127000/0x0/0x4ffc00000, data 0x4681565/0x4755000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209920000 unmapped: 50167808 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:45.985764+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2358122 data_alloc: 234881024 data_used: 17403904
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209920000 unmapped: 50167808 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:46.985933+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209928192 unmapped: 50159616 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:47.986481+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209928192 unmapped: 50159616 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:48.986932+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209936384 unmapped: 50151424 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:49.987096+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0127000/0x0/0x4ffc00000, data 0x4681565/0x4755000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209936384 unmapped: 50151424 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.184453964s of 19.200132370s, submitted: 4
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:50.987590+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2358122 data_alloc: 234881024 data_used: 17403904
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209936384 unmapped: 50151424 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:51.987809+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0127000/0x0/0x4ffc00000, data 0x4681565/0x4755000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209936384 unmapped: 50151424 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 4200.1 total, 600.0 interval
                                           Cumulative writes: 22K writes, 87K keys, 22K commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.02 MB/s
                                           Cumulative WAL: 22K writes, 7266 syncs, 3.10 writes per sync, written: 0.08 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 3577 writes, 13K keys, 3577 commit groups, 1.0 writes per commit group, ingest: 15.54 MB, 0.03 MB/s
                                           Interval WAL: 3577 writes, 1387 syncs, 2.58 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:52.988167+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209936384 unmapped: 50151424 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:53.988523+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209936384 unmapped: 50151424 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:54.988826+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209944576 unmapped: 50143232 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:55.989075+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f3660de00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f35b56d20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2358050 data_alloc: 234881024 data_used: 17403904
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209944576 unmapped: 50143232 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:56.989242+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f3662a1e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f140f000/0x0/0x4ffc00000, data 0x33994e3/0x346a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:57.989470+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:58.989708+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:59.989894+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:00.990032+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2166067 data_alloc: 218103808 data_used: 9969664
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f140f000/0x0/0x4ffc00000, data 0x33994e3/0x346a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:01.990182+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f140f000/0x0/0x4ffc00000, data 0x33994e3/0x346a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:02.990337+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f140f000/0x0/0x4ffc00000, data 0x33994e3/0x346a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:03.990526+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f140f000/0x0/0x4ffc00000, data 0x33994e3/0x346a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:04.990710+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c800 session 0x556f32f4f4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40000 session 0x556f3660dc20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f140f000/0x0/0x4ffc00000, data 0x33994e3/0x346a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:05.990842+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2166067 data_alloc: 218103808 data_used: 9969664
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.943166733s of 16.025777817s, submitted: 27
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200450048 unmapped: 59637760 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f35b56780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:06.991021+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:07.991232+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:08.991419+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:09.991595+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:10.991801+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1984106 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:11.991954+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:12.992146+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:13.992302+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:14.992604+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:15.992853+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1984106 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:16.993114+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:17.993365+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:18.993566+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:19.993707+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:20.993876+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1984106 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:21.994121+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:22.994364+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:23.994516+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:24.994716+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:25.994918+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1984106 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:26.995126+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:27.995363+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:28.995537+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:29.995705+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:30.995849+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1984106 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:31.996032+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:32.996199+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:33.996404+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:34.996688+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f3518d860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f32f46d20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40000 session 0x556f35c363c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f35cd03c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 29.204212189s of 29.241395950s, submitted: 14
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c800 session 0x556f35b81e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f3518d0e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f32fe7e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40000 session 0x556f32337e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:35.996866+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f35609680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2073535 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:36.997050+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:37.997271+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:38.997496+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9f000/0x0/0x4ffc00000, data 0x2d0b545/0x2ddd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:39.997714+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:40.997902+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2073535 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:41.998078+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:42.998249+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198737920 unmapped: 61349888 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:43.998466+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f35b80b40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9f000/0x0/0x4ffc00000, data 0x2d0b545/0x2ddd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198746112 unmapped: 61341696 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:44.998739+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198754304 unmapped: 61333504 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:45.998910+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2130217 data_alloc: 218103808 data_used: 10846208
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200564736 unmapped: 59523072 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9e000/0x0/0x4ffc00000, data 0x2d0b568/0x2dde000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:46.999075+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:47.999268+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9e000/0x0/0x4ffc00000, data 0x2d0b568/0x2dde000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:48.999494+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:49.999771+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9e000/0x0/0x4ffc00000, data 0x2d0b568/0x2dde000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:50.999954+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2154233 data_alloc: 234881024 data_used: 14422016
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:52.000158+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:53.000394+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9e000/0x0/0x4ffc00000, data 0x2d0b568/0x2dde000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:54.000727+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:55.000942+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9e000/0x0/0x4ffc00000, data 0x2d0b568/0x2dde000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.785074234s of 19.939979553s, submitted: 43
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201998336 unmapped: 58089472 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:56.001122+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2276211 data_alloc: 234881024 data_used: 15269888
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209100800 unmapped: 50987008 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:57.001291+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 208961536 unmapped: 51126272 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:58.001524+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 208969728 unmapped: 51118080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c1f000/0x0/0x4ffc00000, data 0x3b8a568/0x3c5d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:59.001728+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c43400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c43400 session 0x556f3660d0e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210247680 unmapped: 58818560 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:00.001889+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb93000/0x0/0x4ffc00000, data 0x4c16568/0x4ce9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210247680 unmapped: 58818560 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:01.002065+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2409753 data_alloc: 234881024 data_used: 15323136
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210247680 unmapped: 58818560 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:02.002307+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210247680 unmapped: 58818560 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:03.002549+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb93000/0x0/0x4ffc00000, data 0x4c16568/0x4ce9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210264064 unmapped: 58802176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:04.002722+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210264064 unmapped: 58802176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:05.002917+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210272256 unmapped: 58793984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:06.003096+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2409769 data_alloc: 234881024 data_used: 15323136
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210272256 unmapped: 58793984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:07.003344+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.529235840s of 11.966050148s, submitted: 150
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3580f000 session 0x556f35b56b40
Sep 30 18:55:44 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210444288 unmapped: 58621952 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:08.003610+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210444288 unmapped: 58621952 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35102c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:09.003852+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210444288 unmapped: 58621952 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:10.004040+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e08400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 216334336 unmapped: 52731904 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:11.004194+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2527582 data_alloc: 251658240 data_used: 32378880
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:12.004353+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:13.004576+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:14.004748+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:15.005119+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:16.005322+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2527582 data_alloc: 251658240 data_used: 32378880
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:17.005482+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:18.005742+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222289920 unmapped: 46776320 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:19.005871+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222289920 unmapped: 46776320 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:20.006064+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222314496 unmapped: 46751744 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:21.006281+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.418137550s of 13.440409660s, submitted: 5
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2630538 data_alloc: 251658240 data_used: 33140736
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 224985088 unmapped: 44081152 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:22.006559+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 225026048 unmapped: 44040192 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:23.006715+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226516992 unmapped: 42549248 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:24.006892+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226516992 unmapped: 42549248 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:25.007064+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226516992 unmapped: 42549248 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:26.007258+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eeddc000/0x0/0x4ffc00000, data 0x59b458b/0x5a88000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2640042 data_alloc: 251658240 data_used: 33394688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226516992 unmapped: 42549248 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:27.007413+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226516992 unmapped: 42549248 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:28.007609+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226557952 unmapped: 42508288 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:29.007819+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedd1000/0x0/0x4ffc00000, data 0x59d658b/0x5aaa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226557952 unmapped: 42508288 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:30.008917+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226557952 unmapped: 42508288 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:31.009117+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2635202 data_alloc: 251658240 data_used: 33394688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226557952 unmapped: 42508288 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:32.009335+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226557952 unmapped: 42508288 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:33.009543+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedd1000/0x0/0x4ffc00000, data 0x59d658b/0x5aaa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedd1000/0x0/0x4ffc00000, data 0x59d658b/0x5aaa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226566144 unmapped: 42500096 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:34.009725+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226566144 unmapped: 42500096 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:35.009862+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38302400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.498695374s of 14.815695763s, submitted: 151
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226566144 unmapped: 42500096 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:36.010012+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2635334 data_alloc: 251658240 data_used: 33394688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226574336 unmapped: 42491904 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:37.010144+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f35074960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f35b57a40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226574336 unmapped: 42491904 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:38.010344+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedd2000/0x0/0x4ffc00000, data 0x59d658b/0x5aaa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226574336 unmapped: 42491904 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:39.010508+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226574336 unmapped: 42491904 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:40.010712+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226574336 unmapped: 42491904 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:41.010856+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2635642 data_alloc: 251658240 data_used: 33394688
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226590720 unmapped: 42475520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:42.011026+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226590720 unmapped: 42475520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:43.011223+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedcf000/0x0/0x4ffc00000, data 0x59d958b/0x5aad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [0,0,0,13])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226631680 unmapped: 42434560 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:44.011391+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226754560 unmapped: 42311680 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:45.011533+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226811904 unmapped: 42254336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:46.011682+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2637714 data_alloc: 251658240 data_used: 33382400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226811904 unmapped: 42254336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:47.011823+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226811904 unmapped: 42254336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:48.012021+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedcf000/0x0/0x4ffc00000, data 0x59d958b/0x5aad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226811904 unmapped: 42254336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:49.012241+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226811904 unmapped: 42254336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedcf000/0x0/0x4ffc00000, data 0x59d958b/0x5aad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:50.012464+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedcf000/0x0/0x4ffc00000, data 0x59d958b/0x5aad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e08400 session 0x556f35b57e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35102c00 session 0x556f35cba000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226820096 unmapped: 42246144 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:51.012608+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.823111534s of 15.860321999s, submitted: 287
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2302666 data_alloc: 234881024 data_used: 15433728
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219635712 unmapped: 49430528 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f35b490e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:52.012912+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219643904 unmapped: 49422336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:53.013077+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219643904 unmapped: 49422336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c1e000/0x0/0x4ffc00000, data 0x3b8b568/0x3c5e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:54.013219+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c1e000/0x0/0x4ffc00000, data 0x3b8b568/0x3c5e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219643904 unmapped: 49422336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:55.013371+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219643904 unmapped: 49422336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:56.013525+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2295426 data_alloc: 234881024 data_used: 15310848
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219643904 unmapped: 49422336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:57.013716+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219643904 unmapped: 49422336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:58.013911+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c35c00 session 0x556f33c9f0e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f38302400 session 0x556f361fbe00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219586560 unmapped: 49479680 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:59.014062+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211197952 unmapped: 57868288 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:00.014214+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f34c8eb40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4f6/0x229b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:01.014367+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:02.014531+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:03.014705+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:04.014914+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:05.015132+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:06.015345+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:07.015552+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:08.015813+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:09.016050+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:10.016265+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:11.016478+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:12.016732+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:13.016926+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:14.017208+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:15.017452+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:16.017847+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:17.018002+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:18.018219+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:19.018414+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:20.018578+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:21.018789+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:22.019031+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:23.019237+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:24.019425+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:25.019605+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:26.019860+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:27.020023+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:28.020257+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:29.020517+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:30.020712+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:31.020905+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:32.021165+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:33.021314+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:34.021523+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:35.021741+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:36.021907+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:37.022117+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:38.022374+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:39.022551+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:40.022772+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:41.022961+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:42.023202+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:43.023414+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:44.023669+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:45.023834+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:46.024006+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:47.024211+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:48.024504+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _renew_subs
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 57.031288147s of 57.246501923s, submitted: 78
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:49.024675+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211271680 unmapped: 57794560 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 145 ms_handle_reset con 0x556f32619c00 session 0x556f35b483c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 145 heartbeat osd_stat(store_statfs(0x4f25de000/0x0/0x4ffc00000, data 0x21cc3db/0x229d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:50.024862+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:51.025048+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:52.025239+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2026892 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:53.025451+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 145 heartbeat osd_stat(store_statfs(0x4f25da000/0x0/0x4ffc00000, data 0x21ce347/0x22a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:54.025686+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 145 heartbeat osd_stat(store_statfs(0x4f25da000/0x0/0x4ffc00000, data 0x21ce347/0x22a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:55.025910+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211296256 unmapped: 57769984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:56.026110+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211296256 unmapped: 57769984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: get_auth_request con 0x556f32374800 auth_method 0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f25da000/0x0/0x4ffc00000, data 0x21ce347/0x22a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:57.026277+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2029650 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:58.026466+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35102c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35102c00 session 0x556f35e39680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c35c00 session 0x556f32fc5e00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38302400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38302400 session 0x556f32fc41e0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f35608f00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.830822945s of 10.008426666s, submitted: 80
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f35ef1a40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:59.026650+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f35e392c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35102c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35102c00 session 0x556f32f55860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213008384 unmapped: 56057856 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c35c00 session 0x556f35ef0b40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38302400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38302400 session 0x556f3594b680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a6e000/0x0/0x4ffc00000, data 0x2d3816e/0x2e0e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:00.026803+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a6e000/0x0/0x4ffc00000, data 0x2d381a7/0x2e0e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:01.027032+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a6e000/0x0/0x4ffc00000, data 0x2d381a7/0x2e0e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:02.027217+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2131244 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38302400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38302400 session 0x556f35cbaf00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:03.027424+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:04.027727+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:05.027939+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f3580d2c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:06.028185+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35102c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35102c00 session 0x556f34390960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f3579cf00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:07.028348+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2134979 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213295104 unmapped: 55771136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a46000/0x0/0x4ffc00000, data 0x2d5f1b7/0x2e36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c43400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:08.028529+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a46000/0x0/0x4ffc00000, data 0x2d5f1b7/0x2e36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:09.028724+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a46000/0x0/0x4ffc00000, data 0x2d5f1b7/0x2e36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:10.028919+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:11.029078+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:12.029269+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34380800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.129595757s of 13.232490540s, submitted: 27
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f34380800 session 0x556f33e23680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2207837 data_alloc: 234881024 data_used: 13172736
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a46000/0x0/0x4ffc00000, data 0x2d5f1b7/0x2e36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:13.029422+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a46000/0x0/0x4ffc00000, data 0x2d5f1b7/0x2e36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:14.029611+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a46000/0x0/0x4ffc00000, data 0x2d5f1b7/0x2e36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:15.029858+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:16.030053+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:17.030202+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2207837 data_alloc: 234881024 data_used: 13172736
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:18.030444+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:19.030643+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 217145344 unmapped: 51920896 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:20.030800+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f03fa000/0x0/0x4ffc00000, data 0x3f8d1b7/0x4064000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 221618176 unmapped: 47448064 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:21.030963+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:22.031169+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364737 data_alloc: 234881024 data_used: 15335424
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:23.031394+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:24.031538+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f0373000/0x0/0x4ffc00000, data 0x401c1b7/0x40f3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:25.031742+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:26.031923+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.050240517s of 14.504535675s, submitted: 178
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:27.032103+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2362633 data_alloc: 234881024 data_used: 15339520
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f035a000/0x0/0x4ffc00000, data 0x403b1b7/0x4112000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:28.032311+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35102c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35102c00 session 0x556f32f48000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:29.032449+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:30.032675+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222871552 unmapped: 46194688 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f035a000/0x0/0x4ffc00000, data 0x403b1b7/0x4112000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:31.032848+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222871552 unmapped: 46194688 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:32.033003+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364107 data_alloc: 234881024 data_used: 15339520
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222871552 unmapped: 46194688 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:33.033144+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f0350000/0x0/0x4ffc00000, data 0x40451b7/0x411c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222871552 unmapped: 46194688 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:34.033346+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222871552 unmapped: 46194688 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:35.033567+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222887936 unmapped: 46178304 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f35b57c20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:36.033746+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222896128 unmapped: 46170112 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f0350000/0x0/0x4ffc00000, data 0x40451b7/0x411c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:37.033920+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364341 data_alloc: 234881024 data_used: 15339520
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222896128 unmapped: 46170112 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38302400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.793055534s of 10.837422371s, submitted: 11
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38302400 session 0x556f33ec2000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38303800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38303800 session 0x556f35c363c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:38.034124+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222912512 unmapped: 46153728 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:39.034318+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222912512 unmapped: 46153728 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:40.034498+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222912512 unmapped: 46153728 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:41.034752+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222912512 unmapped: 46153728 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:42.035006+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364141 data_alloc: 234881024 data_used: 15339520
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222912512 unmapped: 46153728 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:43.035194+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222920704 unmapped: 46145536 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:44.035391+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222928896 unmapped: 46137344 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:45.035592+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222928896 unmapped: 46137344 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:46.035850+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222928896 unmapped: 46137344 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:47.036017+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364141 data_alloc: 234881024 data_used: 15339520
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222928896 unmapped: 46137344 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:48.036223+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222937088 unmapped: 46129152 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:49.036407+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222937088 unmapped: 46129152 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:50.036560+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222937088 unmapped: 46129152 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:51.036776+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222937088 unmapped: 46129152 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.185568810s of 14.251283646s, submitted: 19
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [0,0,0,1])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:52.037008+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364477 data_alloc: 234881024 data_used: 15339520
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222945280 unmapped: 46120960 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:53.037214+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222945280 unmapped: 46120960 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f3594ba40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:54.037430+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222945280 unmapped: 46120960 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:55.037713+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222945280 unmapped: 46120960 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:56.037888+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222953472 unmapped: 46112768 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:57.038071+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364253 data_alloc: 234881024 data_used: 15339520
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222953472 unmapped: 46112768 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:58.038340+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222953472 unmapped: 46112768 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:59.038553+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222953472 unmapped: 46112768 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:00.038844+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:01.039076+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:02.039285+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364253 data_alloc: 234881024 data_used: 15339520
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:03.039388+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:04.039577+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:05.039739+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:06.039983+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:07.040198+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.336604118s of 15.361575127s, submitted: 6
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2366085 data_alloc: 234881024 data_used: 15327232
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:08.040456+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:09.040659+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:10.040862+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:11.041088+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:12.041252+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2366085 data_alloc: 234881024 data_used: 15327232
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:13.041446+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:14.041688+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222978048 unmapped: 46088192 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:15.041879+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222978048 unmapped: 46088192 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f33e22960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:16.042017+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f35e38780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 223002624 unmapped: 46063616 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:17.042209+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2363141 data_alloc: 234881024 data_used: 15327232
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 223002624 unmapped: 46063616 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:18.042455+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.912950516s of 10.993530273s, submitted: 31
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c35c00 session 0x556f34c8e5a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c43400 session 0x556f3660d680
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 223002624 unmapped: 46063616 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35102c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:19.042679+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35102c00 session 0x556f3518d4a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214646784 unmapped: 54419456 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:20.042927+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214646784 unmapped: 54419456 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f21c7000/0x0/0x4ffc00000, data 0x21d0135/0x22a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:21.043101+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214646784 unmapped: 54419456 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:22.043337+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2052178 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214646784 unmapped: 54419456 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:23.043479+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214646784 unmapped: 54419456 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:24.043734+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214646784 unmapped: 54419456 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f3595b860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:25.043885+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f35ef0960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:26.044096+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f21c9000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:27.044397+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2051182 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:28.044606+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:29.044760+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:30.044923+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:31.045091+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:32.045296+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f21c9000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2051182 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:33.045487+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:34.045712+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:35.045913+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:36.046131+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:37.046313+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2051182 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f21c9000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:38.046537+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:39.046763+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:40.046964+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f21c9000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:41.047154+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f21c9000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:42.047338+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 24.189771652s of 24.371664047s, submitted: 64
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c35c00 session 0x556f3662ab40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2138052 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:43.047516+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1684000/0x0/0x4ffc00000, data 0x2d15125/0x2de8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:44.047699+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:45.047920+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:46.048121+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1684000/0x0/0x4ffc00000, data 0x2d15125/0x2de8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:47.048314+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c43400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c43400 session 0x556f32f46000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2138052 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:48.048585+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1684000/0x0/0x4ffc00000, data 0x2d15125/0x2de8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:49.048815+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:50.049043+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38302400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38302400 session 0x556f35ef0780
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:51.049250+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1684000/0x0/0x4ffc00000, data 0x2d15125/0x2de8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1684000/0x0/0x4ffc00000, data 0x2d15125/0x2de8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f3662a000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f35b56f00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:52.049576+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c43400
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.939822197s of 10.048395157s, submitted: 25
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2140807 data_alloc: 218103808 data_used: 2723840
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:53.049741+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1683000/0x0/0x4ffc00000, data 0x2d15135/0x2de9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:54.049859+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:55.049982+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:56.050108+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38303800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38303800 session 0x556f33c9f2c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:57.050266+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2214633 data_alloc: 234881024 data_used: 13324288
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:58.050477+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1683000/0x0/0x4ffc00000, data 0x2d15135/0x2de9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:59.050668+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1683000/0x0/0x4ffc00000, data 0x2d15135/0x2de9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:00.050848+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215195648 unmapped: 53870592 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:01.051020+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215195648 unmapped: 53870592 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:02.051177+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2214633 data_alloc: 234881024 data_used: 13324288
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215195648 unmapped: 53870592 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33efb800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:03.051315+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.502856255s of 10.511325836s, submitted: 2
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215195648 unmapped: 53870592 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:04.051443+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219193344 unmapped: 49872896 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:05.051592+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f19a9000/0x0/0x4ffc00000, data 0x3a2f135/0x3b03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219193344 unmapped: 49872896 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:06.051839+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:07.052031+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2321329 data_alloc: 234881024 data_used: 14573568
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:08.052261+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:09.052454+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:10.052736+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:11.052915+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:12.053127+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2321329 data_alloc: 234881024 data_used: 14573568
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:13.053333+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:14.053500+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:15.053672+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:16.053853+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:17.054027+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2321329 data_alloc: 234881024 data_used: 14573568
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:18.054284+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:19.054476+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:20.054694+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220094464 unmapped: 48971776 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:21.054863+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220094464 unmapped: 48971776 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:22.055029+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2321329 data_alloc: 234881024 data_used: 14573568
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220094464 unmapped: 48971776 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:23.055225+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220094464 unmapped: 48971776 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:24.055411+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220094464 unmapped: 48971776 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:25.055581+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220094464 unmapped: 48971776 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:26.055728+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.446918488s of 23.703081131s, submitted: 110
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f33efb800 session 0x556f35ef1a40
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:27.055944+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2321197 data_alloc: 234881024 data_used: 14573568
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:28.056155+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:29.056352+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:30.056520+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:31.056666+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:32.056817+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2321197 data_alloc: 234881024 data_used: 14573568
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:33.056966+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220110848 unmapped: 48955392 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:34.057114+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220119040 unmapped: 48947200 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:35.057329+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220119040 unmapped: 48947200 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:36.057570+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.092041016s of 10.097191811s, submitted: 1
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218873856 unmapped: 50192384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:37.057773+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2318125 data_alloc: 234881024 data_used: 14573568
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218873856 unmapped: 50192384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:38.058039+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1991000/0x0/0x4ffc00000, data 0x3a46135/0x3b1a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218873856 unmapped: 50192384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:39.058262+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218873856 unmapped: 50192384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:40.058386+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218873856 unmapped: 50192384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:41.058600+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1991000/0x0/0x4ffc00000, data 0x3a46135/0x3b1a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218873856 unmapped: 50192384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:42.058881+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2318125 data_alloc: 234881024 data_used: 14573568
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218882048 unmapped: 50184192 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:43.059046+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218882048 unmapped: 50184192 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:44.059229+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218890240 unmapped: 50176000 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:45.059395+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1991000/0x0/0x4ffc00000, data 0x3a46135/0x3b1a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:46.059595+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218890240 unmapped: 50176000 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:47.059823+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218890240 unmapped: 50176000 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1991000/0x0/0x4ffc00000, data 0x3a46135/0x3b1a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2318125 data_alloc: 234881024 data_used: 14573568
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:48.060020+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218890240 unmapped: 50176000 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3509ec00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.590891838s of 11.599750519s, submitted: 3
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3509ec00 session 0x556f3662b860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f343914a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:49.060167+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218906624 unmapped: 50159616 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:50.060366+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218906624 unmapped: 50159616 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c35c00 session 0x556f32f4ed20
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c43400 session 0x556f342f9860
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:51.060534+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218906624 unmapped: 50159616 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33efb800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3208000/0x0/0x4ffc00000, data 0x21d0135/0x22a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f33efb800 session 0x556f3660c960
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:52.060714+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:53.060898+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:54.061117+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:55.061312+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:56.061467+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:57.061741+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:58.061966+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:59.062182+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.613150597s of 10.763535500s, submitted: 47
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f36632f00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:00.062424+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:01.062669+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:02.062841+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:03.063005+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:04.063230+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:05.063459+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:06.063702+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:07.063949+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:08.064179+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:09.064383+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:10.064533+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:11.064746+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:12.064906+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:13.065059+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:14.065289+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:15.065525+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:16.065701+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:17.065857+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:18.066045+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:19.066215+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:20.066339+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:21.066524+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:22.066696+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:23.066842+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:24.067011+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:25.067118+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:26.067324+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:27.067498+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:28.067696+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:29.067866+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:30.068065+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:31.068263+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213368832 unmapped: 55697408 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:32.068433+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213368832 unmapped: 55697408 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32d01c00 session 0x556f32f545a0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32529000 session 0x556f35b48000
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:33.068598+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213368832 unmapped: 55697408 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:34.068824+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f33d37400 session 0x556f3595b2c0
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33efb800
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213368832 unmapped: 55697408 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:35.069015+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213368832 unmapped: 55697408 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:36.069217+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213368832 unmapped: 55697408 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:37.069404+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:38.069757+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:39.070002+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:40.070174+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:41.070251+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:42.070428+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:43.070576+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:44.070773+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213385216 unmapped: 55681024 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:45.071038+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213385216 unmapped: 55681024 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:46.071227+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213385216 unmapped: 55681024 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:47.071395+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213385216 unmapped: 55681024 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:48.071613+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213385216 unmapped: 55681024 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:49.071834+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213385216 unmapped: 55681024 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:50.072025+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:51.072224+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:52.072466+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:53.072692+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:54.072923+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:55.073118+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:56.073338+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:57.073532+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:58.073749+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213401600 unmapped: 55664640 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:59.073913+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213401600 unmapped: 55664640 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:00.074081+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213401600 unmapped: 55664640 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:01.074284+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213401600 unmapped: 55664640 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:02.074506+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213401600 unmapped: 55664640 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:03.074706+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213409792 unmapped: 55656448 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:04.074884+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213409792 unmapped: 55656448 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:05.075080+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213409792 unmapped: 55656448 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:06.075308+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213417984 unmapped: 55648256 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:07.075472+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213417984 unmapped: 55648256 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:08.075722+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213417984 unmapped: 55648256 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:09.075889+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213417984 unmapped: 55648256 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:10.076062+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213417984 unmapped: 55648256 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:11.076221+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: do_command 'config diff' '{prefix=config diff}'
Sep 30 18:55:44 compute-1 ceph-osd[78006]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Sep 30 18:55:44 compute-1 ceph-osd[78006]: do_command 'config show' '{prefix=config show}'
Sep 30 18:55:44 compute-1 ceph-osd[78006]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213696512 unmapped: 55369728 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: do_command 'counter dump' '{prefix=counter dump}'
Sep 30 18:55:44 compute-1 ceph-osd[78006]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Sep 30 18:55:44 compute-1 ceph-osd[78006]: do_command 'counter schema' '{prefix=counter schema}'
Sep 30 18:55:44 compute-1 ceph-osd[78006]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:12.076367+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212942848 unmapped: 56123392 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 18:55:44 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:13.076500+0000)
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 18:55:44 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 18:55:44 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 18:55:44 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212967424 unmapped: 56098816 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 18:55:44 compute-1 ceph-osd[78006]: do_command 'log dump' '{prefix=log dump}'
Sep 30 18:55:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:55:44 compute-1 sshd-session[311539]: Invalid user pavan from 161.132.50.17 port 34310
Sep 30 18:55:44 compute-1 sshd-session[311539]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:55:44 compute-1 sshd-session[311539]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 18:55:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:44.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:44 compute-1 sudo[311630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:55:44 compute-1 sudo[311630]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:55:44 compute-1 sudo[311630]: pam_unix(sudo:session): session closed for user root
Sep 30 18:55:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr versions"} v 0)
Sep 30 18:55:44 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2364943987' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Sep 30 18:55:45 compute-1 ceph-mon[75484]: from='client.27469 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:45 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3374520111' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Sep 30 18:55:45 compute-1 ceph-mon[75484]: from='client.27477 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:45 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2364943987' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Sep 30 18:55:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:45 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon stat"} v 0)
Sep 30 18:55:45 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4222235085' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Sep 30 18:55:45 compute-1 crontab[311839]: (root) LIST (root)
Sep 30 18:55:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:46.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:46 compute-1 ceph-mon[75484]: from='client.27485 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:46 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4222235085' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Sep 30 18:55:46 compute-1 ceph-mon[75484]: from='client.27493 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:46 compute-1 ceph-mon[75484]: pgmap v2328: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:46 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "node ls"} v 0)
Sep 30 18:55:46 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2913047677' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Sep 30 18:55:46 compute-1 sshd-session[311539]: Failed password for invalid user pavan from 161.132.50.17 port 34310 ssh2
Sep 30 18:55:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:46.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:46 compute-1 nova_compute[238822]: 2025-09-30 18:55:46.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:46 compute-1 nova_compute[238822]: 2025-09-30 18:55:46.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:46 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Sep 30 18:55:46 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/682867562' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Sep 30 18:55:47 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Sep 30 18:55:47 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3173099720' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Sep 30 18:55:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:47 compute-1 ceph-mon[75484]: from='client.27499 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:55:47 compute-1 ceph-mon[75484]: from='client.27507 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:55:47 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2913047677' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Sep 30 18:55:47 compute-1 ceph-mon[75484]: from='client.27511 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:55:47 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/682867562' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Sep 30 18:55:47 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3173099720' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Sep 30 18:55:47 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Sep 30 18:55:47 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1775577588' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Sep 30 18:55:47 compute-1 sshd-session[311539]: Received disconnect from 161.132.50.17 port 34310:11: Bye Bye [preauth]
Sep 30 18:55:47 compute-1 sshd-session[311539]: Disconnected from invalid user pavan 161.132.50.17 port 34310 [preauth]
Sep 30 18:55:47 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Sep 30 18:55:47 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3224008869' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Sep 30 18:55:47 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Sep 30 18:55:47 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1305591873' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Sep 30 18:55:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:48.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Sep 30 18:55:48 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/768375213' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Sep 30 18:55:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Sep 30 18:55:48 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2191495725' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Sep 30 18:55:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:48 compute-1 ceph-mon[75484]: from='client.27519 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:55:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1775577588' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Sep 30 18:55:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3224008869' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Sep 30 18:55:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1305591873' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Sep 30 18:55:48 compute-1 ceph-mon[75484]: pgmap v2329: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/768375213' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Sep 30 18:55:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2191495725' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Sep 30 18:55:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Sep 30 18:55:48 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/50140661' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Sep 30 18:55:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:48.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Sep 30 18:55:48 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2754314125' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Sep 30 18:55:48 compute-1 sshd-session[312029]: Invalid user art from 49.49.32.245 port 50928
Sep 30 18:55:48 compute-1 sshd-session[312029]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:55:48 compute-1 sshd-session[312029]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245
Sep 30 18:55:48 compute-1 systemd[1]: Starting Hostname Service...
Sep 30 18:55:48 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Sep 30 18:55:48 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4178499829' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Sep 30 18:55:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd metadata"} v 0)
Sep 30 18:55:49 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3593172710' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Sep 30 18:55:49 compute-1 systemd[1]: Started Hostname Service.
Sep 30 18:55:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:55:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/50140661' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Sep 30 18:55:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2754314125' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Sep 30 18:55:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4178499829' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Sep 30 18:55:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3593172710' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Sep 30 18:55:49 compute-1 openstack_network_exporter[251957]: ERROR   18:55:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:55:49 compute-1 openstack_network_exporter[251957]: ERROR   18:55:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:55:49 compute-1 openstack_network_exporter[251957]: ERROR   18:55:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:55:49 compute-1 openstack_network_exporter[251957]: ERROR   18:55:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:55:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:55:49 compute-1 openstack_network_exporter[251957]: ERROR   18:55:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:55:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:55:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Sep 30 18:55:49 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2675601677' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Sep 30 18:55:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd utilization"} v 0)
Sep 30 18:55:49 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4061543543' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Sep 30 18:55:49 compute-1 sshd-session[312029]: Failed password for invalid user art from 49.49.32.245 port 50928 ssh2
Sep 30 18:55:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Sep 30 18:55:49 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2041355492' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Sep 30 18:55:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:50.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:50 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Sep 30 18:55:50 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2717751240' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Sep 30 18:55:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:50 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2675601677' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Sep 30 18:55:50 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4061543543' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Sep 30 18:55:50 compute-1 ceph-mon[75484]: pgmap v2330: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:55:50 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2041355492' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Sep 30 18:55:50 compute-1 ceph-mon[75484]: from='client.27575 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:50 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2717751240' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Sep 30 18:55:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:50.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:51 compute-1 sshd-session[312029]: Received disconnect from 49.49.32.245 port 50928:11: Bye Bye [preauth]
Sep 30 18:55:51 compute-1 sshd-session[312029]: Disconnected from invalid user art 49.49.32.245 port 50928 [preauth]
Sep 30 18:55:51 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "quorum_status"} v 0)
Sep 30 18:55:51 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/201153571' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Sep 30 18:55:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:51 compute-1 ceph-mon[75484]: from='client.27583 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:51 compute-1 ceph-mon[75484]: from='client.27587 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:55:51 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/201153571' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Sep 30 18:55:51 compute-1 nova_compute[238822]: 2025-09-30 18:55:51.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:55:51 compute-1 nova_compute[238822]: 2025-09-30 18:55:51.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:55:51 compute-1 nova_compute[238822]: 2025-09-30 18:55:51.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:55:51 compute-1 nova_compute[238822]: 2025-09-30 18:55:51.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:55:51 compute-1 nova_compute[238822]: 2025-09-30 18:55:51.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:51 compute-1 nova_compute[238822]: 2025-09-30 18:55:51.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:55:51 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "versions"} v 0)
Sep 30 18:55:51 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/572766346' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Sep 30 18:55:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:52.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:52 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Sep 30 18:55:52 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/768507186' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Sep 30 18:55:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:52 compute-1 ceph-mon[75484]: from='client.27595 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:55:52 compute-1 ceph-mon[75484]: from='client.27603 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:55:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/572766346' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Sep 30 18:55:52 compute-1 ceph-mon[75484]: pgmap v2331: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:52 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/768507186' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Sep 30 18:55:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:55:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:52.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:52 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Sep 30 18:55:52 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2866731057' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Sep 30 18:55:53 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Sep 30 18:55:53 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Sep 30 18:55:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:53 compute-1 unix_chkpwd[312858]: password check failed for user (root)
Sep 30 18:55:53 compute-1 sshd-session[312765]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201  user=root
Sep 30 18:55:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:53 compute-1 ceph-mon[75484]: from='client.27611 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:55:53 compute-1 ceph-mon[75484]: from='client.27619 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:55:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2866731057' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Sep 30 18:55:53 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Sep 30 18:55:53 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Sep 30 18:55:53 compute-1 ceph-mon[75484]: from='client.27627 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:55:53 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Sep 30 18:55:53 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Sep 30 18:55:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "config dump"} v 0)
Sep 30 18:55:53 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1041952759' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Sep 30 18:55:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.003000081s ======
Sep 30 18:55:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:54.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000081s
Sep 30 18:55:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:55:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Sep 30 18:55:54 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2106535500' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Sep 30 18:55:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:55:54.446 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:55:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:55:54.446 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:55:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:55:54.446 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:55:54 compute-1 ceph-mon[75484]: from='client.27641 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:55:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1041952759' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Sep 30 18:55:54 compute-1 ceph-mon[75484]: pgmap v2332: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:55:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2106535500' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Sep 30 18:55:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:54.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df"} v 0)
Sep 30 18:55:54 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2244064885' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Sep 30 18:55:55 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "fs dump"} v 0)
Sep 30 18:55:55 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2174272043' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Sep 30 18:55:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:55 compute-1 ceph-mon[75484]: from='client.27651 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:55 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2244064885' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Sep 30 18:55:55 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2174272043' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Sep 30 18:55:55 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "fs ls"} v 0)
Sep 30 18:55:55 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/265782300' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Sep 30 18:55:55 compute-1 sshd-session[312765]: Failed password for root from 8.243.64.201 port 36700 ssh2
Sep 30 18:55:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:56.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:56 compute-1 sshd-session[312765]: Received disconnect from 8.243.64.201 port 36700:11: Bye Bye [preauth]
Sep 30 18:55:56 compute-1 sshd-session[312765]: Disconnected from authenticating user root 8.243.64.201 port 36700 [preauth]
Sep 30 18:55:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:55:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:56.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:55:56 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/265782300' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Sep 30 18:55:56 compute-1 ceph-mon[75484]: pgmap v2333: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:56 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mds stat"} v 0)
Sep 30 18:55:56 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/828699626' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Sep 30 18:55:56 compute-1 nova_compute[238822]: 2025-09-30 18:55:56.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:55:56 compute-1 nova_compute[238822]: 2025-09-30 18:55:56.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:56 compute-1 nova_compute[238822]: 2025-09-30 18:55:56.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:55:56 compute-1 nova_compute[238822]: 2025-09-30 18:55:56.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:55:56 compute-1 nova_compute[238822]: 2025-09-30 18:55:56.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:55:56 compute-1 nova_compute[238822]: 2025-09-30 18:55:56.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:55:56 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump"} v 0)
Sep 30 18:55:56 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1557476775' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Sep 30 18:55:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:57 compute-1 ceph-mon[75484]: from='client.27671 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/828699626' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Sep 30 18:55:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1557476775' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Sep 30 18:55:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Sep 30 18:55:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3639933351' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Sep 30 18:55:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:55:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:55:58.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:55:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:55:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:55:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:55:58.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:55:58 compute-1 ceph-mon[75484]: from='client.27683 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3677158116' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:55:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3677158116' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:55:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3639933351' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Sep 30 18:55:58 compute-1 ceph-mon[75484]: pgmap v2334: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:55:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:55:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:55:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:55:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:55:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Sep 30 18:55:59 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2497781289' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Sep 30 18:55:59 compute-1 podman[313786]: 2025-09-30 18:55:59.745884844 +0000 UTC m=+0.086747829 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:55:59 compute-1 podman[313792]: 2025-09-30 18:55:59.793516352 +0000 UTC m=+0.119102717 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:55:59 compute-1 ceph-mon[75484]: from='client.27699 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:59 compute-1 ceph-mon[75484]: from='client.27703 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:55:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/249781741' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Sep 30 18:55:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2497781289' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Sep 30 18:56:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:00.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:00.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:00 compute-1 ceph-mon[75484]: pgmap v2335: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:56:00 compute-1 ceph-mon[75484]: from='client.19048 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:00 compute-1 ceph-mon[75484]: from='client.27717 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:01 compute-1 ovs-appctl[314407]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 18:56:01 compute-1 ovs-appctl[314421]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 18:56:01 compute-1 ovs-appctl[314445]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 18:56:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:01 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd stat"} v 0)
Sep 30 18:56:01 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/674035074' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Sep 30 18:56:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:01 compute-1 nova_compute[238822]: 2025-09-30 18:56:01.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:01 compute-1 nova_compute[238822]: 2025-09-30 18:56:01.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3242465448' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Sep 30 18:56:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/674035074' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Sep 30 18:56:01 compute-1 ceph-mon[75484]: from='client.27727 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:01 compute-1 ceph-mon[75484]: pgmap v2336: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:02.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:02.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:02 compute-1 podman[314911]: 2025-09-30 18:56:02.554361304 +0000 UTC m=+0.099375588 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Sep 30 18:56:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "status"} v 0)
Sep 30 18:56:02 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3044538243' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Sep 30 18:56:02 compute-1 ceph-mon[75484]: from='client.19056 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3044538243' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Sep 30 18:56:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Sep 30 18:56:03 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2190048606' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Sep 30 18:56:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2190048606' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Sep 30 18:56:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/399368709' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Sep 30 18:56:03 compute-1 ceph-mon[75484]: from='client.19062 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:03 compute-1 ceph-mon[75484]: pgmap v2337: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:56:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:04.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:56:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Sep 30 18:56:04 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4021390343' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Sep 30 18:56:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:04.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:04 compute-1 sudo[315533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:56:04 compute-1 sudo[315533]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:56:04 compute-1 sudo[315533]: pam_unix(sudo:session): session closed for user root
Sep 30 18:56:04 compute-1 sshd-session[315094]: Invalid user oracle from 192.210.160.141 port 43254
Sep 30 18:56:04 compute-1 sshd-session[315094]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:56:04 compute-1 sshd-session[315094]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:56:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Sep 30 18:56:04 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2632073261' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Sep 30 18:56:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4021390343' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Sep 30 18:56:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2632073261' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Sep 30 18:56:05 compute-1 nova_compute[238822]: 2025-09-30 18:56:05.097 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:56:05 compute-1 nova_compute[238822]: 2025-09-30 18:56:05.098 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:56:05 compute-1 nova_compute[238822]: 2025-09-30 18:56:05.098 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:56:05 compute-1 nova_compute[238822]: 2025-09-30 18:56:05.098 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:56:05 compute-1 nova_compute[238822]: 2025-09-30 18:56:05.099 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:56:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:05 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Sep 30 18:56:05 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2157336329' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Sep 30 18:56:05 compute-1 nova_compute[238822]: 2025-09-30 18:56:05.619 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:56:05 compute-1 nova_compute[238822]: 2025-09-30 18:56:05.619 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:56:05 compute-1 nova_compute[238822]: 2025-09-30 18:56:05.619 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:56:05 compute-1 nova_compute[238822]: 2025-09-30 18:56:05.619 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:56:05 compute-1 nova_compute[238822]: 2025-09-30 18:56:05.620 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:56:05 compute-1 podman[249638]: time="2025-09-30T18:56:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:56:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:56:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:56:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:56:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8369 "" "Go-http-client/1.1"
Sep 30 18:56:05 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Sep 30 18:56:05 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2821228972' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Sep 30 18:56:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2157336329' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Sep 30 18:56:05 compute-1 ceph-mon[75484]: pgmap v2338: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2821228972' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Sep 30 18:56:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:06.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:06 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:56:06 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2724288933' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:56:06 compute-1 nova_compute[238822]: 2025-09-30 18:56:06.129 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:56:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:06 compute-1 nova_compute[238822]: 2025-09-30 18:56:06.302 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:56:06 compute-1 nova_compute[238822]: 2025-09-30 18:56:06.303 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:56:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:06 compute-1 nova_compute[238822]: 2025-09-30 18:56:06.340 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:56:06 compute-1 nova_compute[238822]: 2025-09-30 18:56:06.341 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4359MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:56:06 compute-1 nova_compute[238822]: 2025-09-30 18:56:06.341 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:56:06 compute-1 nova_compute[238822]: 2025-09-30 18:56:06.342 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:56:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:06.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:06 compute-1 nova_compute[238822]: 2025-09-30 18:56:06.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:56:06 compute-1 nova_compute[238822]: 2025-09-30 18:56:06.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:06 compute-1 nova_compute[238822]: 2025-09-30 18:56:06.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:56:06 compute-1 nova_compute[238822]: 2025-09-30 18:56:06.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:56:06 compute-1 nova_compute[238822]: 2025-09-30 18:56:06.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:56:06 compute-1 nova_compute[238822]: 2025-09-30 18:56:06.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:06 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Sep 30 18:56:06 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1638146919' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Sep 30 18:56:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2724288933' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:56:07 compute-1 ceph-mon[75484]: from='client.27767 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1638146919' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Sep 30 18:56:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Sep 30 18:56:07 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2011182209' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Sep 30 18:56:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:07 compute-1 podman[315829]: 2025-09-30 18:56:07.576541588 +0000 UTC m=+0.102944694 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 18:56:07 compute-1 sshd-session[315094]: Failed password for invalid user oracle from 192.210.160.141 port 43254 ssh2
Sep 30 18:56:07 compute-1 podman[315832]: 2025-09-30 18:56:07.585042676 +0000 UTC m=+0.101254938 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 18:56:07 compute-1 podman[315831]: 2025-09-30 18:56:07.593253327 +0000 UTC m=+0.109415778 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.tags=minimal rhel9, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Sep 30 18:56:07 compute-1 nova_compute[238822]: 2025-09-30 18:56:07.609 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:56:07 compute-1 nova_compute[238822]: 2025-09-30 18:56:07.609 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:56:06 up  4:33,  0 user,  load average: 1.91, 0.79, 0.58\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:56:07 compute-1 nova_compute[238822]: 2025-09-30 18:56:07.721 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing inventories for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 18:56:07 compute-1 nova_compute[238822]: 2025-09-30 18:56:07.807 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating ProviderTree inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 18:56:07 compute-1 nova_compute[238822]: 2025-09-30 18:56:07.808 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 18:56:07 compute-1 nova_compute[238822]: 2025-09-30 18:56:07.828 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing aggregate associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 18:56:07 compute-1 nova_compute[238822]: 2025-09-30 18:56:07.851 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing trait associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE41,HW_CPU_X86_ABM,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_TIS,HW_ARCH_X86_64
,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 18:56:07 compute-1 nova_compute[238822]: 2025-09-30 18:56:07.884 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:56:07 compute-1 sshd-session[315094]: Connection closed by invalid user oracle 192.210.160.141 port 43254 [preauth]
Sep 30 18:56:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:56:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:08.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:56:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2011182209' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Sep 30 18:56:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:56:08 compute-1 ceph-mon[75484]: from='client.19076 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:08 compute-1 ceph-mon[75484]: pgmap v2339: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Sep 30 18:56:08 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/227151964' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Sep 30 18:56:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:56:08 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4084825127' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:56:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:08 compute-1 nova_compute[238822]: 2025-09-30 18:56:08.330 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:56:08 compute-1 nova_compute[238822]: 2025-09-30 18:56:08.337 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:56:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:08.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:08 compute-1 nova_compute[238822]: 2025-09-30 18:56:08.858 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:56:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/227151964' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Sep 30 18:56:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4084825127' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:56:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3085274472' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:56:09 compute-1 ceph-mon[75484]: from='client.27789 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:56:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Sep 30 18:56:09 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4202097990' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Sep 30 18:56:09 compute-1 nova_compute[238822]: 2025-09-30 18:56:09.487 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:56:09 compute-1 nova_compute[238822]: 2025-09-30 18:56:09.487 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.146s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:56:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Sep 30 18:56:09 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3940667694' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Sep 30 18:56:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:10.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:10 compute-1 ceph-mon[75484]: from='client.27793 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:10 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4202097990' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Sep 30 18:56:10 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3940667694' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Sep 30 18:56:10 compute-1 ceph-mon[75484]: pgmap v2340: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:56:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:10.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Sep 30 18:56:10 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/794127967' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Sep 30 18:56:11 compute-1 ceph-mon[75484]: from='client.27805 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1600127837' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:56:11 compute-1 ceph-mon[75484]: from='client.27809 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/794127967' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Sep 30 18:56:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:11 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Sep 30 18:56:11 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3392779860' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Sep 30 18:56:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:11 compute-1 nova_compute[238822]: 2025-09-30 18:56:11.442 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:56:11 compute-1 nova_compute[238822]: 2025-09-30 18:56:11.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:11 compute-1 nova_compute[238822]: 2025-09-30 18:56:11.971 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:56:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:12.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:12 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3392779860' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Sep 30 18:56:12 compute-1 ceph-mon[75484]: from='client.27821 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:12 compute-1 ceph-mon[75484]: pgmap v2341: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:12.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:12 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Sep 30 18:56:12 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1487946791' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Sep 30 18:56:13 compute-1 ceph-mon[75484]: from='client.27825 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:13 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1487946791' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Sep 30 18:56:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Sep 30 18:56:13 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/514176730' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Sep 30 18:56:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:13 compute-1 virtqemud[239124]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 18:56:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:14.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:14 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/514176730' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Sep 30 18:56:14 compute-1 ceph-mon[75484]: pgmap v2342: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:56:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:56:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:14.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:15 compute-1 ceph-mon[75484]: pgmap v2343: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:16.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:16 compute-1 systemd[1]: Starting Time & Date Service...
Sep 30 18:56:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:16 compute-1 systemd[1]: Started Time & Date Service.
Sep 30 18:56:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:56:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:16.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:56:16 compute-1 nova_compute[238822]: 2025-09-30 18:56:16.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:16 compute-1 nova_compute[238822]: 2025-09-30 18:56:16.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:16 compute-1 nova_compute[238822]: 2025-09-30 18:56:16.581 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:56:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:17 compute-1 ceph-mon[75484]: pgmap v2344: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:18 compute-1 nova_compute[238822]: 2025-09-30 18:56:18.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:56:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:18.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:18.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:56:19 compute-1 openstack_network_exporter[251957]: ERROR   18:56:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:56:19 compute-1 openstack_network_exporter[251957]: ERROR   18:56:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:56:19 compute-1 openstack_network_exporter[251957]: ERROR   18:56:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:56:19 compute-1 openstack_network_exporter[251957]: ERROR   18:56:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:56:19 compute-1 openstack_network_exporter[251957]: ERROR   18:56:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:56:19 compute-1 ceph-mon[75484]: pgmap v2345: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:56:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:20.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:56:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:20.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:56:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:21 compute-1 nova_compute[238822]: 2025-09-30 18:56:21.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:56:21 compute-1 nova_compute[238822]: 2025-09-30 18:56:21.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:56:21 compute-1 nova_compute[238822]: 2025-09-30 18:56:21.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:56:21 compute-1 nova_compute[238822]: 2025-09-30 18:56:21.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:56:21 compute-1 nova_compute[238822]: 2025-09-30 18:56:21.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:21 compute-1 nova_compute[238822]: 2025-09-30 18:56:21.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:56:21 compute-1 nova_compute[238822]: 2025-09-30 18:56:21.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:21 compute-1 ceph-mon[75484]: pgmap v2346: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:22.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:22.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:56:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:23 compute-1 ceph-mon[75484]: pgmap v2347: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:56:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:24.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:56:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:24.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:24 compute-1 sudo[316737]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:56:24 compute-1 sudo[316737]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:56:24 compute-1 sudo[316737]: pam_unix(sudo:session): session closed for user root
Sep 30 18:56:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:25 compute-1 ceph-mon[75484]: pgmap v2348: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:26.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:26.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:26 compute-1 nova_compute[238822]: 2025-09-30 18:56:26.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:26 compute-1 nova_compute[238822]: 2025-09-30 18:56:26.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:26 compute-1 sudo[316764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:56:26 compute-1 sudo[316764]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:56:26 compute-1 sudo[316764]: pam_unix(sudo:session): session closed for user root
Sep 30 18:56:27 compute-1 sudo[316790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:56:27 compute-1 sudo[316790]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:56:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:27 compute-1 sudo[316790]: pam_unix(sudo:session): session closed for user root
Sep 30 18:56:27 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:56:27 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:56:27 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:56:27 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:56:27 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:56:27 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:56:27 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:56:28 compute-1 nova_compute[238822]: 2025-09-30 18:56:28.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:56:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:28.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:28.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:28 compute-1 ceph-mon[75484]: pgmap v2349: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 516 B/s rd, 0 op/s
Sep 30 18:56:28 compute-1 ceph-mon[75484]: pgmap v2350: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 621 B/s rd, 0 op/s
Sep 30 18:56:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:56:29 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3409039046' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Sep 30 18:56:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:30.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:30.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:30 compute-1 podman[316852]: 2025-09-30 18:56:30.549185564 +0000 UTC m=+0.084161470 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:56:30 compute-1 podman[316851]: 2025-09-30 18:56:30.597125761 +0000 UTC m=+0.131868910 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller)
Sep 30 18:56:30 compute-1 unix_chkpwd[316898]: password check failed for user (root)
Sep 30 18:56:30 compute-1 sshd-session[316848]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:56:30 compute-1 ceph-mon[75484]: from='client.19090 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:30 compute-1 ceph-mon[75484]: from='client.19098 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:30 compute-1 ceph-mon[75484]: pgmap v2351: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 621 B/s rd, 0 op/s
Sep 30 18:56:30 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2863328886' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:56:30 compute-1 ceph-mon[75484]: from='client.19106 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:30 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3506331521' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Sep 30 18:56:30 compute-1 ceph-mon[75484]: from='client.19114 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:30 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2756469354' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Sep 30 18:56:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:31 compute-1 nova_compute[238822]: 2025-09-30 18:56:31.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:31 compute-1 nova_compute[238822]: 2025-09-30 18:56:31.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:31 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2133251732' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Sep 30 18:56:31 compute-1 ceph-mon[75484]: from='client.19126 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:31 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1442968888' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Sep 30 18:56:31 compute-1 ceph-mon[75484]: from='client.19134 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:31 compute-1 ceph-mon[75484]: pgmap v2352: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 621 B/s rd, 0 op/s
Sep 30 18:56:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:32.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:32.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:32 compute-1 sudo[316901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:56:32 compute-1 sudo[316901]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:56:32 compute-1 sudo[316901]: pam_unix(sudo:session): session closed for user root
Sep 30 18:56:32 compute-1 podman[316925]: 2025-09-30 18:56:32.771517621 +0000 UTC m=+0.064708968 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Sep 30 18:56:32 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Sep 30 18:56:32 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/321118810' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Sep 30 18:56:32 compute-1 sshd-session[316848]: Failed password for root from 192.210.160.141 port 43572 ssh2
Sep 30 18:56:32 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3709307474' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Sep 30 18:56:32 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2790081530' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Sep 30 18:56:32 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:56:32 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:56:32 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/448556682' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Sep 30 18:56:32 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/321118810' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Sep 30 18:56:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/308662736' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Sep 30 18:56:33 compute-1 ceph-mon[75484]: from='client.19156 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1596230471' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Sep 30 18:56:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/30544332' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Sep 30 18:56:33 compute-1 ceph-mon[75484]: pgmap v2353: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 621 B/s rd, 0 op/s
Sep 30 18:56:33 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4002619199' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Sep 30 18:56:34 compute-1 sshd-session[316848]: Connection closed by authenticating user root 192.210.160.141 port 43572 [preauth]
Sep 30 18:56:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:34.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:56:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:34.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:35 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1357592871' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Sep 30 18:56:35 compute-1 ceph-mon[75484]: from='client.19176 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:35 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3931900365' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Sep 30 18:56:35 compute-1 ceph-mon[75484]: from='client.19184 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:35 compute-1 podman[249638]: time="2025-09-30T18:56:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:56:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:56:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:56:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:56:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8370 "" "Go-http-client/1.1"
Sep 30 18:56:36 compute-1 ceph-mon[75484]: from='client.19192 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3094080132' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Sep 30 18:56:36 compute-1 ceph-mon[75484]: from='client.19200 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3483878862' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Sep 30 18:56:36 compute-1 ceph-mon[75484]: pgmap v2354: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 621 B/s rd, 0 op/s
Sep 30 18:56:36 compute-1 ceph-mon[75484]: from='client.19204 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3075206590' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Sep 30 18:56:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:36.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:36.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:36 compute-1 nova_compute[238822]: 2025-09-30 18:56:36.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:36 compute-1 nova_compute[238822]: 2025-09-30 18:56:36.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:37 compute-1 ceph-mon[75484]: from='client.19212 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2855355197' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Sep 30 18:56:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2696771638' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:56:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2696771638' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:56:37 compute-1 ceph-mon[75484]: from='client.19220 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:37 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2970473258' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Sep 30 18:56:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:38 compute-1 ceph-mon[75484]: from='client.19236 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:56:38 compute-1 ceph-mon[75484]: from='client.19244 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:38 compute-1 ceph-mon[75484]: pgmap v2355: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 614 B/s rd, 0 op/s
Sep 30 18:56:38 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3860624165' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Sep 30 18:56:38 compute-1 ceph-mon[75484]: from='client.19252 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:56:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:38.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:56:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:56:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:38.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:56:38 compute-1 podman[316951]: 2025-09-30 18:56:38.559649175 +0000 UTC m=+0.097398905 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 18:56:38 compute-1 podman[316952]: 2025-09-30 18:56:38.567793304 +0000 UTC m=+0.098389342 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350)
Sep 30 18:56:38 compute-1 podman[316953]: 2025-09-30 18:56:38.574987137 +0000 UTC m=+0.106179141 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2)
Sep 30 18:56:39 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3707973157' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Sep 30 18:56:39 compute-1 ceph-mon[75484]: from='client.19260 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:39 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2357832739' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Sep 30 18:56:39 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1477497579' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Sep 30 18:56:39 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2775808468' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Sep 30 18:56:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:56:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:40.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3112684509' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Sep 30 18:56:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3280220950' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Sep 30 18:56:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3556323966' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Sep 30 18:56:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/519856706' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Sep 30 18:56:40 compute-1 ceph-mon[75484]: pgmap v2356: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:56:40 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/233717800' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Sep 30 18:56:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:40.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1452845119' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Sep 30 18:56:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/74545643' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Sep 30 18:56:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/710692951' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Sep 30 18:56:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3063329717' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Sep 30 18:56:41 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3305935773' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Sep 30 18:56:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:41 compute-1 nova_compute[238822]: 2025-09-30 18:56:41.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:56:41 compute-1 nova_compute[238822]: 2025-09-30 18:56:41.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:56:41 compute-1 nova_compute[238822]: 2025-09-30 18:56:41.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:56:41 compute-1 nova_compute[238822]: 2025-09-30 18:56:41.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:56:41 compute-1 nova_compute[238822]: 2025-09-30 18:56:41.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:41 compute-1 nova_compute[238822]: 2025-09-30 18:56:41.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:56:41 compute-1 nova_compute[238822]: 2025-09-30 18:56:41.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:42.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:42 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1044403615' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Sep 30 18:56:42 compute-1 ceph-mon[75484]: from='client.19320 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:42 compute-1 ceph-mon[75484]: from='client.19324 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:42 compute-1 ceph-mon[75484]: pgmap v2357: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:42 compute-1 ceph-mon[75484]: from='client.19328 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:42.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:43 compute-1 ceph-mon[75484]: from='client.19332 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:43 compute-1 ceph-mon[75484]: from='client.19340 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1252293978' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Sep 30 18:56:43 compute-1 ceph-mon[75484]: from='client.19348 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:43 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2223569940' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Sep 30 18:56:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:44.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:44 compute-1 ceph-mon[75484]: from='client.19356 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:44 compute-1 ceph-mon[75484]: from='client.19360 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1698645699' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Sep 30 18:56:44 compute-1 ceph-mon[75484]: pgmap v2358: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:56:44 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2598323234' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Sep 30 18:56:44 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Sep 30 18:56:44 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Sep 30 18:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:56:44 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Sep 30 18:56:44 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Sep 30 18:56:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:44.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:45 compute-1 sudo[317020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:56:45 compute-1 sudo[317020]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:56:45 compute-1 sudo[317020]: pam_unix(sudo:session): session closed for user root
Sep 30 18:56:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:45 compute-1 ceph-mon[75484]: from='client.19368 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:45 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Sep 30 18:56:45 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Sep 30 18:56:45 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4267779373' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Sep 30 18:56:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:46.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:46 compute-1 ceph-mon[75484]: from='client.19392 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:46 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1425410265' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Sep 30 18:56:46 compute-1 ceph-mon[75484]: pgmap v2359: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:46 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2660580480' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Sep 30 18:56:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:46 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 18:56:46 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 18:56:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:46.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:46 compute-1 nova_compute[238822]: 2025-09-30 18:56:46.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:56:46 compute-1 nova_compute[238822]: 2025-09-30 18:56:46.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:56:46 compute-1 nova_compute[238822]: 2025-09-30 18:56:46.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:56:46 compute-1 nova_compute[238822]: 2025-09-30 18:56:46.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:56:46 compute-1 nova_compute[238822]: 2025-09-30 18:56:46.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:46 compute-1 nova_compute[238822]: 2025-09-30 18:56:46.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:56:46 compute-1 nova_compute[238822]: 2025-09-30 18:56:46.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:47 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3345172031' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Sep 30 18:56:47 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1197372166' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Sep 30 18:56:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:48.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:56:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:48.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:56:48 compute-1 ceph-mon[75484]: from='client.19412 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:48 compute-1 ceph-mon[75484]: pgmap v2360: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1908694273' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Sep 30 18:56:48 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1935332311' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Sep 30 18:56:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:56:49 compute-1 openstack_network_exporter[251957]: ERROR   18:56:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:56:49 compute-1 openstack_network_exporter[251957]: ERROR   18:56:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:56:49 compute-1 openstack_network_exporter[251957]: ERROR   18:56:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:56:49 compute-1 openstack_network_exporter[251957]: ERROR   18:56:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:56:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:56:49 compute-1 openstack_network_exporter[251957]: ERROR   18:56:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:56:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:56:49 compute-1 ceph-mon[75484]: from='client.19424 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:49 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2459810991' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Sep 30 18:56:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:50.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:50.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:50 compute-1 ceph-mon[75484]: from='client.19432 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:50 compute-1 ceph-mon[75484]: pgmap v2361: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:56:50 compute-1 ceph-mon[75484]: from='client.19436 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:50 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1595750658' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Sep 30 18:56:50 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3978494488' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Sep 30 18:56:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:51 compute-1 sshd-session[317054]: Invalid user xiaoli from 161.132.50.17 port 32814
Sep 30 18:56:51 compute-1 sshd-session[317054]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:56:51 compute-1 sshd-session[317054]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 18:56:51 compute-1 nova_compute[238822]: 2025-09-30 18:56:51.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:51 compute-1 nova_compute[238822]: 2025-09-30 18:56:51.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:51 compute-1 ceph-mon[75484]: from='client.19448 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:51 compute-1 ceph-mon[75484]: from='client.19452 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:51 compute-1 ceph-mon[75484]: pgmap v2362: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:51 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1274110017' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Sep 30 18:56:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:52.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:52.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:53 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2651078752' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Sep 30 18:56:53 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:56:53 compute-1 ceph-mon[75484]: from='client.27865 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:53 compute-1 unix_chkpwd[317061]: password check failed for user (root)
Sep 30 18:56:53 compute-1 sshd-session[317057]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245  user=root
Sep 30 18:56:53 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "status"} v 0)
Sep 30 18:56:53 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/575942045' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Sep 30 18:56:53 compute-1 sshd-session[317054]: Failed password for invalid user xiaoli from 161.132.50.17 port 32814 ssh2
Sep 30 18:56:53 compute-1 sshd-session[317054]: Received disconnect from 161.132.50.17 port 32814:11: Bye Bye [preauth]
Sep 30 18:56:53 compute-1 sshd-session[317054]: Disconnected from invalid user xiaoli 161.132.50.17 port 32814 [preauth]
Sep 30 18:56:54 compute-1 ceph-mon[75484]: from='client.19464 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 18:56:54 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/575942045' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Sep 30 18:56:54 compute-1 ceph-mon[75484]: pgmap v2363: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:56:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:54.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:56:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:56:54.447 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:56:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:56:54.447 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:56:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:56:54.448 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:56:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:54.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:55 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/21004290' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Sep 30 18:56:55 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2954428967' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Sep 30 18:56:55 compute-1 ceph-mon[75484]: from='client.19480 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:55 compute-1 sshd-session[317057]: Failed password for root from 49.49.32.245 port 46130 ssh2
Sep 30 18:56:56 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/855589697' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Sep 30 18:56:56 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3105430286' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Sep 30 18:56:56 compute-1 ceph-mon[75484]: pgmap v2364: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:56 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2686002295' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Sep 30 18:56:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:56.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:56 compute-1 sshd-session[317057]: Received disconnect from 49.49.32.245 port 46130:11: Bye Bye [preauth]
Sep 30 18:56:56 compute-1 sshd-session[317057]: Disconnected from authenticating user root 49.49.32.245 port 46130 [preauth]
Sep 30 18:56:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:56:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:56.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:56:56 compute-1 nova_compute[238822]: 2025-09-30 18:56:56.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:56:56 compute-1 nova_compute[238822]: 2025-09-30 18:56:56.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:56:56 compute-1 nova_compute[238822]: 2025-09-30 18:56:56.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:56:56 compute-1 nova_compute[238822]: 2025-09-30 18:56:56.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:56:56 compute-1 nova_compute[238822]: 2025-09-30 18:56:56.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:56 compute-1 nova_compute[238822]: 2025-09-30 18:56:56.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:56:56 compute-1 nova_compute[238822]: 2025-09-30 18:56:56.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:56:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3693464253' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Sep 30 18:56:57 compute-1 ceph-mon[75484]: from='client.19500 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:57 compute-1 unix_chkpwd[317069]: password check failed for user (root)
Sep 30 18:56:57 compute-1 sshd-session[317065]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:56:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Sep 30 18:56:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1576960322' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Sep 30 18:56:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4208053099' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Sep 30 18:56:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2810434453' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:56:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2810434453' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:56:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1576960322' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Sep 30 18:56:58 compute-1 ceph-mon[75484]: pgmap v2365: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:56:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:56:58.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:56:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:56:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:56:58.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:56:58 compute-1 unix_chkpwd[317073]: password check failed for user (root)
Sep 30 18:56:58 compute-1 sshd-session[317070]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201  user=root
Sep 30 18:56:59 compute-1 ceph-mon[75484]: from='client.19520 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2905651480' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Sep 30 18:56:59 compute-1 ceph-mon[75484]: from='client.19528 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:56:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:56:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:56:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:56:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:56:59 compute-1 sshd-session[317065]: Failed password for root from 192.210.160.141 port 51640 ssh2
Sep 30 18:57:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:00.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:00 compute-1 ceph-mon[75484]: from='client.19532 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:57:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1113064749' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Sep 30 18:57:00 compute-1 ceph-mon[75484]: pgmap v2366: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:57:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3341084094' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Sep 30 18:57:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:00.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:00 compute-1 sudo[309658]: pam_unix(sudo:session): session closed for user root
Sep 30 18:57:00 compute-1 sshd-session[309656]: Received disconnect from 192.168.122.10 port 54348:11: disconnected by user
Sep 30 18:57:00 compute-1 sshd-session[309656]: Disconnected from user zuul 192.168.122.10 port 54348
Sep 30 18:57:00 compute-1 sshd-session[309651]: pam_unix(sshd:session): session closed for user zuul
Sep 30 18:57:00 compute-1 systemd[1]: session-61.scope: Deactivated successfully.
Sep 30 18:57:00 compute-1 systemd[1]: session-61.scope: Consumed 2min 57.386s CPU time, 753.9M memory peak, read 205.4M from disk, written 95.0M to disk.
Sep 30 18:57:00 compute-1 systemd-logind[789]: Session 61 logged out. Waiting for processes to exit.
Sep 30 18:57:00 compute-1 systemd-logind[789]: Removed session 61.
Sep 30 18:57:00 compute-1 sshd-session[317065]: Connection closed by authenticating user root 192.210.160.141 port 51640 [preauth]
Sep 30 18:57:00 compute-1 sshd-session[317070]: Failed password for root from 8.243.64.201 port 59352 ssh2
Sep 30 18:57:00 compute-1 podman[317077]: 2025-09-30 18:57:00.836070262 +0000 UTC m=+0.104769743 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:57:00 compute-1 sshd-session[317078]: Accepted publickey for zuul from 192.168.122.10 port 45334 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 18:57:00 compute-1 systemd-logind[789]: New session 62 of user zuul.
Sep 30 18:57:00 compute-1 podman[317076]: 2025-09-30 18:57:00.904661983 +0000 UTC m=+0.174444523 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 18:57:00 compute-1 systemd[1]: Started Session 62 of User zuul.
Sep 30 18:57:00 compute-1 sshd-session[317078]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 18:57:01 compute-1 sudo[317130]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-1-2025-09-30-ogtumon.tar.xz
Sep 30 18:57:01 compute-1 sudo[317130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:57:01 compute-1 sudo[317130]: pam_unix(sudo:session): session closed for user root
Sep 30 18:57:01 compute-1 sshd-session[317129]: Received disconnect from 192.168.122.10 port 45334:11: disconnected by user
Sep 30 18:57:01 compute-1 sshd-session[317129]: Disconnected from user zuul 192.168.122.10 port 45334
Sep 30 18:57:01 compute-1 sshd-session[317078]: pam_unix(sshd:session): session closed for user zuul
Sep 30 18:57:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:01 compute-1 systemd[1]: session-62.scope: Deactivated successfully.
Sep 30 18:57:01 compute-1 systemd-logind[789]: Session 62 logged out. Waiting for processes to exit.
Sep 30 18:57:01 compute-1 systemd-logind[789]: Removed session 62.
Sep 30 18:57:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:01 compute-1 sshd-session[317156]: Accepted publickey for zuul from 192.168.122.10 port 45342 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 18:57:01 compute-1 systemd-logind[789]: New session 63 of user zuul.
Sep 30 18:57:01 compute-1 systemd[1]: Started Session 63 of User zuul.
Sep 30 18:57:01 compute-1 ceph-mon[75484]: from='client.19544 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:57:01 compute-1 ceph-mon[75484]: from='client.19548 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:57:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2381867134' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Sep 30 18:57:01 compute-1 sshd-session[317156]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 18:57:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/525131232' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Sep 30 18:57:01 compute-1 sshd-session[317070]: Received disconnect from 8.243.64.201 port 59352:11: Bye Bye [preauth]
Sep 30 18:57:01 compute-1 sshd-session[317070]: Disconnected from authenticating user root 8.243.64.201 port 59352 [preauth]
Sep 30 18:57:01 compute-1 sudo[317160]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Sep 30 18:57:01 compute-1 sudo[317160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 18:57:01 compute-1 sudo[317160]: pam_unix(sudo:session): session closed for user root
Sep 30 18:57:01 compute-1 sshd-session[317159]: Received disconnect from 192.168.122.10 port 45342:11: disconnected by user
Sep 30 18:57:01 compute-1 sshd-session[317159]: Disconnected from user zuul 192.168.122.10 port 45342
Sep 30 18:57:01 compute-1 sshd-session[317156]: pam_unix(sshd:session): session closed for user zuul
Sep 30 18:57:01 compute-1 systemd[1]: session-63.scope: Deactivated successfully.
Sep 30 18:57:01 compute-1 systemd-logind[789]: Session 63 logged out. Waiting for processes to exit.
Sep 30 18:57:01 compute-1 systemd-logind[789]: Removed session 63.
Sep 30 18:57:01 compute-1 nova_compute[238822]: 2025-09-30 18:57:01.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:01 compute-1 nova_compute[238822]: 2025-09-30 18:57:01.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:01 compute-1 nova_compute[238822]: 2025-09-30 18:57:01.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5041 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:57:01 compute-1 nova_compute[238822]: 2025-09-30 18:57:01.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:01 compute-1 nova_compute[238822]: 2025-09-30 18:57:01.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:01 compute-1 nova_compute[238822]: 2025-09-30 18:57:01.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:02.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:02 compute-1 ceph-mon[75484]: from='client.19560 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:57:02 compute-1 ceph-mon[75484]: pgmap v2367: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:02.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:03 compute-1 ceph-mon[75484]: from='client.19564 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 18:57:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/718661898' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Sep 30 18:57:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/450687706' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Sep 30 18:57:03 compute-1 podman[317187]: 2025-09-30 18:57:03.548081844 +0000 UTC m=+0.081642053 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Sep 30 18:57:04 compute-1 nova_compute[238822]: 2025-09-30 18:57:04.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:57:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:04.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:57:04 compute-1 ceph-mon[75484]: pgmap v2368: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:57:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:04.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:05 compute-1 nova_compute[238822]: 2025-09-30 18:57:05.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:57:05 compute-1 nova_compute[238822]: 2025-09-30 18:57:05.056 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:57:05 compute-1 sudo[317210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:57:05 compute-1 sudo[317210]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:57:05 compute-1 sudo[317210]: pam_unix(sudo:session): session closed for user root
Sep 30 18:57:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:05 compute-1 podman[249638]: time="2025-09-30T18:57:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:57:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:57:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:57:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:57:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8371 "" "Go-http-client/1.1"
Sep 30 18:57:06 compute-1 nova_compute[238822]: 2025-09-30 18:57:06.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:57:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:06.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:06.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:06 compute-1 nova_compute[238822]: 2025-09-30 18:57:06.577 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:57:06 compute-1 nova_compute[238822]: 2025-09-30 18:57:06.578 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:57:06 compute-1 nova_compute[238822]: 2025-09-30 18:57:06.578 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:57:06 compute-1 nova_compute[238822]: 2025-09-30 18:57:06.579 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:57:06 compute-1 nova_compute[238822]: 2025-09-30 18:57:06.579 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:57:06 compute-1 nova_compute[238822]: 2025-09-30 18:57:06.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:06 compute-1 nova_compute[238822]: 2025-09-30 18:57:06.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:06 compute-1 nova_compute[238822]: 2025-09-30 18:57:06.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:57:06 compute-1 nova_compute[238822]: 2025-09-30 18:57:06.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:06 compute-1 nova_compute[238822]: 2025-09-30 18:57:06.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:06 compute-1 nova_compute[238822]: 2025-09-30 18:57:06.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:06 compute-1 ceph-mon[75484]: pgmap v2369: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:57:07 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3153050889' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:57:07 compute-1 nova_compute[238822]: 2025-09-30 18:57:07.078 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:57:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:07 compute-1 nova_compute[238822]: 2025-09-30 18:57:07.328 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:57:07 compute-1 nova_compute[238822]: 2025-09-30 18:57:07.330 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:57:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:07 compute-1 nova_compute[238822]: 2025-09-30 18:57:07.365 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:57:07 compute-1 nova_compute[238822]: 2025-09-30 18:57:07.366 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4523MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:57:07 compute-1 nova_compute[238822]: 2025-09-30 18:57:07.367 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:57:07 compute-1 nova_compute[238822]: 2025-09-30 18:57:07.368 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:57:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3153050889' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:57:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:57:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:57:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:08.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:57:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:08 compute-1 nova_compute[238822]: 2025-09-30 18:57:08.443 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:57:08 compute-1 nova_compute[238822]: 2025-09-30 18:57:08.444 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:57:07 up  4:34,  0 user,  load average: 1.39, 0.87, 0.62\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:57:08 compute-1 nova_compute[238822]: 2025-09-30 18:57:08.475 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:57:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:08.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:08 compute-1 ceph-mon[75484]: pgmap v2370: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:57:08 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/769490593' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:57:08 compute-1 nova_compute[238822]: 2025-09-30 18:57:08.980 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:57:08 compute-1 nova_compute[238822]: 2025-09-30 18:57:08.986 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:57:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:57:09 compute-1 nova_compute[238822]: 2025-09-30 18:57:09.494 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:57:09 compute-1 podman[317284]: 2025-09-30 18:57:09.563662621 +0000 UTC m=+0.089421232 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:57:09 compute-1 podman[317285]: 2025-09-30 18:57:09.568087869 +0000 UTC m=+0.087539570 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_id=edpm, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 18:57:09 compute-1 podman[317286]: 2025-09-30 18:57:09.581548911 +0000 UTC m=+0.093633784 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 18:57:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/769490593' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:57:10 compute-1 nova_compute[238822]: 2025-09-30 18:57:10.006 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:57:10 compute-1 nova_compute[238822]: 2025-09-30 18:57:10.007 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.639s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:57:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:10.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:10.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:10 compute-1 ceph-mon[75484]: pgmap v2371: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:57:10 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1594932922' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:57:11 compute-1 nova_compute[238822]: 2025-09-30 18:57:11.007 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:57:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:11 compute-1 sshd-session[317340]: Invalid user admin from 103.153.190.105 port 39176
Sep 30 18:57:11 compute-1 sshd-session[317340]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:57:11 compute-1 sshd-session[317340]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:57:11 compute-1 nova_compute[238822]: 2025-09-30 18:57:11.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:11 compute-1 nova_compute[238822]: 2025-09-30 18:57:11.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:11 compute-1 nova_compute[238822]: 2025-09-30 18:57:11.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:57:11 compute-1 nova_compute[238822]: 2025-09-30 18:57:11.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:11 compute-1 nova_compute[238822]: 2025-09-30 18:57:11.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:11 compute-1 nova_compute[238822]: 2025-09-30 18:57:11.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:11 compute-1 ceph-mon[75484]: pgmap v2372: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:12 compute-1 nova_compute[238822]: 2025-09-30 18:57:12.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:57:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:12.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:12.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:12 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/285837713' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:57:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:13 compute-1 ceph-mon[75484]: pgmap v2373: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:57:13 compute-1 sshd-session[317340]: Failed password for invalid user admin from 103.153.190.105 port 39176 ssh2
Sep 30 18:57:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:14.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:57:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:14.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:14 compute-1 sshd-session[317340]: Received disconnect from 103.153.190.105 port 39176:11: Bye Bye [preauth]
Sep 30 18:57:14 compute-1 sshd-session[317340]: Disconnected from invalid user admin 103.153.190.105 port 39176 [preauth]
Sep 30 18:57:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:16.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #130. Immutable memtables: 0.
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.456931) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 81] Flushing memtable with next log file: 130
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258636457014, "job": 81, "event": "flush_started", "num_memtables": 1, "num_entries": 2154, "num_deletes": 256, "total_data_size": 4393903, "memory_usage": 4483232, "flush_reason": "Manual Compaction"}
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 81] Level-0 flush table #131: started
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258636471423, "cf_name": "default", "job": 81, "event": "table_file_creation", "file_number": 131, "file_size": 2856896, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 62546, "largest_seqno": 64695, "table_properties": {"data_size": 2847979, "index_size": 5218, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 22323, "raw_average_key_size": 21, "raw_value_size": 2828651, "raw_average_value_size": 2699, "num_data_blocks": 227, "num_entries": 1048, "num_filter_entries": 1048, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759258489, "oldest_key_time": 1759258489, "file_creation_time": 1759258636, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 131, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 81] Flush lasted 14797 microseconds, and 7995 cpu microseconds.
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.471725) [db/flush_job.cc:967] [default] [JOB 81] Level-0 flush table #131: 2856896 bytes OK
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.471827) [db/memtable_list.cc:519] [default] Level-0 commit table #131 started
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.473764) [db/memtable_list.cc:722] [default] Level-0 commit table #131: memtable #1 done
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.473789) EVENT_LOG_v1 {"time_micros": 1759258636473780, "job": 81, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.473823) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 81] Try to delete WAL files size 4383645, prev total WAL file size 4383645, number of live WAL files 2.
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000127.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.476529) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032323630' seq:72057594037927935, type:22 .. '6C6F676D0032353132' seq:0, type:0; will stop at (end)
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 82] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 81 Base level 0, inputs: [131(2789KB)], [129(10MB)]
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258636476592, "job": 82, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [131], "files_L6": [129], "score": -1, "input_data_size": 14118136, "oldest_snapshot_seqno": -1}
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 82] Generated table #132: 8378 keys, 13982122 bytes, temperature: kUnknown
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258636555515, "cf_name": "default", "job": 82, "event": "table_file_creation", "file_number": 132, "file_size": 13982122, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13930218, "index_size": 29839, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20997, "raw_key_size": 221738, "raw_average_key_size": 26, "raw_value_size": 13784766, "raw_average_value_size": 1645, "num_data_blocks": 1162, "num_entries": 8378, "num_filter_entries": 8378, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759258636, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 132, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.555967) [db/compaction/compaction_job.cc:1663] [default] [JOB 82] Compacted 1@0 + 1@6 files to L6 => 13982122 bytes
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.557381) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 178.4 rd, 176.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 10.7 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(9.8) write-amplify(4.9) OK, records in: 8904, records dropped: 526 output_compression: NoCompression
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.557411) EVENT_LOG_v1 {"time_micros": 1759258636557397, "job": 82, "event": "compaction_finished", "compaction_time_micros": 79120, "compaction_time_cpu_micros": 52495, "output_level": 6, "num_output_files": 1, "total_output_size": 13982122, "num_input_records": 8904, "num_output_records": 8378, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000131.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258636558489, "job": 82, "event": "table_file_deletion", "file_number": 131}
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000129.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258636562199, "job": 82, "event": "table_file_deletion", "file_number": 129}
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.476394) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.562327) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.562339) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.562343) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.562348) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:57:16 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:16.562353) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:57:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:16.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:16 compute-1 nova_compute[238822]: 2025-09-30 18:57:16.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:16 compute-1 nova_compute[238822]: 2025-09-30 18:57:16.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:16 compute-1 nova_compute[238822]: 2025-09-30 18:57:16.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:57:16 compute-1 nova_compute[238822]: 2025-09-30 18:57:16.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:16 compute-1 nova_compute[238822]: 2025-09-30 18:57:16.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:16 compute-1 nova_compute[238822]: 2025-09-30 18:57:16.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:16 compute-1 ceph-mon[75484]: pgmap v2374: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:17 compute-1 nova_compute[238822]: 2025-09-30 18:57:17.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:57:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:17 compute-1 ceph-mon[75484]: pgmap v2375: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:18.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:18.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:57:19 compute-1 openstack_network_exporter[251957]: ERROR   18:57:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:57:19 compute-1 openstack_network_exporter[251957]: ERROR   18:57:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:57:19 compute-1 openstack_network_exporter[251957]: ERROR   18:57:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:57:19 compute-1 openstack_network_exporter[251957]: ERROR   18:57:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:57:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:57:19 compute-1 openstack_network_exporter[251957]: ERROR   18:57:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:57:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:57:20 compute-1 nova_compute[238822]: 2025-09-30 18:57:20.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:57:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:20.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:20.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:20 compute-1 ceph-mon[75484]: pgmap v2376: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:57:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:21 compute-1 nova_compute[238822]: 2025-09-30 18:57:21.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:21 compute-1 nova_compute[238822]: 2025-09-30 18:57:21.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:21 compute-1 nova_compute[238822]: 2025-09-30 18:57:21.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:57:21 compute-1 nova_compute[238822]: 2025-09-30 18:57:21.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:21 compute-1 nova_compute[238822]: 2025-09-30 18:57:21.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:21 compute-1 nova_compute[238822]: 2025-09-30 18:57:21.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:57:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:22.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:57:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:22.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:22 compute-1 ceph-mon[75484]: pgmap v2377: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:57:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:24.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:24 compute-1 ceph-mon[75484]: pgmap v2378: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:57:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:57:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:24.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:24 compute-1 unix_chkpwd[317358]: password check failed for user (root)
Sep 30 18:57:24 compute-1 sshd-session[317354]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:57:25 compute-1 sudo[317360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:57:25 compute-1 sudo[317360]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:57:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:25 compute-1 sudo[317360]: pam_unix(sudo:session): session closed for user root
Sep 30 18:57:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:26.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:26.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:26 compute-1 sshd-session[317354]: Failed password for root from 192.210.160.141 port 58154 ssh2
Sep 30 18:57:26 compute-1 ceph-mon[75484]: pgmap v2379: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:26 compute-1 nova_compute[238822]: 2025-09-30 18:57:26.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:26 compute-1 nova_compute[238822]: 2025-09-30 18:57:26.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:26 compute-1 nova_compute[238822]: 2025-09-30 18:57:26.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:57:26 compute-1 nova_compute[238822]: 2025-09-30 18:57:26.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:26 compute-1 nova_compute[238822]: 2025-09-30 18:57:26.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:26 compute-1 nova_compute[238822]: 2025-09-30 18:57:26.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:27 compute-1 sshd-session[317354]: Connection closed by authenticating user root 192.210.160.141 port 58154 [preauth]
Sep 30 18:57:28 compute-1 nova_compute[238822]: 2025-09-30 18:57:28.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:57:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:28.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:28.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:28 compute-1 ceph-mon[75484]: pgmap v2380: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:57:30 compute-1 ceph-mon[75484]: pgmap v2381: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:57:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:30.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:30.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:31 compute-1 podman[317392]: 2025-09-30 18:57:31.56135203 +0000 UTC m=+0.091801005 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:57:31 compute-1 podman[317391]: 2025-09-30 18:57:31.616474529 +0000 UTC m=+0.149738430 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller)
Sep 30 18:57:31 compute-1 nova_compute[238822]: 2025-09-30 18:57:31.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:31 compute-1 nova_compute[238822]: 2025-09-30 18:57:31.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:32.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:32.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:32 compute-1 sudo[317442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:57:32 compute-1 sudo[317442]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:57:32 compute-1 sudo[317442]: pam_unix(sudo:session): session closed for user root
Sep 30 18:57:32 compute-1 ceph-mon[75484]: pgmap v2382: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:32 compute-1 sudo[317467]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:57:32 compute-1 sudo[317467]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:57:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:33 compute-1 sudo[317467]: pam_unix(sudo:session): session closed for user root
Sep 30 18:57:33 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:57:33 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:57:33 compute-1 ceph-mon[75484]: pgmap v2383: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 779 B/s rd, 0 op/s
Sep 30 18:57:33 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:57:33 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:57:33 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:57:33 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:57:33 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:57:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:34.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:57:34 compute-1 podman[317524]: 2025-09-30 18:57:34.552587684 +0000 UTC m=+0.087163160 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 18:57:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:34.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:35 compute-1 podman[249638]: time="2025-09-30T18:57:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:57:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:57:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:57:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:57:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8371 "" "Go-http-client/1.1"
Sep 30 18:57:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:36.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:36.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:36 compute-1 ceph-mon[75484]: pgmap v2384: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 519 B/s rd, 0 op/s
Sep 30 18:57:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/143161206' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:57:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/143161206' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:57:36 compute-1 nova_compute[238822]: 2025-09-30 18:57:36.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:36 compute-1 nova_compute[238822]: 2025-09-30 18:57:36.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:36 compute-1 nova_compute[238822]: 2025-09-30 18:57:36.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:57:36 compute-1 nova_compute[238822]: 2025-09-30 18:57:36.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:36 compute-1 nova_compute[238822]: 2025-09-30 18:57:36.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:36 compute-1 nova_compute[238822]: 2025-09-30 18:57:36.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:57:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:38.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:38 compute-1 sudo[317548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:57:38 compute-1 sudo[317548]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:57:38 compute-1 sudo[317548]: pam_unix(sudo:session): session closed for user root
Sep 30 18:57:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:38.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:38 compute-1 ceph-mon[75484]: pgmap v2385: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 519 B/s rd, 0 op/s
Sep 30 18:57:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:57:38 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:57:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:57:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:40.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:40 compute-1 podman[317575]: 2025-09-30 18:57:40.567086512 +0000 UTC m=+0.099625515 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Sep 30 18:57:40 compute-1 podman[317576]: 2025-09-30 18:57:40.580731569 +0000 UTC m=+0.104412834 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350)
Sep 30 18:57:40 compute-1 podman[317577]: 2025-09-30 18:57:40.585081535 +0000 UTC m=+0.108807471 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 18:57:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:40.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:40 compute-1 ceph-mon[75484]: pgmap v2386: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 779 B/s rd, 0 op/s
Sep 30 18:57:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:41 compute-1 nova_compute[238822]: 2025-09-30 18:57:41.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:41 compute-1 nova_compute[238822]: 2025-09-30 18:57:41.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:41 compute-1 nova_compute[238822]: 2025-09-30 18:57:41.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:57:41 compute-1 nova_compute[238822]: 2025-09-30 18:57:41.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:41 compute-1 nova_compute[238822]: 2025-09-30 18:57:41.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:41 compute-1 nova_compute[238822]: 2025-09-30 18:57:41.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:42 compute-1 ceph-mon[75484]: pgmap v2387: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 519 B/s rd, 0 op/s
Sep 30 18:57:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:57:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:42.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:57:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:42.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:44.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:57:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:44.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:44 compute-1 ceph-mon[75484]: pgmap v2388: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 779 B/s rd, 0 op/s
Sep 30 18:57:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:45 compute-1 sudo[317640]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:57:45 compute-1 sudo[317640]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:57:45 compute-1 sudo[317640]: pam_unix(sudo:session): session closed for user root
Sep 30 18:57:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:46.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:46.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:46 compute-1 ceph-mon[75484]: pgmap v2389: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:46 compute-1 nova_compute[238822]: 2025-09-30 18:57:46.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:48.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:48.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:48 compute-1 ceph-mon[75484]: pgmap v2390: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:57:49 compute-1 openstack_network_exporter[251957]: ERROR   18:57:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:57:49 compute-1 openstack_network_exporter[251957]: ERROR   18:57:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:57:49 compute-1 openstack_network_exporter[251957]: ERROR   18:57:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:57:49 compute-1 openstack_network_exporter[251957]: ERROR   18:57:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:57:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:57:49 compute-1 openstack_network_exporter[251957]: ERROR   18:57:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:57:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:57:50 compute-1 ceph-mon[75484]: pgmap v2391: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:57:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:50.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:50.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:51 compute-1 sshd-session[317669]: Invalid user boss from 192.210.160.141 port 44588
Sep 30 18:57:51 compute-1 sshd-session[317669]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:57:51 compute-1 sshd-session[317669]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:57:51 compute-1 nova_compute[238822]: 2025-09-30 18:57:51.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:51 compute-1 nova_compute[238822]: 2025-09-30 18:57:51.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:51 compute-1 nova_compute[238822]: 2025-09-30 18:57:51.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:57:51 compute-1 nova_compute[238822]: 2025-09-30 18:57:51.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:51 compute-1 nova_compute[238822]: 2025-09-30 18:57:51.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:57:51 compute-1 nova_compute[238822]: 2025-09-30 18:57:51.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:57:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:52.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:57:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:52.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:57:52 compute-1 ceph-mon[75484]: pgmap v2392: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:57:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:54 compute-1 sshd-session[317669]: Failed password for invalid user boss from 192.210.160.141 port 44588 ssh2
Sep 30 18:57:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:54.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:57:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:57:54.449 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:57:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:57:54.450 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:57:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:57:54.450 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:57:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:54.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:54 compute-1 ceph-mon[75484]: pgmap v2393: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:57:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:55 compute-1 sshd-session[317669]: Connection closed by invalid user boss 192.210.160.141 port 44588 [preauth]
Sep 30 18:57:56 compute-1 ceph-mon[75484]: pgmap v2394: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:56.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #133. Immutable memtables: 0.
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.418303) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 83] Flushing memtable with next log file: 133
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258676418387, "job": 83, "event": "flush_started", "num_memtables": 1, "num_entries": 642, "num_deletes": 251, "total_data_size": 1146520, "memory_usage": 1159144, "flush_reason": "Manual Compaction"}
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 83] Level-0 flush table #134: started
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258676427199, "cf_name": "default", "job": 83, "event": "table_file_creation", "file_number": 134, "file_size": 751663, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 64700, "largest_seqno": 65337, "table_properties": {"data_size": 748406, "index_size": 1164, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7633, "raw_average_key_size": 19, "raw_value_size": 741873, "raw_average_value_size": 1892, "num_data_blocks": 51, "num_entries": 392, "num_filter_entries": 392, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759258636, "oldest_key_time": 1759258636, "file_creation_time": 1759258676, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 134, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 83] Flush lasted 8963 microseconds, and 5262 cpu microseconds.
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.427273) [db/flush_job.cc:967] [default] [JOB 83] Level-0 flush table #134: 751663 bytes OK
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.427305) [db/memtable_list.cc:519] [default] Level-0 commit table #134 started
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.429206) [db/memtable_list.cc:722] [default] Level-0 commit table #134: memtable #1 done
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.429230) EVENT_LOG_v1 {"time_micros": 1759258676429223, "job": 83, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.429254) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 83] Try to delete WAL files size 1142951, prev total WAL file size 1142951, number of live WAL files 2.
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000130.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.430142) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035353232' seq:72057594037927935, type:22 .. '7061786F730035373734' seq:0, type:0; will stop at (end)
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 84] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 83 Base level 0, inputs: [134(734KB)], [132(13MB)]
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258676430187, "job": 84, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [134], "files_L6": [132], "score": -1, "input_data_size": 14733785, "oldest_snapshot_seqno": -1}
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 84] Generated table #135: 8255 keys, 12744476 bytes, temperature: kUnknown
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258676479241, "cf_name": "default", "job": 84, "event": "table_file_creation", "file_number": 135, "file_size": 12744476, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12694322, "index_size": 28409, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20677, "raw_key_size": 219883, "raw_average_key_size": 26, "raw_value_size": 12551844, "raw_average_value_size": 1520, "num_data_blocks": 1095, "num_entries": 8255, "num_filter_entries": 8255, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759258676, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 135, "seqno_to_time_mapping": "N/A"}}
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.479569) [db/compaction/compaction_job.cc:1663] [default] [JOB 84] Compacted 1@0 + 1@6 files to L6 => 12744476 bytes
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.481129) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 299.7 rd, 259.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 13.3 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(36.6) write-amplify(17.0) OK, records in: 8770, records dropped: 515 output_compression: NoCompression
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.481161) EVENT_LOG_v1 {"time_micros": 1759258676481147, "job": 84, "event": "compaction_finished", "compaction_time_micros": 49158, "compaction_time_cpu_micros": 26763, "output_level": 6, "num_output_files": 1, "total_output_size": 12744476, "num_input_records": 8770, "num_output_records": 8255, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000134.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258676481580, "job": 84, "event": "table_file_deletion", "file_number": 134}
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000132.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258676486608, "job": 84, "event": "table_file_deletion", "file_number": 132}
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.430020) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.486822) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.486828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.486831) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.486834) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:57:56 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-18:57:56.486837) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 18:57:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:56.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:56 compute-1 nova_compute[238822]: 2025-09-30 18:57:56.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:57:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/906702194' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:57:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/906702194' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:57:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:57:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:57:58.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:57:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:57:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:57:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:57:58.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:57:58 compute-1 ceph-mon[75484]: pgmap v2395: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:57:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:57:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:57:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:57:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:57:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:57:59 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 4800.0 total, 600.0 interval
                                           Cumulative writes: 12K writes, 65K keys, 12K commit groups, 1.0 writes per commit group, ingest: 0.14 GB, 0.03 MB/s
                                           Cumulative WAL: 12K writes, 12K syncs, 1.00 writes per sync, written: 0.14 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1563 writes, 7993 keys, 1563 commit groups, 1.0 writes per commit group, ingest: 16.50 MB, 0.03 MB/s
                                           Interval WAL: 1563 writes, 1563 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0    128.1      0.67              0.36        42    0.016       0      0       0.0       0.0
                                             L6      1/0   12.15 MB   0.0      0.5     0.1      0.4       0.5      0.0       0.0   5.5    174.9    151.2      3.12              1.72        41    0.076    271K    22K       0.0       0.0
                                            Sum      1/0   12.15 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   6.5    143.9    147.1      3.79              2.07        83    0.046    271K    22K       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.4    153.5    155.6      0.54              0.34        12    0.045     50K   3072       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.5      0.0       0.0   0.0    174.9    151.2      3.12              1.72        41    0.076    271K    22K       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0    128.5      0.67              0.36        41    0.016       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4800.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.084, interval 0.010
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.54 GB write, 0.12 MB/s write, 0.53 GB read, 0.11 MB/s read, 3.8 seconds
                                           Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.14 MB/s read, 0.5 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55f2aa20b350#2 capacity: 304.00 MB usage: 55.12 MB table_size: 0 occupancy: 18446744073709551615 collections: 9 last_copies: 0 last_secs: 0.00053 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3329,53.25 MB,17.5161%) FilterBlock(83,756.92 KB,0.243152%) IndexBlock(83,1.13 MB,0.371471%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Sep 30 18:57:59 compute-1 sshd-session[317682]: Invalid user info from 161.132.50.17 port 55792
Sep 30 18:57:59 compute-1 sshd-session[317682]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:57:59 compute-1 sshd-session[317682]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 18:58:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:00.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:00.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:00 compute-1 ceph-mon[75484]: pgmap v2396: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:58:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:01 compute-1 unix_chkpwd[317688]: password check failed for user (root)
Sep 30 18:58:01 compute-1 sshd-session[317684]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245  user=root
Sep 30 18:58:01 compute-1 nova_compute[238822]: 2025-09-30 18:58:01.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:02 compute-1 sshd-session[317682]: Failed password for invalid user info from 161.132.50.17 port 55792 ssh2
Sep 30 18:58:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:02.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:02 compute-1 podman[317691]: 2025-09-30 18:58:02.547245499 +0000 UTC m=+0.085669557 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 18:58:02 compute-1 podman[317690]: 2025-09-30 18:58:02.592047609 +0000 UTC m=+0.134056914 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 18:58:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 18:58:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:02.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 18:58:02 compute-1 ceph-mon[75484]: pgmap v2397: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:03 compute-1 sshd-session[317684]: Failed password for root from 49.49.32.245 port 41330 ssh2
Sep 30 18:58:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:04 compute-1 nova_compute[238822]: 2025-09-30 18:58:04.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:58:04 compute-1 sshd-session[317682]: Received disconnect from 161.132.50.17 port 55792:11: Bye Bye [preauth]
Sep 30 18:58:04 compute-1 sshd-session[317682]: Disconnected from invalid user info 161.132.50.17 port 55792 [preauth]
Sep 30 18:58:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:04.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:58:04 compute-1 sshd-session[317684]: Received disconnect from 49.49.32.245 port 41330:11: Bye Bye [preauth]
Sep 30 18:58:04 compute-1 sshd-session[317684]: Disconnected from authenticating user root 49.49.32.245 port 41330 [preauth]
Sep 30 18:58:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:04.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:04 compute-1 ceph-mon[75484]: pgmap v2398: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:58:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:05 compute-1 sudo[317744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:58:05 compute-1 sudo[317744]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:58:05 compute-1 sudo[317744]: pam_unix(sudo:session): session closed for user root
Sep 30 18:58:05 compute-1 podman[317767]: 2025-09-30 18:58:05.554179459 +0000 UTC m=+0.082979204 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Sep 30 18:58:05 compute-1 podman[249638]: time="2025-09-30T18:58:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:58:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:58:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:58:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:58:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8368 "" "Go-http-client/1.1"
Sep 30 18:58:06 compute-1 nova_compute[238822]: 2025-09-30 18:58:06.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:58:06 compute-1 nova_compute[238822]: 2025-09-30 18:58:06.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:58:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:06.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:06 compute-1 sshd-session[317788]: Invalid user rishabh from 8.243.64.201 port 52444
Sep 30 18:58:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:06.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:06 compute-1 sshd-session[317788]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:58:06 compute-1 sshd-session[317788]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:58:06 compute-1 ceph-mon[75484]: pgmap v2399: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:06 compute-1 nova_compute[238822]: 2025-09-30 18:58:06.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:07 compute-1 nova_compute[238822]: 2025-09-30 18:58:07.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:58:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-crash-compute-1[76085]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Sep 30 18:58:07 compute-1 nova_compute[238822]: 2025-09-30 18:58:07.574 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:58:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:58:08 compute-1 nova_compute[238822]: 2025-09-30 18:58:08.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:58:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:08.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:08 compute-1 sshd-session[317788]: Failed password for invalid user rishabh from 8.243.64.201 port 52444 ssh2
Sep 30 18:58:08 compute-1 nova_compute[238822]: 2025-09-30 18:58:08.573 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:58:08 compute-1 nova_compute[238822]: 2025-09-30 18:58:08.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:58:08 compute-1 nova_compute[238822]: 2025-09-30 18:58:08.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:58:08 compute-1 nova_compute[238822]: 2025-09-30 18:58:08.574 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:58:08 compute-1 nova_compute[238822]: 2025-09-30 18:58:08.574 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:58:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:08.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:08 compute-1 ceph-mon[75484]: pgmap v2400: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:58:09 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4036824627' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:58:09 compute-1 nova_compute[238822]: 2025-09-30 18:58:09.054 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:58:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:09 compute-1 nova_compute[238822]: 2025-09-30 18:58:09.291 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:58:09 compute-1 nova_compute[238822]: 2025-09-30 18:58:09.293 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:58:09 compute-1 nova_compute[238822]: 2025-09-30 18:58:09.337 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:58:09 compute-1 nova_compute[238822]: 2025-09-30 18:58:09.338 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4580MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:58:09 compute-1 nova_compute[238822]: 2025-09-30 18:58:09.338 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:58:09 compute-1 nova_compute[238822]: 2025-09-30 18:58:09.338 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:58:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:58:09 compute-1 sshd-session[317788]: Received disconnect from 8.243.64.201 port 52444:11: Bye Bye [preauth]
Sep 30 18:58:09 compute-1 sshd-session[317788]: Disconnected from invalid user rishabh 8.243.64.201 port 52444 [preauth]
Sep 30 18:58:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4036824627' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:58:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:10.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:10.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:10 compute-1 ceph-mon[75484]: pgmap v2401: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:58:11 compute-1 nova_compute[238822]: 2025-09-30 18:58:11.168 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:58:11 compute-1 nova_compute[238822]: 2025-09-30 18:58:11.169 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:58:09 up  4:35,  0 user,  load average: 0.54, 0.72, 0.59\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:58:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:11 compute-1 nova_compute[238822]: 2025-09-30 18:58:11.516 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:58:11 compute-1 podman[317819]: 2025-09-30 18:58:11.588085873 +0000 UTC m=+0.111827623 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid)
Sep 30 18:58:11 compute-1 podman[317821]: 2025-09-30 18:58:11.607263381 +0000 UTC m=+0.117744323 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Sep 30 18:58:11 compute-1 podman[317820]: 2025-09-30 18:58:11.623464009 +0000 UTC m=+0.140502568 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 18:58:11 compute-1 ceph-mon[75484]: pgmap v2402: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:11 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:58:11 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2058300386' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:58:11 compute-1 nova_compute[238822]: 2025-09-30 18:58:11.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:58:12 compute-1 nova_compute[238822]: 2025-09-30 18:58:12.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:58:12 compute-1 nova_compute[238822]: 2025-09-30 18:58:12.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5042 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:58:12 compute-1 nova_compute[238822]: 2025-09-30 18:58:12.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:12 compute-1 nova_compute[238822]: 2025-09-30 18:58:12.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:12 compute-1 nova_compute[238822]: 2025-09-30 18:58:12.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:12 compute-1 nova_compute[238822]: 2025-09-30 18:58:12.026 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:58:12 compute-1 nova_compute[238822]: 2025-09-30 18:58:12.033 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:58:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:12.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:12 compute-1 nova_compute[238822]: 2025-09-30 18:58:12.549 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:58:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:12.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:12 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2058300386' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:58:12 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/374295777' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:58:13 compute-1 nova_compute[238822]: 2025-09-30 18:58:13.073 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:58:13 compute-1 nova_compute[238822]: 2025-09-30 18:58:13.073 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.735s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:58:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:13 compute-1 ceph-mon[75484]: pgmap v2403: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:58:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:14.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:58:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:14.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:14 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3859101451' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:58:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:15 compute-1 ceph-mon[75484]: pgmap v2404: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:16.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:16.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:17 compute-1 nova_compute[238822]: 2025-09-30 18:58:17.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:18.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:18.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:18 compute-1 ceph-mon[75484]: pgmap v2405: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:19 compute-1 nova_compute[238822]: 2025-09-30 18:58:19.074 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:58:19 compute-1 nova_compute[238822]: 2025-09-30 18:58:19.075 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:58:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:58:19 compute-1 openstack_network_exporter[251957]: ERROR   18:58:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:58:19 compute-1 openstack_network_exporter[251957]: ERROR   18:58:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:58:19 compute-1 openstack_network_exporter[251957]: ERROR   18:58:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:58:19 compute-1 openstack_network_exporter[251957]: ERROR   18:58:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:58:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:58:19 compute-1 openstack_network_exporter[251957]: ERROR   18:58:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:58:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:58:19 compute-1 unix_chkpwd[317911]: password check failed for user (root)
Sep 30 18:58:19 compute-1 sshd-session[317907]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:58:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:20.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:20.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:20 compute-1 ceph-mon[75484]: pgmap v2406: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:58:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:21 compute-1 sshd-session[317907]: Failed password for root from 192.210.160.141 port 53382 ssh2
Sep 30 18:58:22 compute-1 nova_compute[238822]: 2025-09-30 18:58:22.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:58:22 compute-1 nova_compute[238822]: 2025-09-30 18:58:22.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:58:22 compute-1 nova_compute[238822]: 2025-09-30 18:58:22.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:58:22 compute-1 nova_compute[238822]: 2025-09-30 18:58:22.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:22 compute-1 nova_compute[238822]: 2025-09-30 18:58:22.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:58:22 compute-1 nova_compute[238822]: 2025-09-30 18:58:22.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:22 compute-1 nova_compute[238822]: 2025-09-30 18:58:22.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:22.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:22.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:22 compute-1 sshd-session[317907]: Connection closed by authenticating user root 192.210.160.141 port 53382 [preauth]
Sep 30 18:58:22 compute-1 ceph-mon[75484]: pgmap v2407: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:58:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 18:58:22 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 4800.1 total, 600.0 interval
                                           Cumulative writes: 24K writes, 95K keys, 24K commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.02 MB/s
                                           Cumulative WAL: 24K writes, 8145 syncs, 3.04 writes per sync, written: 0.09 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2237 writes, 8429 keys, 2237 commit groups, 1.0 writes per commit group, ingest: 9.64 MB, 0.02 MB/s
                                           Interval WAL: 2237 writes, 879 syncs, 2.54 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Sep 30 18:58:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:24.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:58:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:24.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:24 compute-1 ceph-mon[75484]: pgmap v2408: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:58:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:25 compute-1 sudo[317918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:58:25 compute-1 sudo[317918]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:58:25 compute-1 sudo[317918]: pam_unix(sudo:session): session closed for user root
Sep 30 18:58:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:26.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:26.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:26 compute-1 ceph-mon[75484]: pgmap v2409: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:27 compute-1 nova_compute[238822]: 2025-09-30 18:58:27.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:58:27 compute-1 nova_compute[238822]: 2025-09-30 18:58:27.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:58:27 compute-1 nova_compute[238822]: 2025-09-30 18:58:27.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5037 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:58:27 compute-1 nova_compute[238822]: 2025-09-30 18:58:27.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:27 compute-1 nova_compute[238822]: 2025-09-30 18:58:27.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:27 compute-1 nova_compute[238822]: 2025-09-30 18:58:27.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:28.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:28.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:28 compute-1 ceph-mon[75484]: pgmap v2410: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:29 compute-1 nova_compute[238822]: 2025-09-30 18:58:29.059 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:58:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:58:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:30.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:30.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:30 compute-1 ceph-mon[75484]: pgmap v2411: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:58:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:32 compute-1 nova_compute[238822]: 2025-09-30 18:58:32.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:58:32 compute-1 nova_compute[238822]: 2025-09-30 18:58:32.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:58:32 compute-1 nova_compute[238822]: 2025-09-30 18:58:32.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:58:32 compute-1 nova_compute[238822]: 2025-09-30 18:58:32.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:32 compute-1 nova_compute[238822]: 2025-09-30 18:58:32.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:32 compute-1 nova_compute[238822]: 2025-09-30 18:58:32.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:32.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:32.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:32 compute-1 ceph-mon[75484]: pgmap v2412: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:33 compute-1 podman[317952]: 2025-09-30 18:58:33.568121974 +0000 UTC m=+0.100283172 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 18:58:33 compute-1 podman[317951]: 2025-09-30 18:58:33.624305182 +0000 UTC m=+0.158146135 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 18:58:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:34.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:58:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:34.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:34 compute-1 ceph-mon[75484]: pgmap v2413: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:58:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:35 compute-1 podman[249638]: time="2025-09-30T18:58:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:58:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:58:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:58:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:58:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8368 "" "Go-http-client/1.1"
Sep 30 18:58:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:36.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:36 compute-1 podman[318005]: 2025-09-30 18:58:36.559056951 +0000 UTC m=+0.096582651 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 18:58:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:36.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:36 compute-1 ceph-mon[75484]: pgmap v2414: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4263349243' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:58:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/4263349243' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:58:37 compute-1 nova_compute[238822]: 2025-09-30 18:58:37.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:58:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:38.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:38 compute-1 sudo[318027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:58:38 compute-1 sudo[318027]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:58:38 compute-1 sudo[318027]: pam_unix(sudo:session): session closed for user root
Sep 30 18:58:38 compute-1 sudo[318052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:58:38 compute-1 sudo[318052]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:58:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:38.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:38 compute-1 ceph-mon[75484]: pgmap v2415: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:58:39 compute-1 sudo[318052]: pam_unix(sudo:session): session closed for user root
Sep 30 18:58:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:58:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:58:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:58:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:58:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:58:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:58:39 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:58:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:40.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:40.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:40 compute-1 ceph-mon[75484]: pgmap v2416: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 777 B/s rd, 0 op/s
Sep 30 18:58:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:41 compute-1 ceph-mon[75484]: pgmap v2417: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 518 B/s rd, 0 op/s
Sep 30 18:58:42 compute-1 nova_compute[238822]: 2025-09-30 18:58:42.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:58:42 compute-1 nova_compute[238822]: 2025-09-30 18:58:42.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:58:42 compute-1 nova_compute[238822]: 2025-09-30 18:58:42.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:58:42 compute-1 nova_compute[238822]: 2025-09-30 18:58:42.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:42 compute-1 nova_compute[238822]: 2025-09-30 18:58:42.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:42 compute-1 nova_compute[238822]: 2025-09-30 18:58:42.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:42.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:42 compute-1 podman[318112]: 2025-09-30 18:58:42.543080233 +0000 UTC m=+0.090791735 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid)
Sep 30 18:58:42 compute-1 podman[318113]: 2025-09-30 18:58:42.570544025 +0000 UTC m=+0.103212820 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-type=git, distribution-scope=public, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 18:58:42 compute-1 podman[318114]: 2025-09-30 18:58:42.586384263 +0000 UTC m=+0.112196373 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 18:58:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:42.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:44.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:58:44 compute-1 ceph-mon[75484]: pgmap v2418: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 518 B/s rd, 0 op/s
Sep 30 18:58:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:44.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:44 compute-1 sudo[318173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:58:44 compute-1 sudo[318173]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:58:44 compute-1 sudo[318173]: pam_unix(sudo:session): session closed for user root
Sep 30 18:58:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:58:45 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:58:45 compute-1 sudo[318201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:58:45 compute-1 sudo[318201]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:58:45 compute-1 sudo[318201]: pam_unix(sudo:session): session closed for user root
Sep 30 18:58:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:46.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:46 compute-1 ceph-mon[75484]: pgmap v2419: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 518 B/s rd, 0 op/s
Sep 30 18:58:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:46.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:46 compute-1 unix_chkpwd[318227]: password check failed for user (root)
Sep 30 18:58:46 compute-1 sshd-session[318198]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:58:47 compute-1 nova_compute[238822]: 2025-09-30 18:58:47.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:58:47 compute-1 nova_compute[238822]: 2025-09-30 18:58:47.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:47 compute-1 nova_compute[238822]: 2025-09-30 18:58:47.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:58:47 compute-1 nova_compute[238822]: 2025-09-30 18:58:47.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:47 compute-1 nova_compute[238822]: 2025-09-30 18:58:47.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:47 compute-1 nova_compute[238822]: 2025-09-30 18:58:47.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:48.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:48 compute-1 ceph-mon[75484]: pgmap v2420: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 518 B/s rd, 0 op/s
Sep 30 18:58:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:48.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:49 compute-1 sshd-session[318198]: Failed password for root from 192.210.160.141 port 37338 ssh2
Sep 30 18:58:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:58:49 compute-1 openstack_network_exporter[251957]: ERROR   18:58:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:58:49 compute-1 openstack_network_exporter[251957]: ERROR   18:58:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:58:49 compute-1 openstack_network_exporter[251957]: ERROR   18:58:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:58:49 compute-1 openstack_network_exporter[251957]: ERROR   18:58:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:58:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:58:49 compute-1 openstack_network_exporter[251957]: ERROR   18:58:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:58:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:58:49 compute-1 sshd-session[318198]: Connection closed by authenticating user root 192.210.160.141 port 37338 [preauth]
Sep 30 18:58:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:50.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:50 compute-1 ceph-mon[75484]: pgmap v2421: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 777 B/s rd, 0 op/s
Sep 30 18:58:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:50.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:52 compute-1 nova_compute[238822]: 2025-09-30 18:58:52.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:52 compute-1 nova_compute[238822]: 2025-09-30 18:58:52.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:52.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:52 compute-1 ceph-mon[75484]: pgmap v2422: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:58:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:52.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:54.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:58:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:58:54.451 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:58:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:58:54.452 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:58:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:58:54.452 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:58:54 compute-1 ceph-mon[75484]: pgmap v2423: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:54.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:58:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:56.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:58:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:56 compute-1 ceph-mon[75484]: pgmap v2424: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:58:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:56.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:57 compute-1 nova_compute[238822]: 2025-09-30 18:58:57.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:58:57 compute-1 nova_compute[238822]: 2025-09-30 18:58:57.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:58:57 compute-1 nova_compute[238822]: 2025-09-30 18:58:57.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:58:57 compute-1 nova_compute[238822]: 2025-09-30 18:58:57.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:57 compute-1 nova_compute[238822]: 2025-09-30 18:58:57.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:58:57 compute-1 nova_compute[238822]: 2025-09-30 18:58:57.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:58:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3983763451' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:58:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3983763451' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:58:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:58:58.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:58:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:58:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:58:58.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:58:58 compute-1 ceph-mon[75484]: pgmap v2425: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:58:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:58:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:58:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:58:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:59:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:00.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:00 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:59:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:00.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:59:00 compute-1 ceph-mon[75484]: pgmap v2426: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:59:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:01 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:02 compute-1 nova_compute[238822]: 2025-09-30 18:59:02.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:02.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:02 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:02.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:02 compute-1 ceph-mon[75484]: pgmap v2427: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:03 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:04.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:04 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:59:04 compute-1 podman[318248]: 2025-09-30 18:59:04.570909036 +0000 UTC m=+0.094821803 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:59:04 compute-1 podman[318247]: 2025-09-30 18:59:04.677502867 +0000 UTC m=+0.203332566 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 18:59:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:04.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:04 compute-1 ceph-mon[75484]: pgmap v2428: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:05 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:05 compute-1 podman[249638]: time="2025-09-30T18:59:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:59:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:59:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:59:05 compute-1 podman[249638]: @ - - [30/Sep/2025:18:59:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8367 "" "Go-http-client/1.1"
Sep 30 18:59:05 compute-1 sudo[318300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:59:05 compute-1 sudo[318300]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:59:05 compute-1 sudo[318300]: pam_unix(sudo:session): session closed for user root
Sep 30 18:59:06 compute-1 nova_compute[238822]: 2025-09-30 18:59:06.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:59:06 compute-1 nova_compute[238822]: 2025-09-30 18:59:06.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:59:06 compute-1 nova_compute[238822]: 2025-09-30 18:59:06.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 18:59:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:06.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:06 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:06.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:06 compute-1 ceph-mon[75484]: pgmap v2429: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:59:07 compute-1 nova_compute[238822]: 2025-09-30 18:59:07.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:59:07 compute-1 nova_compute[238822]: 2025-09-30 18:59:07.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:59:07 compute-1 nova_compute[238822]: 2025-09-30 18:59:07.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:59:07 compute-1 nova_compute[238822]: 2025-09-30 18:59:07.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:59:07 compute-1 nova_compute[238822]: 2025-09-30 18:59:07.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:59:07 compute-1 nova_compute[238822]: 2025-09-30 18:59:07.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:07 compute-1 nova_compute[238822]: 2025-09-30 18:59:07.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:59:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:07 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:07 compute-1 podman[318327]: 2025-09-30 18:59:07.54800399 +0000 UTC m=+0.090566058 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Sep 30 18:59:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:59:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:08.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:08 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:08.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:08 compute-1 ceph-mon[75484]: pgmap v2430: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:09 compute-1 nova_compute[238822]: 2025-09-30 18:59:09.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:59:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:09 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:59:09 compute-1 nova_compute[238822]: 2025-09-30 18:59:09.578 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:59:09 compute-1 nova_compute[238822]: 2025-09-30 18:59:09.578 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:59:09 compute-1 nova_compute[238822]: 2025-09-30 18:59:09.578 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:59:09 compute-1 nova_compute[238822]: 2025-09-30 18:59:09.579 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 18:59:09 compute-1 nova_compute[238822]: 2025-09-30 18:59:09.579 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:59:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:59:10 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3989022614' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:59:10 compute-1 nova_compute[238822]: 2025-09-30 18:59:10.056 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:59:10 compute-1 nova_compute[238822]: 2025-09-30 18:59:10.269 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 18:59:10 compute-1 nova_compute[238822]: 2025-09-30 18:59:10.270 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:59:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:10 compute-1 nova_compute[238822]: 2025-09-30 18:59:10.315 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:59:10 compute-1 nova_compute[238822]: 2025-09-30 18:59:10.316 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4562MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 18:59:10 compute-1 nova_compute[238822]: 2025-09-30 18:59:10.317 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:59:10 compute-1 nova_compute[238822]: 2025-09-30 18:59:10.317 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:59:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:10.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:10 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:10 compute-1 sshd-session[318369]: Invalid user ubuntu from 161.132.50.17 port 43824
Sep 30 18:59:10 compute-1 sshd-session[318369]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:59:10 compute-1 sshd-session[318369]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 18:59:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:10.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:10 compute-1 ceph-mon[75484]: pgmap v2431: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:59:10 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3989022614' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:59:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:11 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:11 compute-1 nova_compute[238822]: 2025-09-30 18:59:11.371 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 18:59:11 compute-1 nova_compute[238822]: 2025-09-30 18:59:11.372 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:59:10 up  4:36,  0 user,  load average: 0.31, 0.62, 0.56\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 18:59:11 compute-1 nova_compute[238822]: 2025-09-30 18:59:11.385 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 18:59:11 compute-1 sshd-session[318373]: Invalid user public from 49.49.32.245 port 36534
Sep 30 18:59:11 compute-1 sshd-session[318373]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:59:11 compute-1 sshd-session[318373]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245
Sep 30 18:59:11 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 18:59:11 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3138273325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:59:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3138273325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:59:11 compute-1 nova_compute[238822]: 2025-09-30 18:59:11.851 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 18:59:11 compute-1 nova_compute[238822]: 2025-09-30 18:59:11.858 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 18:59:12 compute-1 nova_compute[238822]: 2025-09-30 18:59:12.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:59:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:12 compute-1 sshd-session[318369]: Failed password for invalid user ubuntu from 161.132.50.17 port 43824 ssh2
Sep 30 18:59:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:12 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:12 compute-1 nova_compute[238822]: 2025-09-30 18:59:12.367 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 18:59:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:12.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:12.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:12 compute-1 ceph-mon[75484]: pgmap v2432: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:12 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/635788212' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:59:12 compute-1 nova_compute[238822]: 2025-09-30 18:59:12.882 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 18:59:12 compute-1 nova_compute[238822]: 2025-09-30 18:59:12.883 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.566s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:59:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:13 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:13 compute-1 podman[318404]: 2025-09-30 18:59:13.568997242 +0000 UTC m=+0.104644618 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 18:59:13 compute-1 podman[318406]: 2025-09-30 18:59:13.573908285 +0000 UTC m=+0.099980543 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 18:59:13 compute-1 podman[318405]: 2025-09-30 18:59:13.608188232 +0000 UTC m=+0.138750081 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm, io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Sep 30 18:59:13 compute-1 sshd-session[318373]: Failed password for invalid user public from 49.49.32.245 port 36534 ssh2
Sep 30 18:59:13 compute-1 sshd-session[318400]: Invalid user public from 192.210.160.141 port 56592
Sep 30 18:59:13 compute-1 sshd-session[318400]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:59:13 compute-1 sshd-session[318400]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 18:59:13 compute-1 sshd-session[318369]: Received disconnect from 161.132.50.17 port 43824:11: Bye Bye [preauth]
Sep 30 18:59:13 compute-1 sshd-session[318369]: Disconnected from invalid user ubuntu 161.132.50.17 port 43824 [preauth]
Sep 30 18:59:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:14 compute-1 sshd-session[318462]: Invalid user sharp from 8.243.64.201 port 36360
Sep 30 18:59:14 compute-1 sshd-session[318462]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:59:14 compute-1 sshd-session[318462]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 18:59:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:14 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:14.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:59:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:14.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:14 compute-1 ceph-mon[75484]: pgmap v2433: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:14 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3855943758' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 18:59:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:15 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:15 compute-1 sshd-session[318400]: Failed password for invalid user public from 192.210.160.141 port 56592 ssh2
Sep 30 18:59:15 compute-1 sshd-session[318373]: Received disconnect from 49.49.32.245 port 36534:11: Bye Bye [preauth]
Sep 30 18:59:15 compute-1 sshd-session[318373]: Disconnected from invalid user public 49.49.32.245 port 36534 [preauth]
Sep 30 18:59:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:16 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:16.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:16 compute-1 sshd-session[318462]: Failed password for invalid user sharp from 8.243.64.201 port 36360 ssh2
Sep 30 18:59:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:16.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:16 compute-1 nova_compute[238822]: 2025-09-30 18:59:16.883 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:59:16 compute-1 nova_compute[238822]: 2025-09-30 18:59:16.884 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:59:16 compute-1 ceph-mon[75484]: pgmap v2434: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:59:17 compute-1 nova_compute[238822]: 2025-09-30 18:59:17.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:59:17 compute-1 nova_compute[238822]: 2025-09-30 18:59:17.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:59:17 compute-1 nova_compute[238822]: 2025-09-30 18:59:17.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:59:17 compute-1 nova_compute[238822]: 2025-09-30 18:59:17.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:59:17 compute-1 nova_compute[238822]: 2025-09-30 18:59:17.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:17 compute-1 nova_compute[238822]: 2025-09-30 18:59:17.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:59:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:17 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:17 compute-1 sshd-session[318462]: Received disconnect from 8.243.64.201 port 36360:11: Bye Bye [preauth]
Sep 30 18:59:17 compute-1 sshd-session[318462]: Disconnected from invalid user sharp 8.243.64.201 port 36360 [preauth]
Sep 30 18:59:18 compute-1 sshd-session[318400]: Connection closed by invalid user public 192.210.160.141 port 56592 [preauth]
Sep 30 18:59:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:18 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:18.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:18.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:18 compute-1 ceph-mon[75484]: pgmap v2435: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:19 compute-1 nova_compute[238822]: 2025-09-30 18:59:19.054 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:59:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:19 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:59:19 compute-1 openstack_network_exporter[251957]: ERROR   18:59:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:59:19 compute-1 openstack_network_exporter[251957]: ERROR   18:59:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:59:19 compute-1 openstack_network_exporter[251957]: ERROR   18:59:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:59:19 compute-1 openstack_network_exporter[251957]: ERROR   18:59:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:59:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:59:19 compute-1 openstack_network_exporter[251957]: ERROR   18:59:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:59:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:59:19 compute-1 ceph-mon[75484]: pgmap v2436: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:59:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:20 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:20.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:20.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:21 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:22 compute-1 nova_compute[238822]: 2025-09-30 18:59:22.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:59:22 compute-1 nova_compute[238822]: 2025-09-30 18:59:22.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:59:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:22 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:22.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:22 compute-1 ceph-mon[75484]: pgmap v2437: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:59:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:22.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:23 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:24 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:24.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:59:24 compute-1 ceph-mon[75484]: pgmap v2438: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:59:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:24.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:59:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:25 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:25 compute-1 sudo[318476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:59:25 compute-1 sudo[318476]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:59:25 compute-1 sudo[318476]: pam_unix(sudo:session): session closed for user root
Sep 30 18:59:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:26 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:26.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:26 compute-1 ceph-mon[75484]: pgmap v2439: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:59:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:26.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:27 compute-1 nova_compute[238822]: 2025-09-30 18:59:27.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:27 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:28 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:28.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:28 compute-1 ceph-mon[75484]: pgmap v2440: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:28.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:29 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:59:30 compute-1 nova_compute[238822]: 2025-09-30 18:59:30.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 18:59:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:30 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:30.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:30 compute-1 ceph-mon[75484]: pgmap v2441: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:59:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:30.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:31 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:32 compute-1 nova_compute[238822]: 2025-09-30 18:59:32.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:59:32 compute-1 nova_compute[238822]: 2025-09-30 18:59:32.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:59:32 compute-1 nova_compute[238822]: 2025-09-30 18:59:32.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:59:32 compute-1 nova_compute[238822]: 2025-09-30 18:59:32.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:59:32 compute-1 nova_compute[238822]: 2025-09-30 18:59:32.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:32 compute-1 nova_compute[238822]: 2025-09-30 18:59:32.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:59:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:32 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:32.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:32.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:32 compute-1 ceph-mon[75484]: pgmap v2442: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:33 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:34 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:59:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:34.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:34.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:34 compute-1 ceph-mon[75484]: pgmap v2443: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:35 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:35 compute-1 podman[318512]: 2025-09-30 18:59:35.552322912 +0000 UTC m=+0.086143099 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 18:59:35 compute-1 podman[318511]: 2025-09-30 18:59:35.615982643 +0000 UTC m=+0.152900103 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 18:59:35 compute-1 podman[249638]: time="2025-09-30T18:59:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 18:59:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:59:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 18:59:35 compute-1 podman[249638]: @ - - [30/Sep/2025:18:59:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8368 "" "Go-http-client/1.1"
Sep 30 18:59:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:36 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 18:59:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:36.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 18:59:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:36.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:36 compute-1 ceph-mon[75484]: pgmap v2444: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:59:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/715195180' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:59:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/715195180' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:59:37 compute-1 nova_compute[238822]: 2025-09-30 18:59:37.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 18:59:37 compute-1 nova_compute[238822]: 2025-09-30 18:59:37.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:37 compute-1 nova_compute[238822]: 2025-09-30 18:59:37.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 18:59:37 compute-1 nova_compute[238822]: 2025-09-30 18:59:37.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:59:37 compute-1 nova_compute[238822]: 2025-09-30 18:59:37.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 18:59:37 compute-1 nova_compute[238822]: 2025-09-30 18:59:37.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:37 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:59:38 compute-1 sshd-session[318560]: Invalid user foundry from 103.153.190.105 port 51319
Sep 30 18:59:38 compute-1 sshd-session[318560]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 18:59:38 compute-1 sshd-session[318560]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105
Sep 30 18:59:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:38 compute-1 podman[318564]: 2025-09-30 18:59:38.358721972 +0000 UTC m=+0.076576021 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 18:59:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:38 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:38.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:38.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:38 compute-1 ceph-mon[75484]: pgmap v2445: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:39 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:59:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:40 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:40.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:40 compute-1 sshd-session[318560]: Failed password for invalid user foundry from 103.153.190.105 port 51319 ssh2
Sep 30 18:59:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:40.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:41 compute-1 ceph-mon[75484]: pgmap v2446: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:59:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:41 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:41 compute-1 unix_chkpwd[318589]: password check failed for user (root)
Sep 30 18:59:41 compute-1 sshd-session[318585]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 18:59:42 compute-1 sshd-session[318560]: Received disconnect from 103.153.190.105 port 51319:11: Bye Bye [preauth]
Sep 30 18:59:42 compute-1 sshd-session[318560]: Disconnected from invalid user foundry 103.153.190.105 port 51319 [preauth]
Sep 30 18:59:42 compute-1 ceph-mon[75484]: pgmap v2447: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:42 compute-1 nova_compute[238822]: 2025-09-30 18:59:42.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:42 compute-1 nova_compute[238822]: 2025-09-30 18:59:42.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:42 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:42.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:42.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:43 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:44 compute-1 sshd-session[318585]: Failed password for root from 192.210.160.141 port 56182 ssh2
Sep 30 18:59:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:44 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:59:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:44.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:44 compute-1 podman[318593]: 2025-09-30 18:59:44.58451467 +0000 UTC m=+0.121219967 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 18:59:44 compute-1 podman[318594]: 2025-09-30 18:59:44.589880805 +0000 UTC m=+0.120907528 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, distribution-scope=public, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 18:59:44 compute-1 podman[318595]: 2025-09-30 18:59:44.600706218 +0000 UTC m=+0.125987706 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 18:59:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:44.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:44 compute-1 ceph-mon[75484]: pgmap v2448: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 18:59:44 compute-1 sudo[318649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:59:44 compute-1 sudo[318649]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:59:45 compute-1 sudo[318649]: pam_unix(sudo:session): session closed for user root
Sep 30 18:59:45 compute-1 sudo[318675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Sep 30 18:59:45 compute-1 sudo[318675]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:59:45 compute-1 sshd-session[318585]: Connection closed by authenticating user root 192.210.160.141 port 56182 [preauth]
Sep 30 18:59:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:45 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:45 compute-1 sudo[318675]: pam_unix(sudo:session): session closed for user root
Sep 30 18:59:45 compute-1 sudo[318721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 18:59:45 compute-1 sudo[318721]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:59:45 compute-1 sudo[318721]: pam_unix(sudo:session): session closed for user root
Sep 30 18:59:45 compute-1 sudo[318746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 18:59:45 compute-1 sudo[318746]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:59:46 compute-1 sudo[318778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 18:59:46 compute-1 sudo[318778]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:59:46 compute-1 sudo[318778]: pam_unix(sudo:session): session closed for user root
Sep 30 18:59:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:46 compute-1 sudo[318746]: pam_unix(sudo:session): session closed for user root
Sep 30 18:59:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:46 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:46.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:46 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:59:46 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:59:46 compute-1 ceph-mon[75484]: pgmap v2449: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 18:59:46 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:59:46 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:59:46 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:59:46 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 18:59:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:46.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:47 compute-1 nova_compute[238822]: 2025-09-30 18:59:47.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:47 compute-1 nova_compute[238822]: 2025-09-30 18:59:47.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:47 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:47 compute-1 ceph-mon[75484]: pgmap v2450: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 564 B/s rd, 0 op/s
Sep 30 18:59:47 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:59:47 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:59:47 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 18:59:47 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 18:59:47 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 18:59:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:48 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:48.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:48.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:49 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:59:49 compute-1 openstack_network_exporter[251957]: ERROR   18:59:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:59:49 compute-1 openstack_network_exporter[251957]: ERROR   18:59:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 18:59:49 compute-1 openstack_network_exporter[251957]: ERROR   18:59:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 18:59:49 compute-1 openstack_network_exporter[251957]: ERROR   18:59:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 18:59:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:59:49 compute-1 openstack_network_exporter[251957]: ERROR   18:59:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 18:59:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 18:59:49 compute-1 ceph-mon[75484]: pgmap v2451: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 564 B/s rd, 0 op/s
Sep 30 18:59:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:50 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:50.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:50.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:51 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:51 compute-1 ceph-mon[75484]: pgmap v2452: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 564 B/s rd, 0 op/s
Sep 30 18:59:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:59:51 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 18:59:51 compute-1 sudo[318835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 18:59:51 compute-1 sudo[318835]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 18:59:51 compute-1 sudo[318835]: pam_unix(sudo:session): session closed for user root
Sep 30 18:59:52 compute-1 nova_compute[238822]: 2025-09-30 18:59:52.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:52 compute-1 nova_compute[238822]: 2025-09-30 18:59:52.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:52 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 18:59:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:52.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 18:59:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 18:59:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:52.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:53 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:53 compute-1 ceph-mon[75484]: pgmap v2453: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 564 B/s rd, 0 op/s
Sep 30 18:59:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:54 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:59:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:54.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:59:54.455 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 18:59:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:59:54.459 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 18:59:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 18:59:54.459 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 18:59:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:54.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:55 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:55 compute-1 ceph-mon[75484]: pgmap v2454: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 847 B/s rd, 0 op/s
Sep 30 18:59:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:56 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 18:59:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:56.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 18:59:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:56.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:57 compute-1 nova_compute[238822]: 2025-09-30 18:59:57.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:57 compute-1 nova_compute[238822]: 2025-09-30 18:59:57.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 18:59:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:57 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:57 compute-1 ceph-mon[75484]: pgmap v2455: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 564 B/s rd, 0 op/s
Sep 30 18:59:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:58 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:18:59:58.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1234171129' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 18:59:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1234171129' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 18:59:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 18:59:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 18:59:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:18:59:58.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 18:59:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 18:59:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 18:59:59 2025: (VI_0) received an invalid passwd!
Sep 30 18:59:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 18:59:59 compute-1 ceph-mon[75484]: pgmap v2456: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:00:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:00 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:00 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:00.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:00 compute-1 ceph-mon[75484]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 1 failed cephadm daemon(s)
Sep 30 19:00:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:00.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:01 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:01 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:01 compute-1 ceph-mon[75484]: pgmap v2457: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:00:02 compute-1 nova_compute[238822]: 2025-09-30 19:00:02.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:02 compute-1 nova_compute[238822]: 2025-09-30 19:00:02.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:02 compute-1 nova_compute[238822]: 2025-09-30 19:00:02.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:00:02 compute-1 nova_compute[238822]: 2025-09-30 19:00:02.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:00:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:02 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:02 compute-1 nova_compute[238822]: 2025-09-30 19:00:02.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:00:02 compute-1 nova_compute[238822]: 2025-09-30 19:00:02.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:00:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:02 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:02.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:02.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:03 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:03 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:03 compute-1 ceph-mon[75484]: pgmap v2458: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:00:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:04 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:04 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:00:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:04.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:04.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:05 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:05 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:05 compute-1 ceph-mon[75484]: pgmap v2459: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:00:05 compute-1 podman[249638]: time="2025-09-30T19:00:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 19:00:05 compute-1 podman[249638]: @ - - [30/Sep/2025:19:00:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 19:00:05 compute-1 podman[249638]: @ - - [30/Sep/2025:19:00:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8367 "" "Go-http-client/1.1"
Sep 30 19:00:06 compute-1 nova_compute[238822]: 2025-09-30 19:00:06.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:06 compute-1 nova_compute[238822]: 2025-09-30 19:00:06.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 19:00:06 compute-1 sudo[318875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 19:00:06 compute-1 sudo[318875]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:00:06 compute-1 sudo[318875]: pam_unix(sudo:session): session closed for user root
Sep 30 19:00:06 compute-1 podman[318900]: 2025-09-30 19:00:06.28798486 +0000 UTC m=+0.097293991 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 19:00:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:06 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:06 compute-1 podman[318899]: 2025-09-30 19:00:06.330250832 +0000 UTC m=+0.154807265 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller)
Sep 30 19:00:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:06 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:06.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:06.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:07 compute-1 nova_compute[238822]: 2025-09-30 19:00:07.061 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:07 compute-1 nova_compute[238822]: 2025-09-30 19:00:07.064 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:07 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:07 compute-1 nova_compute[238822]: 2025-09-30 19:00:07.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:07 compute-1 nova_compute[238822]: 2025-09-30 19:00:07.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:00:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:07 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:07 compute-1 ceph-mon[75484]: pgmap v2460: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:00:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:00:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:08 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:08 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:08.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:08 compute-1 podman[318955]: 2025-09-30 19:00:08.544043007 +0000 UTC m=+0.085530102 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Sep 30 19:00:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:08.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:09 compute-1 nova_compute[238822]: 2025-09-30 19:00:09.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:09 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:09 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:00:09 compute-1 unix_chkpwd[318976]: password check failed for user (root)
Sep 30 19:00:09 compute-1 sshd-session[318952]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 19:00:09 compute-1 ceph-mon[75484]: pgmap v2461: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:00:10 compute-1 nova_compute[238822]: 2025-09-30 19:00:10.062 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:10 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:10 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 19:00:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:10.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 19:00:10 compute-1 nova_compute[238822]: 2025-09-30 19:00:10.585 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:00:10 compute-1 nova_compute[238822]: 2025-09-30 19:00:10.586 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:00:10 compute-1 nova_compute[238822]: 2025-09-30 19:00:10.586 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 19:00:10 compute-1 nova_compute[238822]: 2025-09-30 19:00:10.587 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 19:00:10 compute-1 nova_compute[238822]: 2025-09-30 19:00:10.587 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:00:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:10.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 19:00:10 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3259496954' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:00:11 compute-1 nova_compute[238822]: 2025-09-30 19:00:10.999 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:00:11 compute-1 nova_compute[238822]: 2025-09-30 19:00:11.181 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 19:00:11 compute-1 nova_compute[238822]: 2025-09-30 19:00:11.183 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:00:11 compute-1 nova_compute[238822]: 2025-09-30 19:00:11.212 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:00:11 compute-1 nova_compute[238822]: 2025-09-30 19:00:11.213 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4544MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 19:00:11 compute-1 nova_compute[238822]: 2025-09-30 19:00:11.213 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:00:11 compute-1 nova_compute[238822]: 2025-09-30 19:00:11.214 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:00:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:11 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:11 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:11 compute-1 sshd-session[318952]: Failed password for root from 192.210.160.141 port 36110 ssh2
Sep 30 19:00:11 compute-1 ceph-mon[75484]: pgmap v2462: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:00:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3259496954' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:00:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:12 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:12 compute-1 nova_compute[238822]: 2025-09-30 19:00:12.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:12 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:12.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:12 compute-1 nova_compute[238822]: 2025-09-30 19:00:12.473 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 19:00:12 compute-1 nova_compute[238822]: 2025-09-30 19:00:12.473 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:00:11 up  4:37,  0 user,  load average: 0.40, 0.58, 0.55\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 19:00:12 compute-1 nova_compute[238822]: 2025-09-30 19:00:12.491 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:00:12 compute-1 sshd-session[318952]: Connection closed by authenticating user root 192.210.160.141 port 36110 [preauth]
Sep 30 19:00:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:12.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 19:00:13 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2437263130' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:00:13 compute-1 nova_compute[238822]: 2025-09-30 19:00:13.054 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.563s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:00:13 compute-1 nova_compute[238822]: 2025-09-30 19:00:13.062 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 19:00:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:13 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:13 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:13 compute-1 nova_compute[238822]: 2025-09-30 19:00:13.624 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 19:00:13 compute-1 ceph-mon[75484]: pgmap v2463: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:00:13 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2437263130' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:00:14 compute-1 nova_compute[238822]: 2025-09-30 19:00:14.140 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 19:00:14 compute-1 nova_compute[238822]: 2025-09-30 19:00:14.141 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.928s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 19:00:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:14 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:14 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:00:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:14.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:14 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2845511336' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:00:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:14.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:15 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:15 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:15 compute-1 podman[319030]: 2025-09-30 19:00:15.592423804 +0000 UTC m=+0.106905070 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 19:00:15 compute-1 podman[319029]: 2025-09-30 19:00:15.595037214 +0000 UTC m=+0.115159753 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6)
Sep 30 19:00:15 compute-1 podman[319028]: 2025-09-30 19:00:15.626095444 +0000 UTC m=+0.147934089 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 19:00:15 compute-1 ceph-mon[75484]: pgmap v2464: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 8.7 KiB/s rd, 0 B/s wr, 14 op/s
Sep 30 19:00:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:16 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:16 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:16.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:16 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/741081939' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:00:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:16.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:17 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:17 compute-1 nova_compute[238822]: 2025-09-30 19:00:17.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:17 compute-1 nova_compute[238822]: 2025-09-30 19:00:17.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:17 compute-1 nova_compute[238822]: 2025-09-30 19:00:17.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:00:17 compute-1 nova_compute[238822]: 2025-09-30 19:00:17.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:00:17 compute-1 nova_compute[238822]: 2025-09-30 19:00:17.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:00:17 compute-1 nova_compute[238822]: 2025-09-30 19:00:17.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:00:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:17 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:17 compute-1 ceph-mon[75484]: pgmap v2465: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 8.5 KiB/s rd, 0 B/s wr, 13 op/s
Sep 30 19:00:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:18 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:18 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:18.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:18.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:19 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:19 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:00:19 compute-1 openstack_network_exporter[251957]: ERROR   19:00:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 19:00:19 compute-1 openstack_network_exporter[251957]: ERROR   19:00:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:00:19 compute-1 openstack_network_exporter[251957]: ERROR   19:00:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:00:19 compute-1 openstack_network_exporter[251957]: ERROR   19:00:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 19:00:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:00:19 compute-1 openstack_network_exporter[251957]: ERROR   19:00:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 19:00:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:00:19 compute-1 ceph-mon[75484]: pgmap v2466: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 41 KiB/s rd, 0 B/s wr, 68 op/s
Sep 30 19:00:20 compute-1 nova_compute[238822]: 2025-09-30 19:00:20.136 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:20 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:20 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:20.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:20.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:21 compute-1 nova_compute[238822]: 2025-09-30 19:00:21.052 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:21 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:21 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:21 compute-1 sshd-session[319090]: Invalid user geoserver from 49.49.32.245 port 59966
Sep 30 19:00:21 compute-1 sshd-session[319090]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 19:00:21 compute-1 sshd-session[319090]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245
Sep 30 19:00:21 compute-1 ceph-mon[75484]: pgmap v2467: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 72 KiB/s rd, 0 B/s wr, 119 op/s
Sep 30 19:00:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:22 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:22 compute-1 nova_compute[238822]: 2025-09-30 19:00:22.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:22 compute-1 nova_compute[238822]: 2025-09-30 19:00:22.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:22 compute-1 nova_compute[238822]: 2025-09-30 19:00:22.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:00:22 compute-1 nova_compute[238822]: 2025-09-30 19:00:22.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:00:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:22 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:22 compute-1 nova_compute[238822]: 2025-09-30 19:00:22.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:00:22 compute-1 nova_compute[238822]: 2025-09-30 19:00:22.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:00:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:22.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:22 compute-1 sshd-session[319094]: Invalid user client from 161.132.50.17 port 60758
Sep 30 19:00:22 compute-1 sshd-session[319094]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 19:00:22 compute-1 sshd-session[319094]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 19:00:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:00:22 compute-1 unix_chkpwd[319099]: password check failed for user (root)
Sep 30 19:00:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:22.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:22 compute-1 sshd-session[319096]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201  user=root
Sep 30 19:00:23 compute-1 nova_compute[238822]: 2025-09-30 19:00:23.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:23 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:23 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:23 compute-1 sshd-session[319090]: Failed password for invalid user geoserver from 49.49.32.245 port 59966 ssh2
Sep 30 19:00:23 compute-1 ceph-mon[75484]: pgmap v2468: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 72 KiB/s rd, 0 B/s wr, 119 op/s
Sep 30 19:00:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:24 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:24 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:00:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:24.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:24 compute-1 sshd-session[319090]: Received disconnect from 49.49.32.245 port 59966:11: Bye Bye [preauth]
Sep 30 19:00:24 compute-1 sshd-session[319090]: Disconnected from invalid user geoserver 49.49.32.245 port 59966 [preauth]
Sep 30 19:00:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:24.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:25 compute-1 sshd-session[319094]: Failed password for invalid user client from 161.132.50.17 port 60758 ssh2
Sep 30 19:00:25 compute-1 sshd-session[319094]: Received disconnect from 161.132.50.17 port 60758:11: Bye Bye [preauth]
Sep 30 19:00:25 compute-1 sshd-session[319094]: Disconnected from invalid user client 161.132.50.17 port 60758 [preauth]
Sep 30 19:00:25 compute-1 podman[249638]: time="2025-09-30T19:00:25Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 19:00:25 compute-1 podman[249638]: @ - - [30/Sep/2025:19:00:25 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 37443 "" "Go-http-client/1.1"
Sep 30 19:00:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:25 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:25 compute-1 sshd-session[319096]: Failed password for root from 8.243.64.201 port 40864 ssh2
Sep 30 19:00:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:25 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:25 compute-1 sshd-session[319096]: Received disconnect from 8.243.64.201 port 40864:11: Bye Bye [preauth]
Sep 30 19:00:25 compute-1 sshd-session[319096]: Disconnected from authenticating user root 8.243.64.201 port 40864 [preauth]
Sep 30 19:00:25 compute-1 ceph-mon[75484]: pgmap v2469: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 72 KiB/s rd, 0 B/s wr, 119 op/s
Sep 30 19:00:26 compute-1 sudo[319103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 19:00:26 compute-1 sudo[319103]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:00:26 compute-1 sudo[319103]: pam_unix(sudo:session): session closed for user root
Sep 30 19:00:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:26 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:26 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:26.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:26.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:27 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:27 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:27 compute-1 nova_compute[238822]: 2025-09-30 19:00:27.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:27 compute-1 ceph-mon[75484]: pgmap v2470: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 64 KiB/s rd, 0 B/s wr, 106 op/s
Sep 30 19:00:28 compute-1 nova_compute[238822]: 2025-09-30 19:00:28.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:00:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:00:28.222 144543 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:d4:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '1e:91:34:f3:56:33'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 19:00:28 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:00:28.223 144543 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 19:00:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:28 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:28 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:28.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:28.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:28 compute-1 nova_compute[238822]: 2025-09-30 19:00:28.858 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:29 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:29 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:00:29 compute-1 ceph-mon[75484]: pgmap v2471: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 64 KiB/s rd, 0 B/s wr, 106 op/s
Sep 30 19:00:30 compute-1 nova_compute[238822]: 2025-09-30 19:00:30.060 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:30 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:30 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:30.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:30.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:31 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:31 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:31 compute-1 ceph-mon[75484]: pgmap v2472: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 31 KiB/s rd, 0 B/s wr, 51 op/s
Sep 30 19:00:32 compute-1 nova_compute[238822]: 2025-09-30 19:00:32.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:32 compute-1 nova_compute[238822]: 2025-09-30 19:00:32.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 19:00:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:32 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:32 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:32 compute-1 nova_compute[238822]: 2025-09-30 19:00:32.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:00:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:32.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:32.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:33 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:33 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:33 compute-1 ceph-mon[75484]: pgmap v2473: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:00:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:34 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:34 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:00:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:34.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:34.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:35 compute-1 nova_compute[238822]: 2025-09-30 19:00:35.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:35 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:35 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:35 compute-1 podman[249638]: time="2025-09-30T19:00:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 19:00:35 compute-1 podman[249638]: @ - - [30/Sep/2025:19:00:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 19:00:35 compute-1 podman[249638]: @ - - [30/Sep/2025:19:00:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8372 "" "Go-http-client/1.1"
Sep 30 19:00:35 compute-1 ceph-mon[75484]: pgmap v2474: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:00:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:36 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:36 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:36 compute-1 unix_chkpwd[319142]: password check failed for user (root)
Sep 30 19:00:36 compute-1 sshd-session[319138]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 19:00:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:36.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:36 compute-1 podman[319144]: 2025-09-30 19:00:36.557336093 +0000 UTC m=+0.090870577 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 19:00:36 compute-1 podman[319143]: 2025-09-30 19:00:36.615974307 +0000 UTC m=+0.148579436 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 19:00:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:36.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3465137377' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 19:00:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3465137377' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 19:00:37 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:00:37.225 144543 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=81ab3fff-d6d4-4262-9f24-1b212876e52c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 19:00:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:37 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:37 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:37 compute-1 nova_compute[238822]: 2025-09-30 19:00:37.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:00:37 compute-1 ceph-mon[75484]: pgmap v2475: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:00:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:00:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:38 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:38 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:38 compute-1 sshd-session[319138]: Failed password for root from 192.210.160.141 port 60294 ssh2
Sep 30 19:00:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:38.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:38.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:38 compute-1 ceph-mon[75484]: pgmap v2476: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:00:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:39 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:39 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:00:39 compute-1 sshd-session[319138]: Connection closed by authenticating user root 192.210.160.141 port 60294 [preauth]
Sep 30 19:00:39 compute-1 podman[319196]: 2025-09-30 19:00:39.551801236 +0000 UTC m=+0.095555643 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 19:00:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:40 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:40 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:40.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:40.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:41 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:41 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:41 compute-1 ceph-mon[75484]: pgmap v2477: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:00:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:42 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:42 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:42 compute-1 nova_compute[238822]: 2025-09-30 19:00:42.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:42 compute-1 nova_compute[238822]: 2025-09-30 19:00:42.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:42 compute-1 nova_compute[238822]: 2025-09-30 19:00:42.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:00:42 compute-1 nova_compute[238822]: 2025-09-30 19:00:42.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:00:42 compute-1 nova_compute[238822]: 2025-09-30 19:00:42.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:00:42 compute-1 nova_compute[238822]: 2025-09-30 19:00:42.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:00:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:42.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:42 compute-1 nova_compute[238822]: 2025-09-30 19:00:42.566 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:42 compute-1 nova_compute[238822]: 2025-09-30 19:00:42.566 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 19:00:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:42.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:43 compute-1 nova_compute[238822]: 2025-09-30 19:00:43.074 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 19:00:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:43 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:43 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:43 compute-1 ceph-mon[75484]: pgmap v2478: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:00:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:44 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:44 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:00:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:44.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:44.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:45 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:45 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:45 compute-1 ceph-mon[75484]: pgmap v2479: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:00:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:46 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:46 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:46 compute-1 sudo[319223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 19:00:46 compute-1 sudo[319223]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:00:46 compute-1 sudo[319223]: pam_unix(sudo:session): session closed for user root
Sep 30 19:00:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:46.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:46 compute-1 podman[319247]: 2025-09-30 19:00:46.531481167 +0000 UTC m=+0.077597308 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 19:00:46 compute-1 podman[319248]: 2025-09-30 19:00:46.551122628 +0000 UTC m=+0.088546344 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 19:00:46 compute-1 podman[319249]: 2025-09-30 19:00:46.567963353 +0000 UTC m=+0.092783619 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, container_name=multipathd, org.label-schema.license=GPLv2)
Sep 30 19:00:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:46.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:47 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:47 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:47 compute-1 nova_compute[238822]: 2025-09-30 19:00:47.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:47 compute-1 nova_compute[238822]: 2025-09-30 19:00:47.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:47 compute-1 nova_compute[238822]: 2025-09-30 19:00:47.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:00:47 compute-1 nova_compute[238822]: 2025-09-30 19:00:47.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:00:47 compute-1 nova_compute[238822]: 2025-09-30 19:00:47.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:00:47 compute-1 nova_compute[238822]: 2025-09-30 19:00:47.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:00:47 compute-1 ceph-mon[75484]: pgmap v2480: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:00:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:48 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:48 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:48.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:48.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:49 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:49 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:00:49 compute-1 openstack_network_exporter[251957]: ERROR   19:00:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 19:00:49 compute-1 openstack_network_exporter[251957]: ERROR   19:00:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:00:49 compute-1 openstack_network_exporter[251957]: ERROR   19:00:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:00:49 compute-1 openstack_network_exporter[251957]: ERROR   19:00:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 19:00:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:00:49 compute-1 openstack_network_exporter[251957]: ERROR   19:00:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 19:00:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:00:49 compute-1 ceph-mon[75484]: pgmap v2481: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:00:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:50 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:50 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:50.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:50.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:51 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:51 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:51 compute-1 ceph-mon[75484]: pgmap v2482: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:00:51 compute-1 sudo[319308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 19:00:51 compute-1 sudo[319308]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:00:51 compute-1 sudo[319308]: pam_unix(sudo:session): session closed for user root
Sep 30 19:00:51 compute-1 sudo[319333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 19:00:51 compute-1 sudo[319333]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:00:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:52 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:52 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:52 compute-1 sudo[319333]: pam_unix(sudo:session): session closed for user root
Sep 30 19:00:52 compute-1 nova_compute[238822]: 2025-09-30 19:00:52.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:00:52 compute-1 nova_compute[238822]: 2025-09-30 19:00:52.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:00:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:52.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:00:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 19:00:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 19:00:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:00:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:00:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 19:00:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 19:00:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 19:00:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:52.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:53 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:53 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:53 compute-1 ceph-mon[75484]: pgmap v2483: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:00:53 compute-1 ceph-mon[75484]: pgmap v2484: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 610 B/s rd, 0 op/s
Sep 30 19:00:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:54 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:54 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:00:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:00:54.461 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:00:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:00:54.461 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:00:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:00:54.461 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 19:00:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:54.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:54 compute-1 nova_compute[238822]: 2025-09-30 19:00:54.852 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:00:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:54.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:55 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:55 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:55 compute-1 ceph-mon[75484]: pgmap v2485: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 610 B/s rd, 0 op/s
Sep 30 19:00:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:56 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:56 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:56.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:00:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:56.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:00:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:57 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:57 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:57 compute-1 nova_compute[238822]: 2025-09-30 19:00:57.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:57 compute-1 nova_compute[238822]: 2025-09-30 19:00:57.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:00:57 compute-1 nova_compute[238822]: 2025-09-30 19:00:57.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:00:57 compute-1 nova_compute[238822]: 2025-09-30 19:00:57.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:00:57 compute-1 nova_compute[238822]: 2025-09-30 19:00:57.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:00:57 compute-1 nova_compute[238822]: 2025-09-30 19:00:57.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:00:57 compute-1 ceph-mon[75484]: pgmap v2486: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 610 B/s rd, 0 op/s
Sep 30 19:00:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/108451082' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 19:00:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/108451082' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 19:00:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:00:57 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:00:57 compute-1 sudo[319395]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 19:00:57 compute-1 sudo[319395]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:00:57 compute-1 sudo[319395]: pam_unix(sudo:session): session closed for user root
Sep 30 19:00:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:58 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:58 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:00:58.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:00:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:00:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:00:58.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:00:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:00:59 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:00:59 2025: (VI_0) received an invalid passwd!
Sep 30 19:00:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:00:59 compute-1 ceph-mon[75484]: pgmap v2487: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 610 B/s rd, 0 op/s
Sep 30 19:01:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:00 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:00 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:00.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:00.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:01 compute-1 CROND[319425]: (root) CMD (run-parts /etc/cron.hourly)
Sep 30 19:01:01 compute-1 run-parts[319428]: (/etc/cron.hourly) starting 0anacron
Sep 30 19:01:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:01 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:01 compute-1 run-parts[319434]: (/etc/cron.hourly) finished 0anacron
Sep 30 19:01:01 compute-1 CROND[319424]: (root) CMDEND (run-parts /etc/cron.hourly)
Sep 30 19:01:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:01 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:01 compute-1 ceph-mon[75484]: pgmap v2488: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 610 B/s rd, 0 op/s
Sep 30 19:01:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:02 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:02 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:02 compute-1 nova_compute[238822]: 2025-09-30 19:01:02.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:01:02 compute-1 nova_compute[238822]: 2025-09-30 19:01:02.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:01:02 compute-1 nova_compute[238822]: 2025-09-30 19:01:02.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:01:02 compute-1 nova_compute[238822]: 2025-09-30 19:01:02.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:01:02 compute-1 nova_compute[238822]: 2025-09-30 19:01:02.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:01:02 compute-1 nova_compute[238822]: 2025-09-30 19:01:02.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:01:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:02.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:02.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:03 compute-1 sshd-session[319435]: Invalid user david from 192.210.160.141 port 59634
Sep 30 19:01:03 compute-1 sshd-session[319435]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 19:01:03 compute-1 sshd-session[319435]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 19:01:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:03 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:03 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:03 compute-1 ceph-mon[75484]: pgmap v2489: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 610 B/s rd, 0 op/s
Sep 30 19:01:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:04 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:04 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:01:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:04.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:04.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:05 compute-1 sshd-session[319435]: Failed password for invalid user david from 192.210.160.141 port 59634 ssh2
Sep 30 19:01:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:05 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:05 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:05 compute-1 podman[249638]: time="2025-09-30T19:01:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 19:01:05 compute-1 podman[249638]: @ - - [30/Sep/2025:19:01:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 19:01:05 compute-1 podman[249638]: @ - - [30/Sep/2025:19:01:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8364 "" "Go-http-client/1.1"
Sep 30 19:01:05 compute-1 ceph-mon[75484]: pgmap v2490: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:01:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:06 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:06 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:06 compute-1 sudo[319442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 19:01:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:06.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:06 compute-1 sudo[319442]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:01:06 compute-1 sudo[319442]: pam_unix(sudo:session): session closed for user root
Sep 30 19:01:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:06.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:07 compute-1 nova_compute[238822]: 2025-09-30 19:01:07.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:01:07 compute-1 nova_compute[238822]: 2025-09-30 19:01:07.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 19:01:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:07 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:07 compute-1 sshd-session[319435]: Connection closed by invalid user david 192.210.160.141 port 59634 [preauth]
Sep 30 19:01:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:07 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:07 compute-1 nova_compute[238822]: 2025-09-30 19:01:07.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:01:07 compute-1 podman[319469]: 2025-09-30 19:01:07.58534661 +0000 UTC m=+0.114200587 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 19:01:07 compute-1 podman[319468]: 2025-09-30 19:01:07.605231747 +0000 UTC m=+0.136918970 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 19:01:07 compute-1 ceph-mon[75484]: pgmap v2491: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:01:08 compute-1 nova_compute[238822]: 2025-09-30 19:01:08.058 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:01:08 compute-1 nova_compute[238822]: 2025-09-30 19:01:08.059 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:01:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:08 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:08 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:08.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:08.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:09 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:09 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:01:09 compute-1 ceph-mon[75484]: pgmap v2492: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:10 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:10 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:10 compute-1 podman[319519]: 2025-09-30 19:01:10.54971242 +0000 UTC m=+0.084105414 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 19:01:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:10.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:10.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:11 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:11 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:11 compute-1 ceph-mon[75484]: pgmap v2493: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:01:12 compute-1 nova_compute[238822]: 2025-09-30 19:01:12.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:01:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:12 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:12 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:12 compute-1 nova_compute[238822]: 2025-09-30 19:01:12.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:01:12 compute-1 nova_compute[238822]: 2025-09-30 19:01:12.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:01:12 compute-1 nova_compute[238822]: 2025-09-30 19:01:12.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:01:12 compute-1 nova_compute[238822]: 2025-09-30 19:01:12.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:01:12 compute-1 nova_compute[238822]: 2025-09-30 19:01:12.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:01:12 compute-1 nova_compute[238822]: 2025-09-30 19:01:12.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:01:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:12.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:12 compute-1 nova_compute[238822]: 2025-09-30 19:01:12.585 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:01:12 compute-1 nova_compute[238822]: 2025-09-30 19:01:12.586 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:01:12 compute-1 nova_compute[238822]: 2025-09-30 19:01:12.586 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 19:01:12 compute-1 nova_compute[238822]: 2025-09-30 19:01:12.587 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 19:01:12 compute-1 nova_compute[238822]: 2025-09-30 19:01:12.587 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:01:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:12.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 19:01:13 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3842974402' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:01:13 compute-1 nova_compute[238822]: 2025-09-30 19:01:13.098 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:01:13 compute-1 nova_compute[238822]: 2025-09-30 19:01:13.310 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 19:01:13 compute-1 nova_compute[238822]: 2025-09-30 19:01:13.312 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:01:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:13 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:13 compute-1 nova_compute[238822]: 2025-09-30 19:01:13.346 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:01:13 compute-1 nova_compute[238822]: 2025-09-30 19:01:13.347 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4592MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 19:01:13 compute-1 nova_compute[238822]: 2025-09-30 19:01:13.347 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:01:13 compute-1 nova_compute[238822]: 2025-09-30 19:01:13.347 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:01:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:13 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:13 compute-1 ceph-mon[75484]: pgmap v2494: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:13 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3842974402' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:01:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:14 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:14 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:01:14 compute-1 nova_compute[238822]: 2025-09-30 19:01:14.416 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 19:01:14 compute-1 nova_compute[238822]: 2025-09-30 19:01:14.417 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:01:13 up  4:38,  0 user,  load average: 0.22, 0.50, 0.52\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 19:01:14 compute-1 nova_compute[238822]: 2025-09-30 19:01:14.434 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing inventories for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 19:01:14 compute-1 nova_compute[238822]: 2025-09-30 19:01:14.449 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating ProviderTree inventory for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 19:01:14 compute-1 nova_compute[238822]: 2025-09-30 19:01:14.450 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Updating inventory in ProviderTree for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 19:01:14 compute-1 nova_compute[238822]: 2025-09-30 19:01:14.470 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing aggregate associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 19:01:14 compute-1 nova_compute[238822]: 2025-09-30 19:01:14.499 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Refreshing trait associations for resource provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE41,HW_CPU_X86_ABM,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AESNI,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_TIS,HW_ARCH_X86_64,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 19:01:14 compute-1 nova_compute[238822]: 2025-09-30 19:01:14.520 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:01:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:14.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:14.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 19:01:14 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2711137893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:01:15 compute-1 nova_compute[238822]: 2025-09-30 19:01:15.024 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:01:15 compute-1 nova_compute[238822]: 2025-09-30 19:01:15.031 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 19:01:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:15 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:15 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:15 compute-1 nova_compute[238822]: 2025-09-30 19:01:15.541 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 19:01:15 compute-1 ceph-mon[75484]: pgmap v2495: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:01:15 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2711137893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:01:16 compute-1 nova_compute[238822]: 2025-09-30 19:01:16.054 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 19:01:16 compute-1 nova_compute[238822]: 2025-09-30 19:01:16.055 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.707s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 19:01:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:16 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:16 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:16.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:16.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:16 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1333483285' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:01:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:17 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:17 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:17 compute-1 nova_compute[238822]: 2025-09-30 19:01:17.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:01:17 compute-1 nova_compute[238822]: 2025-09-30 19:01:17.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:01:17 compute-1 nova_compute[238822]: 2025-09-30 19:01:17.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:01:17 compute-1 nova_compute[238822]: 2025-09-30 19:01:17.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:01:17 compute-1 nova_compute[238822]: 2025-09-30 19:01:17.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:01:17 compute-1 nova_compute[238822]: 2025-09-30 19:01:17.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:01:17 compute-1 podman[319591]: 2025-09-30 19:01:17.586039951 +0000 UTC m=+0.112903002 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 19:01:17 compute-1 podman[319592]: 2025-09-30 19:01:17.595684342 +0000 UTC m=+0.116984453 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Sep 30 19:01:17 compute-1 podman[319590]: 2025-09-30 19:01:17.600474711 +0000 UTC m=+0.133749295 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4)
Sep 30 19:01:17 compute-1 ceph-mon[75484]: pgmap v2496: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:18 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:18 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:18.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:18.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:18 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1862913231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:01:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:19 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:19 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:01:19 compute-1 openstack_network_exporter[251957]: ERROR   19:01:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:01:19 compute-1 openstack_network_exporter[251957]: ERROR   19:01:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 19:01:19 compute-1 openstack_network_exporter[251957]: ERROR   19:01:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:01:19 compute-1 openstack_network_exporter[251957]: ERROR   19:01:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 19:01:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:01:19 compute-1 openstack_network_exporter[251957]: ERROR   19:01:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 19:01:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:01:19 compute-1 ceph-mon[75484]: pgmap v2497: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:20 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:20 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:20.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:20.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:20 compute-1 ceph-mon[75484]: pgmap v2498: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:01:21 compute-1 nova_compute[238822]: 2025-09-30 19:01:21.054 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:01:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:21 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:21 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:22 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:22 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:01:22 compute-1 nova_compute[238822]: 2025-09-30 19:01:22.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:01:22 compute-1 nova_compute[238822]: 2025-09-30 19:01:22.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:01:22 compute-1 nova_compute[238822]: 2025-09-30 19:01:22.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:01:22 compute-1 nova_compute[238822]: 2025-09-30 19:01:22.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:01:22 compute-1 nova_compute[238822]: 2025-09-30 19:01:22.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:01:22 compute-1 nova_compute[238822]: 2025-09-30 19:01:22.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:01:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 19:01:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:22.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 19:01:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:22.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:23 compute-1 nova_compute[238822]: 2025-09-30 19:01:23.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:01:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:23 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:23 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:23 compute-1 ceph-mon[75484]: pgmap v2499: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:24 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:24 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:01:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:24.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:24.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:25 compute-1 nova_compute[238822]: 2025-09-30 19:01:25.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:01:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:25 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:25 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:25 compute-1 ceph-mon[75484]: pgmap v2500: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:01:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:26 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:26 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:26.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:26 compute-1 sudo[319656]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 19:01:26 compute-1 sudo[319656]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:01:26 compute-1 sudo[319656]: pam_unix(sudo:session): session closed for user root
Sep 30 19:01:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 19:01:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:26.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 19:01:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:27 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:27 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:27 compute-1 nova_compute[238822]: 2025-09-30 19:01:27.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:01:27 compute-1 ceph-mon[75484]: pgmap v2501: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #136. Immutable memtables: 0.
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.588871) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 85] Flushing memtable with next log file: 136
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258887588923, "job": 85, "event": "flush_started", "num_memtables": 1, "num_entries": 2338, "num_deletes": 251, "total_data_size": 5883634, "memory_usage": 5961168, "flush_reason": "Manual Compaction"}
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 85] Level-0 flush table #137: started
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258887612809, "cf_name": "default", "job": 85, "event": "table_file_creation", "file_number": 137, "file_size": 3797315, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 65342, "largest_seqno": 67675, "table_properties": {"data_size": 3788050, "index_size": 5758, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19360, "raw_average_key_size": 20, "raw_value_size": 3769368, "raw_average_value_size": 3951, "num_data_blocks": 252, "num_entries": 954, "num_filter_entries": 954, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759258677, "oldest_key_time": 1759258677, "file_creation_time": 1759258887, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 137, "seqno_to_time_mapping": "N/A"}}
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 85] Flush lasted 23997 microseconds, and 15130 cpu microseconds.
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.612871) [db/flush_job.cc:967] [default] [JOB 85] Level-0 flush table #137: 3797315 bytes OK
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.612899) [db/memtable_list.cc:519] [default] Level-0 commit table #137 started
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.614722) [db/memtable_list.cc:722] [default] Level-0 commit table #137: memtable #1 done
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.614737) EVENT_LOG_v1 {"time_micros": 1759258887614732, "job": 85, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.614757) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 85] Try to delete WAL files size 5873293, prev total WAL file size 5873293, number of live WAL files 2.
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000133.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.616591) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035373733' seq:72057594037927935, type:22 .. '7061786F730036303235' seq:0, type:0; will stop at (end)
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 86] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 85 Base level 0, inputs: [137(3708KB)], [135(12MB)]
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258887616756, "job": 86, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [137], "files_L6": [135], "score": -1, "input_data_size": 16541791, "oldest_snapshot_seqno": -1}
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 86] Generated table #138: 8691 keys, 14492665 bytes, temperature: kUnknown
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258887696806, "cf_name": "default", "job": 86, "event": "table_file_creation", "file_number": 138, "file_size": 14492665, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14438263, "index_size": 31569, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21765, "raw_key_size": 229623, "raw_average_key_size": 26, "raw_value_size": 14286787, "raw_average_value_size": 1643, "num_data_blocks": 1225, "num_entries": 8691, "num_filter_entries": 8691, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759258887, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 138, "seqno_to_time_mapping": "N/A"}}
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.697309) [db/compaction/compaction_job.cc:1663] [default] [JOB 86] Compacted 1@0 + 1@6 files to L6 => 14492665 bytes
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.698920) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 206.4 rd, 180.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 12.2 +0.0 blob) out(13.8 +0.0 blob), read-write-amplify(8.2) write-amplify(3.8) OK, records in: 9209, records dropped: 518 output_compression: NoCompression
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.698956) EVENT_LOG_v1 {"time_micros": 1759258887698940, "job": 86, "event": "compaction_finished", "compaction_time_micros": 80160, "compaction_time_cpu_micros": 46917, "output_level": 6, "num_output_files": 1, "total_output_size": 14492665, "num_input_records": 9209, "num_output_records": 8691, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000137.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258887700603, "job": 86, "event": "table_file_deletion", "file_number": 137}
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000135.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258887705976, "job": 86, "event": "table_file_deletion", "file_number": 135}
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.616379) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.706132) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.706142) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.706146) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.706149) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:01:27 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:27.706153) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:01:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:28 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:28 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:28.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:28.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:29 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:29 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:01:29 compute-1 ceph-mon[75484]: pgmap v2502: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:29 compute-1 sshd-session[319686]: Invalid user liang from 8.243.64.201 port 42502
Sep 30 19:01:29 compute-1 sshd-session[319686]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 19:01:29 compute-1 sshd-session[319686]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 19:01:30 compute-1 unix_chkpwd[319688]: password check failed for user (root)
Sep 30 19:01:30 compute-1 sshd-session[319683]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245  user=root
Sep 30 19:01:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:30 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:30 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:30.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:30.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:31 compute-1 nova_compute[238822]: 2025-09-30 19:01:31.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:01:31 compute-1 sshd-session[319683]: Failed password for root from 49.49.32.245 port 55160 ssh2
Sep 30 19:01:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:31 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:31 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #139. Immutable memtables: 0.
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.539125) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 87] Flushing memtable with next log file: 139
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258891539221, "job": 87, "event": "flush_started", "num_memtables": 1, "num_entries": 280, "num_deletes": 250, "total_data_size": 119069, "memory_usage": 125280, "flush_reason": "Manual Compaction"}
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 87] Level-0 flush table #140: started
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258891543747, "cf_name": "default", "job": 87, "event": "table_file_creation", "file_number": 140, "file_size": 77444, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 67680, "largest_seqno": 67955, "table_properties": {"data_size": 75515, "index_size": 156, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5343, "raw_average_key_size": 20, "raw_value_size": 71812, "raw_average_value_size": 270, "num_data_blocks": 7, "num_entries": 265, "num_filter_entries": 265, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759258888, "oldest_key_time": 1759258888, "file_creation_time": 1759258891, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 140, "seqno_to_time_mapping": "N/A"}}
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 87] Flush lasted 4656 microseconds, and 1540 cpu microseconds.
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.543799) [db/flush_job.cc:967] [default] [JOB 87] Level-0 flush table #140: 77444 bytes OK
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.543828) [db/memtable_list.cc:519] [default] Level-0 commit table #140 started
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.545409) [db/memtable_list.cc:722] [default] Level-0 commit table #140: memtable #1 done
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.545431) EVENT_LOG_v1 {"time_micros": 1759258891545424, "job": 87, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.545452) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 87] Try to delete WAL files size 116955, prev total WAL file size 116955, number of live WAL files 2.
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000136.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.546147) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032323537' seq:72057594037927935, type:22 .. '6D6772737461740032353038' seq:0, type:0; will stop at (end)
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 88] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 87 Base level 0, inputs: [140(75KB)], [138(13MB)]
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258891546201, "job": 88, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [140], "files_L6": [138], "score": -1, "input_data_size": 14570109, "oldest_snapshot_seqno": -1}
Sep 30 19:01:31 compute-1 sshd-session[319686]: Failed password for invalid user liang from 8.243.64.201 port 42502 ssh2
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 88] Generated table #141: 8449 keys, 10748090 bytes, temperature: kUnknown
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258891620782, "cf_name": "default", "job": 88, "event": "table_file_creation", "file_number": 141, "file_size": 10748090, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10700115, "index_size": 25718, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21189, "raw_key_size": 224781, "raw_average_key_size": 26, "raw_value_size": 10557663, "raw_average_value_size": 1249, "num_data_blocks": 980, "num_entries": 8449, "num_filter_entries": 8449, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759258891, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 141, "seqno_to_time_mapping": "N/A"}}
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.621223) [db/compaction/compaction_job.cc:1663] [default] [JOB 88] Compacted 1@0 + 1@6 files to L6 => 10748090 bytes
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.623716) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 195.0 rd, 143.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 13.8 +0.0 blob) out(10.3 +0.0 blob), read-write-amplify(326.9) write-amplify(138.8) OK, records in: 8956, records dropped: 507 output_compression: NoCompression
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.623750) EVENT_LOG_v1 {"time_micros": 1759258891623734, "job": 88, "event": "compaction_finished", "compaction_time_micros": 74713, "compaction_time_cpu_micros": 48682, "output_level": 6, "num_output_files": 1, "total_output_size": 10748090, "num_input_records": 8956, "num_output_records": 8449, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000140.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258891623949, "job": 88, "event": "table_file_deletion", "file_number": 140}
Sep 30 19:01:31 compute-1 ceph-mon[75484]: pgmap v2503: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000138.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759258891629864, "job": 88, "event": "table_file_deletion", "file_number": 138}
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.546023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.629938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.629948) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.629952) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.629955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:01:31 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:01:31.629959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:01:31 compute-1 sshd-session[319683]: Received disconnect from 49.49.32.245 port 55160:11: Bye Bye [preauth]
Sep 30 19:01:31 compute-1 sshd-session[319683]: Disconnected from authenticating user root 49.49.32.245 port 55160 [preauth]
Sep 30 19:01:32 compute-1 sshd-session[319686]: Received disconnect from 8.243.64.201 port 42502:11: Bye Bye [preauth]
Sep 30 19:01:32 compute-1 sshd-session[319686]: Disconnected from invalid user liang 8.243.64.201 port 42502 [preauth]
Sep 30 19:01:32 compute-1 sshd-session[319693]: Invalid user crystal from 161.132.50.17 port 57184
Sep 30 19:01:32 compute-1 sshd-session[319693]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 19:01:32 compute-1 sshd-session[319693]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 19:01:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:32 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:32 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:32 compute-1 unix_chkpwd[319696]: password check failed for user (root)
Sep 30 19:01:32 compute-1 sshd-session[319690]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 19:01:32 compute-1 nova_compute[238822]: 2025-09-30 19:01:32.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:01:32 compute-1 nova_compute[238822]: 2025-09-30 19:01:32.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:01:32 compute-1 nova_compute[238822]: 2025-09-30 19:01:32.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:01:32 compute-1 nova_compute[238822]: 2025-09-30 19:01:32.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:01:32 compute-1 nova_compute[238822]: 2025-09-30 19:01:32.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:01:32 compute-1 nova_compute[238822]: 2025-09-30 19:01:32.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:01:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:32.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:32.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:33 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:33 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:33 compute-1 ceph-mon[75484]: pgmap v2504: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:34 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:34 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:01:34 compute-1 sshd-session[319693]: Failed password for invalid user crystal from 161.132.50.17 port 57184 ssh2
Sep 30 19:01:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:34.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:34 compute-1 sshd-session[319690]: Failed password for root from 192.210.160.141 port 45882 ssh2
Sep 30 19:01:34 compute-1 sshd-session[319693]: Received disconnect from 161.132.50.17 port 57184:11: Bye Bye [preauth]
Sep 30 19:01:34 compute-1 sshd-session[319693]: Disconnected from invalid user crystal 161.132.50.17 port 57184 [preauth]
Sep 30 19:01:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:34.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:35 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:35 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:35 compute-1 sshd-session[319690]: Connection closed by authenticating user root 192.210.160.141 port 45882 [preauth]
Sep 30 19:01:35 compute-1 podman[249638]: time="2025-09-30T19:01:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 19:01:35 compute-1 ceph-mon[75484]: pgmap v2505: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:01:35 compute-1 podman[249638]: @ - - [30/Sep/2025:19:01:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 19:01:35 compute-1 podman[249638]: @ - - [30/Sep/2025:19:01:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8368 "" "Go-http-client/1.1"
Sep 30 19:01:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:36 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:36 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:36.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1261945660' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 19:01:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1261945660' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 19:01:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:36.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:37 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:37 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:37 compute-1 nova_compute[238822]: 2025-09-30 19:01:37.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:01:37 compute-1 ceph-mon[75484]: pgmap v2506: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:01:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:38 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:38 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:38 compute-1 podman[319704]: 2025-09-30 19:01:38.564385023 +0000 UTC m=+0.092513921 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 19:01:38 compute-1 podman[319703]: 2025-09-30 19:01:38.602681608 +0000 UTC m=+0.139909862 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 19:01:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:38.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:38.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:39 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:39 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:01:39 compute-1 ceph-mon[75484]: pgmap v2507: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:40 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:40 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:40.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:40.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:41 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:41 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:41 compute-1 podman[319753]: 2025-09-30 19:01:41.551607071 +0000 UTC m=+0.087450914 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 19:01:41 compute-1 ceph-mon[75484]: pgmap v2508: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:01:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:42 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:42 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:42 compute-1 nova_compute[238822]: 2025-09-30 19:01:42.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:01:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:42.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:42.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:43 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:43 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:43 compute-1 ceph-mon[75484]: pgmap v2509: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:44 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:44 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:01:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:44.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:44.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:45 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:45 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:45 compute-1 ceph-mon[75484]: pgmap v2510: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:01:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:46 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:46 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:46.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:46 compute-1 sudo[319778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 19:01:46 compute-1 sudo[319778]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:01:46 compute-1 sudo[319778]: pam_unix(sudo:session): session closed for user root
Sep 30 19:01:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:46.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:47 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:47 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:47 compute-1 nova_compute[238822]: 2025-09-30 19:01:47.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:01:47 compute-1 ceph-mon[75484]: pgmap v2511: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:48 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:48 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:48 compute-1 podman[319807]: 2025-09-30 19:01:48.565715275 +0000 UTC m=+0.095477101 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Sep 30 19:01:48 compute-1 podman[319806]: 2025-09-30 19:01:48.567099802 +0000 UTC m=+0.103754825 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 19:01:48 compute-1 podman[319805]: 2025-09-30 19:01:48.593087895 +0000 UTC m=+0.133866399 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=iscsid)
Sep 30 19:01:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:48.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:48.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:49 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:49 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:01:49 compute-1 openstack_network_exporter[251957]: ERROR   19:01:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:01:49 compute-1 openstack_network_exporter[251957]: ERROR   19:01:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:01:49 compute-1 openstack_network_exporter[251957]: ERROR   19:01:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 19:01:49 compute-1 openstack_network_exporter[251957]: ERROR   19:01:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 19:01:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:01:49 compute-1 openstack_network_exporter[251957]: ERROR   19:01:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 19:01:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:01:49 compute-1 ceph-mon[75484]: pgmap v2512: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:50 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:50 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:50.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:50.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:51 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:51 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:51 compute-1 ceph-mon[75484]: pgmap v2513: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:01:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:52 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:52 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:52 compute-1 nova_compute[238822]: 2025-09-30 19:01:52.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:01:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:52.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:01:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:52.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:53 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:53 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:53 compute-1 ceph-mon[75484]: pgmap v2514: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:54 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:54 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:01:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:01:54.463 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:01:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:01:54.463 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:01:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:01:54.463 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 19:01:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:54.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:54.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:55 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:55 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:55 compute-1 ceph-mon[75484]: pgmap v2515: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:01:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:56 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:56 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:01:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:56.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:01:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:56.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:57 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:57 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:57 compute-1 nova_compute[238822]: 2025-09-30 19:01:57.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:01:57 compute-1 ceph-mon[75484]: pgmap v2516: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1893448171' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 19:01:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1893448171' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 19:01:57 compute-1 sudo[319875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 19:01:57 compute-1 sudo[319875]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:01:57 compute-1 sudo[319875]: pam_unix(sudo:session): session closed for user root
Sep 30 19:01:58 compute-1 sudo[319900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 19:01:58 compute-1 sudo[319900]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:01:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:58 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:58 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:01:58.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:58 compute-1 sudo[319900]: pam_unix(sudo:session): session closed for user root
Sep 30 19:01:58 compute-1 unix_chkpwd[319958]: password check failed for user (root)
Sep 30 19:01:58 compute-1 sshd-session[319873]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 19:01:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Sep 30 19:01:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 19:01:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 19:01:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:01:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:01:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 19:01:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 19:01:58 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 19:01:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:01:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:01:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:01:58.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:01:59 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:01:59 2025: (VI_0) received an invalid passwd!
Sep 30 19:01:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:01:59 compute-1 ceph-mon[75484]: pgmap v2517: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:01:59 compute-1 ceph-mon[75484]: pgmap v2518: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 892 B/s rd, 0 op/s
Sep 30 19:02:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:00 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:00 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:00.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:00 compute-1 sshd-session[319873]: Failed password for root from 192.210.160.141 port 60572 ssh2
Sep 30 19:02:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:00.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:01 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:01 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:01 compute-1 ceph-mon[75484]: pgmap v2519: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 595 B/s rd, 0 op/s
Sep 30 19:02:01 compute-1 sshd-session[319873]: Connection closed by authenticating user root 192.210.160.141 port 60572 [preauth]
Sep 30 19:02:01 compute-1 unix_chkpwd[319964]: password check failed for user (root)
Sep 30 19:02:01 compute-1 sshd-session[319961]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.153.190.105  user=root
Sep 30 19:02:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:02 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:02 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:02 compute-1 nova_compute[238822]: 2025-09-30 19:02:02.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:02.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:02.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:03 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:03 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:03 compute-1 sudo[319967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 19:02:03 compute-1 sudo[319967]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:02:03 compute-1 sudo[319967]: pam_unix(sudo:session): session closed for user root
Sep 30 19:02:03 compute-1 sshd-session[319961]: Failed password for root from 103.153.190.105 port 33813 ssh2
Sep 30 19:02:03 compute-1 ceph-mon[75484]: pgmap v2520: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 595 B/s rd, 0 op/s
Sep 30 19:02:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:02:03 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:02:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:04 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:04 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:02:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:04.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:04 compute-1 sshd-session[319961]: Received disconnect from 103.153.190.105 port 33813:11: Bye Bye [preauth]
Sep 30 19:02:04 compute-1 sshd-session[319961]: Disconnected from authenticating user root 103.153.190.105 port 33813 [preauth]
Sep 30 19:02:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:04.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:05 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:05 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:05 compute-1 podman[249638]: time="2025-09-30T19:02:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 19:02:05 compute-1 podman[249638]: @ - - [30/Sep/2025:19:02:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 19:02:05 compute-1 podman[249638]: @ - - [30/Sep/2025:19:02:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8368 "" "Go-http-client/1.1"
Sep 30 19:02:05 compute-1 ceph-mon[75484]: pgmap v2521: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 595 B/s rd, 0 op/s
Sep 30 19:02:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:06 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:06 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:06.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:06 compute-1 sudo[319995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 19:02:06 compute-1 sudo[319995]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:02:06 compute-1 sudo[319995]: pam_unix(sudo:session): session closed for user root
Sep 30 19:02:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 19:02:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:06.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 19:02:07 compute-1 nova_compute[238822]: 2025-09-30 19:02:07.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:02:07 compute-1 nova_compute[238822]: 2025-09-30 19:02:07.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 19:02:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:07 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:07 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:07 compute-1 nova_compute[238822]: 2025-09-30 19:02:07.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:07 compute-1 nova_compute[238822]: 2025-09-30 19:02:07.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:07 compute-1 ceph-mon[75484]: pgmap v2522: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 595 B/s rd, 0 op/s
Sep 30 19:02:07 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:02:08 compute-1 nova_compute[238822]: 2025-09-30 19:02:08.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:02:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:08 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:08 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:08.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:08.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:09 compute-1 ceph-mon[75484]: pgmap v2523: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 893 B/s rd, 0 op/s
Sep 30 19:02:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:09 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:09 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:02:09 compute-1 podman[320024]: 2025-09-30 19:02:09.552435732 +0000 UTC m=+0.083763324 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 19:02:09 compute-1 podman[320023]: 2025-09-30 19:02:09.609561316 +0000 UTC m=+0.145308718 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 19:02:10 compute-1 nova_compute[238822]: 2025-09-30 19:02:10.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:02:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:10 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:10 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:10 compute-1 nova_compute[238822]: 2025-09-30 19:02:10.565 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:02:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:10.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:10.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:11 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:11 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:12 compute-1 ceph-mon[75484]: pgmap v2524: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:12 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:12 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:12 compute-1 podman[320075]: 2025-09-30 19:02:12.559074425 +0000 UTC m=+0.094387101 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 19:02:12 compute-1 nova_compute[238822]: 2025-09-30 19:02:12.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:02:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:12.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000028s ======
Sep 30 19:02:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:12.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Sep 30 19:02:13 compute-1 ceph-mon[75484]: pgmap v2525: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:13 compute-1 nova_compute[238822]: 2025-09-30 19:02:13.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:02:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:13 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:13 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:13 compute-1 nova_compute[238822]: 2025-09-30 19:02:13.576 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:02:13 compute-1 nova_compute[238822]: 2025-09-30 19:02:13.577 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:02:13 compute-1 nova_compute[238822]: 2025-09-30 19:02:13.577 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 19:02:13 compute-1 nova_compute[238822]: 2025-09-30 19:02:13.577 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 19:02:13 compute-1 nova_compute[238822]: 2025-09-30 19:02:13.578 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:02:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 19:02:14 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1699144910' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:02:14 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1699144910' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:02:14 compute-1 nova_compute[238822]: 2025-09-30 19:02:14.084 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:02:14 compute-1 nova_compute[238822]: 2025-09-30 19:02:14.313 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 19:02:14 compute-1 nova_compute[238822]: 2025-09-30 19:02:14.315 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:02:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:14 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:14 compute-1 nova_compute[238822]: 2025-09-30 19:02:14.353 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:02:14 compute-1 nova_compute[238822]: 2025-09-30 19:02:14.354 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4575MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 19:02:14 compute-1 nova_compute[238822]: 2025-09-30 19:02:14.354 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:02:14 compute-1 nova_compute[238822]: 2025-09-30 19:02:14.355 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:02:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:14 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:02:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:14.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:14.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:15 compute-1 ceph-mon[75484]: pgmap v2526: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:02:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:15 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:15 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:15 compute-1 nova_compute[238822]: 2025-09-30 19:02:15.421 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 19:02:15 compute-1 nova_compute[238822]: 2025-09-30 19:02:15.421 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:02:14 up  4:39,  0 user,  load average: 0.21, 0.45, 0.50\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 19:02:15 compute-1 nova_compute[238822]: 2025-09-30 19:02:15.448 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:02:15 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 19:02:15 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2620052722' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:02:15 compute-1 nova_compute[238822]: 2025-09-30 19:02:15.944 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:02:15 compute-1 nova_compute[238822]: 2025-09-30 19:02:15.954 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 19:02:16 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2620052722' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:02:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:16 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:16 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:16 compute-1 nova_compute[238822]: 2025-09-30 19:02:16.466 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 19:02:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:16.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:16.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:16 compute-1 nova_compute[238822]: 2025-09-30 19:02:16.980 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 19:02:16 compute-1 nova_compute[238822]: 2025-09-30 19:02:16.981 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.626s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 19:02:17 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1403592302' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:02:17 compute-1 ceph-mon[75484]: pgmap v2527: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:17 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:17 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:17 compute-1 nova_compute[238822]: 2025-09-30 19:02:17.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:02:17 compute-1 nova_compute[238822]: 2025-09-30 19:02:17.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:17 compute-1 nova_compute[238822]: 2025-09-30 19:02:17.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:02:17 compute-1 nova_compute[238822]: 2025-09-30 19:02:17.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:02:17 compute-1 nova_compute[238822]: 2025-09-30 19:02:17.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:02:17 compute-1 nova_compute[238822]: 2025-09-30 19:02:17.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:18 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4104838524' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:02:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:18 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:18 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:18.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:18.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:19 compute-1 ceph-mon[75484]: pgmap v2528: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:02:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:19 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:19 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:19 compute-1 openstack_network_exporter[251957]: ERROR   19:02:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 19:02:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:02:19 compute-1 openstack_network_exporter[251957]: ERROR   19:02:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:02:19 compute-1 openstack_network_exporter[251957]: ERROR   19:02:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:02:19 compute-1 openstack_network_exporter[251957]: ERROR   19:02:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 19:02:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:02:19 compute-1 openstack_network_exporter[251957]: ERROR   19:02:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 19:02:19 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:02:19 compute-1 podman[320146]: 2025-09-30 19:02:19.602839187 +0000 UTC m=+0.099036477 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4)
Sep 30 19:02:19 compute-1 podman[320148]: 2025-09-30 19:02:19.612719174 +0000 UTC m=+0.100323762 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 19:02:19 compute-1 podman[320147]: 2025-09-30 19:02:19.641962305 +0000 UTC m=+0.131448954 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 19:02:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:20 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:20 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:20.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:20.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:21 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:21 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:21 compute-1 ceph-mon[75484]: pgmap v2529: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:21 compute-1 nova_compute[238822]: 2025-09-30 19:02:21.982 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:02:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:22 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:22 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:22 compute-1 nova_compute[238822]: 2025-09-30 19:02:22.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:22 compute-1 nova_compute[238822]: 2025-09-30 19:02:22.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:22.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:02:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:22.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:23 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:23 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:23 compute-1 ceph-mon[75484]: pgmap v2530: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:24 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:24 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:02:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:24.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:24.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:25 compute-1 nova_compute[238822]: 2025-09-30 19:02:25.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:02:25 compute-1 nova_compute[238822]: 2025-09-30 19:02:25.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:02:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:25 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:25 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:25 compute-1 unix_chkpwd[320215]: password check failed for user (root)
Sep 30 19:02:25 compute-1 sshd-session[320211]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 19:02:25 compute-1 ceph-mon[75484]: pgmap v2531: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:02:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:26 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:26 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:26.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:26.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:26 compute-1 sudo[320217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 19:02:27 compute-1 sudo[320217]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:02:27 compute-1 sudo[320217]: pam_unix(sudo:session): session closed for user root
Sep 30 19:02:27 compute-1 sshd-session[320211]: Failed password for root from 192.210.160.141 port 47940 ssh2
Sep 30 19:02:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:27 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:27 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:27 compute-1 nova_compute[238822]: 2025-09-30 19:02:27.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:27 compute-1 nova_compute[238822]: 2025-09-30 19:02:27.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:27 compute-1 ceph-mon[75484]: pgmap v2532: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:28 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:28 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:28 compute-1 sshd-session[320211]: Connection closed by authenticating user root 192.210.160.141 port 47940 [preauth]
Sep 30 19:02:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:28.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:28.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:29 compute-1 ceph-mon[75484]: pgmap v2533: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:02:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:29 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:29 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:02:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:30 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:30 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:30.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:30.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:31 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:31 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:31 compute-1 ceph-mon[75484]: pgmap v2534: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:32 compute-1 nova_compute[238822]: 2025-09-30 19:02:32.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:02:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:32 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:32 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:32 compute-1 nova_compute[238822]: 2025-09-30 19:02:32.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:32.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:32.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:33 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:33 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:33 compute-1 ceph-mon[75484]: pgmap v2535: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:34 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:34 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:02:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:34.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:34 compute-1 sshd-session[320249]: Invalid user kyt from 49.49.32.245 port 50346
Sep 30 19:02:34 compute-1 sshd-session[320249]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 19:02:34 compute-1 sshd-session[320249]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245
Sep 30 19:02:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:34.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:35 compute-1 sshd-session[320252]: Invalid user cpc from 8.243.64.201 port 39022
Sep 30 19:02:35 compute-1 sshd-session[320252]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 19:02:35 compute-1 sshd-session[320252]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 19:02:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:35 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:35 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:35 compute-1 podman[249638]: time="2025-09-30T19:02:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 19:02:35 compute-1 podman[249638]: @ - - [30/Sep/2025:19:02:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 19:02:35 compute-1 podman[249638]: @ - - [30/Sep/2025:19:02:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8367 "" "Go-http-client/1.1"
Sep 30 19:02:35 compute-1 ceph-mon[75484]: pgmap v2536: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:02:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:36 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:36 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:36.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:36 compute-1 sshd-session[320249]: Failed password for invalid user kyt from 49.49.32.245 port 50346 ssh2
Sep 30 19:02:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3932686305' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 19:02:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/3932686305' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 19:02:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:36.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:37 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:37 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:37 compute-1 nova_compute[238822]: 2025-09-30 19:02:37.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:37 compute-1 sshd-session[320252]: Failed password for invalid user cpc from 8.243.64.201 port 39022 ssh2
Sep 30 19:02:37 compute-1 ceph-mon[75484]: pgmap v2537: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:02:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:38 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:38 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:38 compute-1 sshd-session[320249]: Received disconnect from 49.49.32.245 port 50346:11: Bye Bye [preauth]
Sep 30 19:02:38 compute-1 sshd-session[320249]: Disconnected from invalid user kyt 49.49.32.245 port 50346 [preauth]
Sep 30 19:02:38 compute-1 unix_chkpwd[320260]: password check failed for user (root)
Sep 30 19:02:38 compute-1 sshd-session[320257]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17  user=root
Sep 30 19:02:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:38.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:38 compute-1 ceph-mon[75484]: pgmap v2538: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:02:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:38.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:39 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:39 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:39 compute-1 sshd-session[320252]: Received disconnect from 8.243.64.201 port 39022:11: Bye Bye [preauth]
Sep 30 19:02:39 compute-1 sshd-session[320252]: Disconnected from invalid user cpc 8.243.64.201 port 39022 [preauth]
Sep 30 19:02:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:02:39 compute-1 sshd-session[320257]: Failed password for root from 161.132.50.17 port 41434 ssh2
Sep 30 19:02:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:40 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:40 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:40 compute-1 podman[320264]: 2025-09-30 19:02:40.560873482 +0000 UTC m=+0.095387209 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 19:02:40 compute-1 podman[320263]: 2025-09-30 19:02:40.656764252 +0000 UTC m=+0.194705671 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 19:02:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:40.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:40.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:41 compute-1 sshd-session[320257]: Received disconnect from 161.132.50.17 port 41434:11: Bye Bye [preauth]
Sep 30 19:02:41 compute-1 sshd-session[320257]: Disconnected from authenticating user root 161.132.50.17 port 41434 [preauth]
Sep 30 19:02:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:41 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:41 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:41 compute-1 ceph-mon[75484]: pgmap v2539: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:42 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:42 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:42 compute-1 nova_compute[238822]: 2025-09-30 19:02:42.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:42.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:42.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:43 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:43 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:43 compute-1 podman[320316]: 2025-09-30 19:02:43.536503735 +0000 UTC m=+0.066241601 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 19:02:43 compute-1 ceph-mon[75484]: pgmap v2540: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:44 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:44 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:02:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:44.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:44.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:45 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:45 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:45 compute-1 ceph-mon[75484]: pgmap v2541: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:02:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:46 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:46 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:46.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:46.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:47 compute-1 sudo[320341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 19:02:47 compute-1 sudo[320341]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:02:47 compute-1 sudo[320341]: pam_unix(sudo:session): session closed for user root
Sep 30 19:02:47 compute-1 auditd[704]: Audit daemon rotating log files
Sep 30 19:02:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:47 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:47 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:47 compute-1 nova_compute[238822]: 2025-09-30 19:02:47.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:47 compute-1 ceph-mon[75484]: pgmap v2542: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:48 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:48 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:48.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:49.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:49 compute-1 ceph-mon[75484]: pgmap v2543: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:02:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:49 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:49 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:49 compute-1 openstack_network_exporter[251957]: ERROR   19:02:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:02:49 compute-1 openstack_network_exporter[251957]: ERROR   19:02:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 19:02:49 compute-1 openstack_network_exporter[251957]: ERROR   19:02:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:02:49 compute-1 openstack_network_exporter[251957]: ERROR   19:02:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 19:02:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:02:49 compute-1 openstack_network_exporter[251957]: ERROR   19:02:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 19:02:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:02:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:02:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:50 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:50 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:50 compute-1 podman[320369]: 2025-09-30 19:02:50.56005741 +0000 UTC m=+0.095681837 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 19:02:50 compute-1 podman[320370]: 2025-09-30 19:02:50.566547835 +0000 UTC m=+0.097290640 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, architecture=x86_64, config_id=edpm, com.redhat.component=ubi9-minimal-container)
Sep 30 19:02:50 compute-1 podman[320371]: 2025-09-30 19:02:50.573775701 +0000 UTC m=+0.100865727 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 19:02:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:50.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:51.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:51 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:51 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:51 compute-1 ceph-mon[75484]: pgmap v2544: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:52 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:52 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:52 compute-1 nova_compute[238822]: 2025-09-30 19:02:52.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:02:52 compute-1 nova_compute[238822]: 2025-09-30 19:02:52.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:52 compute-1 nova_compute[238822]: 2025-09-30 19:02:52.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:02:52 compute-1 nova_compute[238822]: 2025-09-30 19:02:52.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:02:52 compute-1 nova_compute[238822]: 2025-09-30 19:02:52.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:02:52 compute-1 nova_compute[238822]: 2025-09-30 19:02:52.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:02:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:52.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:02:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:53.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:53 compute-1 unix_chkpwd[320432]: password check failed for user (root)
Sep 30 19:02:53 compute-1 sshd-session[320428]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 19:02:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:53 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:53 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:53 compute-1 ceph-mon[75484]: pgmap v2545: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:54 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:54 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:02:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:02:54.465 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:02:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:02:54.465 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:02:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:02:54.465 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 19:02:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:54.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:55.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:55 compute-1 sshd-session[320428]: Failed password for root from 192.210.160.141 port 47618 ssh2
Sep 30 19:02:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:55 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:55 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:55 compute-1 ceph-mon[75484]: pgmap v2546: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:02:56 compute-1 sshd-session[320428]: Connection closed by authenticating user root 192.210.160.141 port 47618 [preauth]
Sep 30 19:02:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:56 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:56 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 19:02:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:56.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 19:02:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:02:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:57.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:02:57 compute-1 ceph-mon[75484]: pgmap v2547: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:02:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:57 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:57 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:57 compute-1 nova_compute[238822]: 2025-09-30 19:02:57.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:02:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/538484799' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 19:02:58 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/538484799' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 19:02:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:58 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:58 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:02:58.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:02:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:02:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:02:59.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:02:59 compute-1 ceph-mon[75484]: pgmap v2548: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:02:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:02:59 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:02:59 2025: (VI_0) received an invalid passwd!
Sep 30 19:02:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:03:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:00 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:00 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:00.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:01.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:01 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:01 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:01 compute-1 ceph-mon[75484]: pgmap v2549: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:02 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:02 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:02 compute-1 nova_compute[238822]: 2025-09-30 19:03:02.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:02.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:03.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:03 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:03 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:03 compute-1 sudo[320444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 19:03:03 compute-1 sudo[320444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:03:03 compute-1 sudo[320444]: pam_unix(sudo:session): session closed for user root
Sep 30 19:03:03 compute-1 sudo[320469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Sep 30 19:03:03 compute-1 sudo[320469]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:03:03 compute-1 ceph-mon[75484]: pgmap v2550: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:04 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:04 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:03:04 compute-1 podman[320570]: 2025-09-30 19:03:04.764863613 +0000 UTC m=+0.103198240 container exec 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Sep 30 19:03:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:04.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:04 compute-1 podman[320570]: 2025-09-30 19:03:04.897091076 +0000 UTC m=+0.235425673 container exec_died 0535866a09e0c23740559c44eb2d7e5af5ea47876554b87edbed16bd5d1d9d73 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-mon-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True)
Sep 30 19:03:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:05.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:05 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:05 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:05 compute-1 podman[320694]: 2025-09-30 19:03:05.62471535 +0000 UTC m=+0.085010758 container exec 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 19:03:05 compute-1 podman[320694]: 2025-09-30 19:03:05.642220193 +0000 UTC m=+0.102515531 container exec_died 9956b35da2e58851bff8bc9d71350555603a700e7509e039e6de231590eff173 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 19:03:05 compute-1 podman[249638]: time="2025-09-30T19:03:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 19:03:05 compute-1 podman[249638]: @ - - [30/Sep/2025:19:03:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 19:03:05 compute-1 podman[249638]: @ - - [30/Sep/2025:19:03:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8369 "" "Go-http-client/1.1"
Sep 30 19:03:05 compute-1 ceph-mon[75484]: pgmap v2551: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:03:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:06 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:06 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:06 compute-1 podman[320831]: 2025-09-30 19:03:06.498917124 +0000 UTC m=+0.093076747 container exec 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 19:03:06 compute-1 podman[320831]: 2025-09-30 19:03:06.536214592 +0000 UTC m=+0.130374205 container exec_died 4c6bcd7151aaa26c86a8419238ca2fe6e09a614d7abaf54fed4f8065c4fde842 (image=quay.io/ceph/haproxy:2.3, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-haproxy-nfs-cephfs-compute-1-iacknv)
Sep 30 19:03:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:06.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:06 compute-1 podman[320898]: 2025-09-30 19:03:06.894408772 +0000 UTC m=+0.086917590 container exec 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, version=2.2.4, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, release=1793, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived)
Sep 30 19:03:06 compute-1 podman[320898]: 2025-09-30 19:03:06.940062695 +0000 UTC m=+0.132571473 container exec_died 4cf59fac528049a6955e8fe8f28835d7ba00eef4d65ac0833facf3bebcc2f15b (image=quay.io/ceph/keepalived:2.2.4, name=ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, architecture=x86_64, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, distribution-scope=public, io.openshift.tags=Ceph keepalived, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 19:03:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:07.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:07 compute-1 ceph-mon[75484]: pgmap v2552: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:07 compute-1 sudo[320955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 19:03:07 compute-1 sudo[320955]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:03:07 compute-1 sudo[320955]: pam_unix(sudo:session): session closed for user root
Sep 30 19:03:07 compute-1 sudo[320469]: pam_unix(sudo:session): session closed for user root
Sep 30 19:03:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:07 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:07 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:07 compute-1 sudo[320995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 19:03:07 compute-1 sudo[320995]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:03:07 compute-1 sudo[320995]: pam_unix(sudo:session): session closed for user root
Sep 30 19:03:07 compute-1 sudo[321020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 19:03:07 compute-1 sudo[321020]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:03:07 compute-1 nova_compute[238822]: 2025-09-30 19:03:07.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:07 compute-1 nova_compute[238822]: 2025-09-30 19:03:07.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:08 compute-1 sudo[321020]: pam_unix(sudo:session): session closed for user root
Sep 30 19:03:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:03:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:03:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:03:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:08 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:08 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:08.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000054s ======
Sep 30 19:03:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:09.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Sep 30 19:03:09 compute-1 nova_compute[238822]: 2025-09-30 19:03:09.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:03:09 compute-1 nova_compute[238822]: 2025-09-30 19:03:09.058 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 19:03:09 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Sep 30 19:03:09 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 19:03:09 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 19:03:09 compute-1 ceph-mon[75484]: pgmap v2553: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 534 B/s rd, 0 op/s
Sep 30 19:03:09 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:03:09 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:03:09 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 19:03:09 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 19:03:09 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 19:03:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:09 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:09 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:03:10 compute-1 nova_compute[238822]: 2025-09-30 19:03:10.059 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:03:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:10 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:10 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:10.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:11.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:11 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:11 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:11 compute-1 ceph-mon[75484]: pgmap v2554: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 534 B/s rd, 0 op/s
Sep 30 19:03:11 compute-1 podman[321083]: 2025-09-30 19:03:11.579221311 +0000 UTC m=+0.109834035 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 19:03:11 compute-1 podman[321082]: 2025-09-30 19:03:11.636088715 +0000 UTC m=+0.166068322 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Sep 30 19:03:12 compute-1 nova_compute[238822]: 2025-09-30 19:03:12.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:03:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:12 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:12 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:12 compute-1 nova_compute[238822]: 2025-09-30 19:03:12.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:12 compute-1 nova_compute[238822]: 2025-09-30 19:03:12.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 19:03:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:12.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 19:03:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 19:03:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:13.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 19:03:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:13 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:13 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:13 compute-1 ceph-mon[75484]: pgmap v2555: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 534 B/s rd, 0 op/s
Sep 30 19:03:13 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:03:13 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:03:13 compute-1 sudo[321134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Sep 30 19:03:13 compute-1 sudo[321134]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:03:13 compute-1 sudo[321134]: pam_unix(sudo:session): session closed for user root
Sep 30 19:03:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:14 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:14 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:03:14 compute-1 podman[321160]: 2025-09-30 19:03:14.504468617 +0000 UTC m=+0.100602998 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 19:03:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:14.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:15.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:15 compute-1 nova_compute[238822]: 2025-09-30 19:03:15.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:03:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:15 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:15 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:15 compute-1 ceph-mon[75484]: pgmap v2556: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 801 B/s rd, 0 op/s
Sep 30 19:03:15 compute-1 nova_compute[238822]: 2025-09-30 19:03:15.587 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:03:15 compute-1 nova_compute[238822]: 2025-09-30 19:03:15.588 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:03:15 compute-1 nova_compute[238822]: 2025-09-30 19:03:15.588 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 19:03:15 compute-1 nova_compute[238822]: 2025-09-30 19:03:15.589 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 19:03:15 compute-1 nova_compute[238822]: 2025-09-30 19:03:15.589 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:03:16 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 19:03:16 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4272262692' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:03:16 compute-1 nova_compute[238822]: 2025-09-30 19:03:16.113 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.524s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:03:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:16 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:16 compute-1 nova_compute[238822]: 2025-09-30 19:03:16.375 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 19:03:16 compute-1 nova_compute[238822]: 2025-09-30 19:03:16.376 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:03:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:16 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:16 compute-1 nova_compute[238822]: 2025-09-30 19:03:16.408 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:03:16 compute-1 nova_compute[238822]: 2025-09-30 19:03:16.409 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4537MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 19:03:16 compute-1 nova_compute[238822]: 2025-09-30 19:03:16.409 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:03:16 compute-1 nova_compute[238822]: 2025-09-30 19:03:16.409 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:03:16 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4272262692' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:03:16 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2659476507' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:03:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:16.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:17.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:17 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:17 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:17 compute-1 ceph-mon[75484]: pgmap v2557: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 534 B/s rd, 0 op/s
Sep 30 19:03:17 compute-1 nova_compute[238822]: 2025-09-30 19:03:17.564 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 19:03:17 compute-1 nova_compute[238822]: 2025-09-30 19:03:17.565 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=39GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:03:16 up  4:40,  0 user,  load average: 0.15, 0.38, 0.47\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 19:03:17 compute-1 nova_compute[238822]: 2025-09-30 19:03:17.593 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:03:17 compute-1 nova_compute[238822]: 2025-09-30 19:03:17.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:17 compute-1 nova_compute[238822]: 2025-09-30 19:03:17.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 19:03:18 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3741651663' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:03:18 compute-1 nova_compute[238822]: 2025-09-30 19:03:18.079 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:03:18 compute-1 nova_compute[238822]: 2025-09-30 19:03:18.085 2 DEBUG nova.compute.provider_tree [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed in ProviderTree for provider: 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 19:03:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:18 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:18 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:18 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3741651663' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:03:18 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3878414782' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:03:18 compute-1 nova_compute[238822]: 2025-09-30 19:03:18.598 2 DEBUG nova.scheduler.client.report [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Inventory has not changed for provider 3f3ee6af-e4a5-4241-9f23-ffd3c2bbec8a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 39, 'reserved': 1, 'min_unit': 1, 'max_unit': 39, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 19:03:18 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:18 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:18 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:18.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:19 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:19 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:19 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:19.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:19 compute-1 nova_compute[238822]: 2025-09-30 19:03:19.109 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 19:03:19 compute-1 nova_compute[238822]: 2025-09-30 19:03:19.111 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.702s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 19:03:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:19 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:19 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:19 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:19 compute-1 openstack_network_exporter[251957]: ERROR   19:03:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 19:03:19 compute-1 openstack_network_exporter[251957]: ERROR   19:03:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:03:19 compute-1 openstack_network_exporter[251957]: ERROR   19:03:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:03:19 compute-1 openstack_network_exporter[251957]: ERROR   19:03:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 19:03:19 compute-1 openstack_network_exporter[251957]: ERROR   19:03:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 19:03:19 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:03:19 compute-1 ceph-mon[75484]: pgmap v2558: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 534 B/s rd, 0 op/s
Sep 30 19:03:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:20 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:20 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:20 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:20 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:20 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:20 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:20.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:21 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:21 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:21 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:21.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:21 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:21 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:21 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:21 compute-1 ceph-mon[75484]: pgmap v2559: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:03:21 compute-1 podman[321233]: 2025-09-30 19:03:21.56817948 +0000 UTC m=+0.094874584 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 19:03:21 compute-1 podman[321231]: 2025-09-30 19:03:21.568354285 +0000 UTC m=+0.099088407 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 19:03:21 compute-1 podman[321232]: 2025-09-30 19:03:21.581099627 +0000 UTC m=+0.106644820 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm)
Sep 30 19:03:22 compute-1 unix_chkpwd[321289]: password check failed for user (root)
Sep 30 19:03:22 compute-1 sshd-session[321228]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 19:03:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:22 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:22 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:22 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:22 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:03:22 compute-1 nova_compute[238822]: 2025-09-30 19:03:22.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:22 compute-1 nova_compute[238822]: 2025-09-30 19:03:22.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:22 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:22 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:22 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:22.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:23 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:23 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:23 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:23.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:23 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:23 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:23 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:23 compute-1 ceph-mon[75484]: pgmap v2560: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:24 compute-1 nova_compute[238822]: 2025-09-30 19:03:24.112 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:03:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:24 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:24 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:24 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:24 compute-1 sshd-session[321228]: Failed password for root from 192.210.160.141 port 40758 ssh2
Sep 30 19:03:24 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:03:24 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:24 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:24 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:24.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:25 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:25 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:25 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:25.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:25 compute-1 sshd-session[321228]: Connection closed by authenticating user root 192.210.160.141 port 40758 [preauth]
Sep 30 19:03:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:25 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:25 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:25 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:25 compute-1 ceph-mon[75484]: pgmap v2561: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:03:26 compute-1 nova_compute[238822]: 2025-09-30 19:03:26.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:03:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:26 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:26 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:26 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:26 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:26 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:26 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:26.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:27 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:27 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:27 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:27.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:27 compute-1 nova_compute[238822]: 2025-09-30 19:03:27.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:03:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:27 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:27 compute-1 sudo[321296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 19:03:27 compute-1 sudo[321296]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:03:27 compute-1 sudo[321296]: pam_unix(sudo:session): session closed for user root
Sep 30 19:03:27 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:27 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:27 compute-1 ceph-mon[75484]: pgmap v2562: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:27 compute-1 nova_compute[238822]: 2025-09-30 19:03:27.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:27 compute-1 nova_compute[238822]: 2025-09-30 19:03:27.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:28 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:28 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:28 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:28 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:28 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:28 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:28.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:29 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:29 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:29 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:29.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:29 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:29 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:29 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:29 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:03:29 compute-1 ceph-mon[75484]: pgmap v2563: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:30 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:30 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:30 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:30 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:30 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:30 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:30.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:31 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:31 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:31 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:31.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:31 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:31 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:31 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:31 compute-1 ceph-mon[75484]: pgmap v2564: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:03:32 compute-1 nova_compute[238822]: 2025-09-30 19:03:32.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:03:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:32 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:32 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:32 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:32 compute-1 nova_compute[238822]: 2025-09-30 19:03:32.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:32 compute-1 nova_compute[238822]: 2025-09-30 19:03:32.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:32 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:32 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:32 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:32.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:33 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:33 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:33 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:33.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:33 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:33 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:33 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:33 compute-1 ceph-mon[75484]: pgmap v2565: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:34 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:34 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:34 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:34 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:03:34 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:34 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:34 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:34.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:35 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:35 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:35 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:35.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:35 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:35 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:35 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:35 compute-1 podman[249638]: time="2025-09-30T19:03:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 19:03:35 compute-1 podman[249638]: @ - - [30/Sep/2025:19:03:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 19:03:35 compute-1 ceph-mon[75484]: pgmap v2566: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:03:35 compute-1 podman[249638]: @ - - [30/Sep/2025:19:03:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8367 "" "Go-http-client/1.1"
Sep 30 19:03:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:36 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:36 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:36 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Sep 30 19:03:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1763888224' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 19:03:36 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Sep 30 19:03:36 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1763888224' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 19:03:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1763888224' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 19:03:36 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/1763888224' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 19:03:36 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:36 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:36 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:36.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:36 compute-1 sshd-session[321329]: Invalid user asdf from 49.49.32.245 port 45540
Sep 30 19:03:36 compute-1 sshd-session[321329]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 19:03:36 compute-1 sshd-session[321329]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=49.49.32.245
Sep 30 19:03:37 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:37 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.002000053s ======
Sep 30 19:03:37 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:37.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Sep 30 19:03:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:37 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:37 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:37 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:37 compute-1 nova_compute[238822]: 2025-09-30 19:03:37.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:37 compute-1 nova_compute[238822]: 2025-09-30 19:03:37.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:37 compute-1 ceph-mon[75484]: pgmap v2567: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:37 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:03:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:38 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:38 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:38 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:38 compute-1 sshd-session[321329]: Failed password for invalid user asdf from 49.49.32.245 port 45540 ssh2
Sep 30 19:03:38 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:38 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:38 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:38.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:39 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:39 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:39 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:39.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:39 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:39 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:39 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:39 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:03:39 compute-1 sshd-session[321329]: Received disconnect from 49.49.32.245 port 45540:11: Bye Bye [preauth]
Sep 30 19:03:39 compute-1 sshd-session[321329]: Disconnected from invalid user asdf 49.49.32.245 port 45540 [preauth]
Sep 30 19:03:39 compute-1 ceph-mon[75484]: pgmap v2568: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:40 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:40 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:40 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:40 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:40 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:40 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:40.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:41 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:41 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:41 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:41.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:41 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:41 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:41 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:41 compute-1 ceph-mon[75484]: pgmap v2569: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:03:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:42 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:42 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:42 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:42 compute-1 podman[321341]: 2025-09-30 19:03:42.568413575 +0000 UTC m=+0.100048973 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 19:03:42 compute-1 nova_compute[238822]: 2025-09-30 19:03:42.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:42 compute-1 sshd-session[321337]: Invalid user banana from 8.243.64.201 port 46722
Sep 30 19:03:42 compute-1 sshd-session[321337]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 19:03:42 compute-1 sshd-session[321337]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.243.64.201
Sep 30 19:03:42 compute-1 podman[321340]: 2025-09-30 19:03:42.655861489 +0000 UTC m=+0.191987737 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_controller, io.buildah.version=1.41.4)
Sep 30 19:03:42 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:42 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:42 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:42.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:43 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:43 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:43 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:43.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:43 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:43 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:43 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:43 compute-1 sshd-session[321393]: Invalid user kyu from 161.132.50.17 port 56424
Sep 30 19:03:43 compute-1 sshd-session[321393]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 19:03:43 compute-1 sshd-session[321393]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.132.50.17
Sep 30 19:03:43 compute-1 ceph-mon[75484]: pgmap v2570: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:44 compute-1 sshd-session[321337]: Failed password for invalid user banana from 8.243.64.201 port 46722 ssh2
Sep 30 19:03:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:44 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:44 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:44 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:44 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:03:44 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:44 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 19:03:44 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:44.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 19:03:45 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:45 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:45 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:45.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:45 compute-1 sshd-session[321393]: Failed password for invalid user kyu from 161.132.50.17 port 56424 ssh2
Sep 30 19:03:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:45 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:45 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:45 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:45 compute-1 sshd-session[321337]: Received disconnect from 8.243.64.201 port 46722:11: Bye Bye [preauth]
Sep 30 19:03:45 compute-1 sshd-session[321337]: Disconnected from invalid user banana 8.243.64.201 port 46722 [preauth]
Sep 30 19:03:45 compute-1 podman[321397]: 2025-09-30 19:03:45.543953589 +0000 UTC m=+0.080186361 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 19:03:45 compute-1 sshd-session[321391]: Invalid user h from 14.103.105.56 port 31266
Sep 30 19:03:45 compute-1 sshd-session[321391]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 19:03:45 compute-1 sshd-session[321391]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.103.105.56
Sep 30 19:03:45 compute-1 ceph-mon[75484]: pgmap v2571: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:03:45 compute-1 sshd-session[321417]: Accepted publickey for zuul from 192.168.122.10 port 53080 ssh2: ECDSA SHA256:COqAmbdBUMx+UyDa5XAop4Bi6qvwvYhJQ16rCseuHkQ
Sep 30 19:03:45 compute-1 sshd-session[321393]: Received disconnect from 161.132.50.17 port 56424:11: Bye Bye [preauth]
Sep 30 19:03:45 compute-1 sshd-session[321393]: Disconnected from invalid user kyu 161.132.50.17 port 56424 [preauth]
Sep 30 19:03:45 compute-1 systemd-logind[789]: New session 64 of user zuul.
Sep 30 19:03:45 compute-1 systemd[1]: Started Session 64 of User zuul.
Sep 30 19:03:45 compute-1 sshd-session[321417]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 19:03:46 compute-1 sudo[321421]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Sep 30 19:03:46 compute-1 sudo[321421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 19:03:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:46 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:46 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:46 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:46 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:46 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:46 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:46.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:47 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:47 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:47 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:47.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:47 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:47 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:47 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:47 compute-1 sudo[321481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 19:03:47 compute-1 sudo[321481]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:03:47 compute-1 sudo[321481]: pam_unix(sudo:session): session closed for user root
Sep 30 19:03:47 compute-1 nova_compute[238822]: 2025-09-30 19:03:47.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:47 compute-1 nova_compute[238822]: 2025-09-30 19:03:47.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:47 compute-1 sshd-session[321391]: Failed password for invalid user h from 14.103.105.56 port 31266 ssh2
Sep 30 19:03:47 compute-1 ceph-mon[75484]: pgmap v2572: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:48 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:48 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:48 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:48 compute-1 sshd-session[321391]: Received disconnect from 14.103.105.56 port 31266:11: Bye Bye [preauth]
Sep 30 19:03:48 compute-1 sshd-session[321391]: Disconnected from invalid user h 14.103.105.56 port 31266 [preauth]
Sep 30 19:03:48 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:48 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:48 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:48.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:48 compute-1 ceph-mon[75484]: pgmap v2573: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:48 compute-1 ceph-mon[75484]: from='client.19748 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:03:49 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:49 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:49 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:49.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:49 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:49 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:49 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:49 compute-1 openstack_network_exporter[251957]: ERROR   19:03:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 19:03:49 compute-1 openstack_network_exporter[251957]: ERROR   19:03:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:03:49 compute-1 openstack_network_exporter[251957]: ERROR   19:03:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 19:03:49 compute-1 openstack_network_exporter[251957]: ERROR   19:03:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 19:03:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:03:49 compute-1 openstack_network_exporter[251957]: ERROR   19:03:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 19:03:49 compute-1 openstack_network_exporter[251957]: 
Sep 30 19:03:49 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:03:49 compute-1 unix_chkpwd[321653]: password check failed for user (root)
Sep 30 19:03:49 compute-1 sshd-session[321529]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141  user=root
Sep 30 19:03:50 compute-1 ceph-mon[75484]: from='client.19752 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:03:50 compute-1 ceph-mon[75484]: from='client.28023 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:03:50 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2211727363' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Sep 30 19:03:50 compute-1 ceph-mon[75484]: from='client.28027 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:03:50 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "status"} v 0)
Sep 30 19:03:50 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2120423508' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Sep 30 19:03:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:50 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:50 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:50 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:50 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:50 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:50 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:50.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:51 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2120423508' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Sep 30 19:03:51 compute-1 ceph-mon[75484]: pgmap v2574: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:03:51 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:51 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:51 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:51.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:51 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:51 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:51 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:51 compute-1 sshd-session[321529]: Failed password for root from 192.210.160.141 port 56974 ssh2
Sep 30 19:03:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:52 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:52 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:52 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:52 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:03:52 compute-1 podman[321754]: 2025-09-30 19:03:52.55911819 +0000 UTC m=+0.096396855 container health_status 53558103e0aeeb4137cd3d7d3df29abcd13297af66d7bfdceb070766f470e581 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, name=ubi9-minimal, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Sep 30 19:03:52 compute-1 podman[321753]: 2025-09-30 19:03:52.561112634 +0000 UTC m=+0.089605163 container health_status 0a92ff87878b7311c826a588344c1a911fed4582d79912428488248f9129b0ff (image=38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 19:03:52 compute-1 podman[321755]: 2025-09-30 19:03:52.593240515 +0000 UTC m=+0.124199320 container health_status 84dc947974b7fae4844a697d3f955ab55c7833a6c984db9e9905a8074df0bdc7 (image=38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Sep 30 19:03:52 compute-1 nova_compute[238822]: 2025-09-30 19:03:52.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:03:52 compute-1 nova_compute[238822]: 2025-09-30 19:03:52.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:03:52 compute-1 nova_compute[238822]: 2025-09-30 19:03:52.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:03:52 compute-1 nova_compute[238822]: 2025-09-30 19:03:52.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:03:52 compute-1 nova_compute[238822]: 2025-09-30 19:03:52.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:52 compute-1 nova_compute[238822]: 2025-09-30 19:03:52.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:03:52 compute-1 nova_compute[238822]: 2025-09-30 19:03:52.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:52 compute-1 sshd-session[321529]: Connection closed by authenticating user root 192.210.160.141 port 56974 [preauth]
Sep 30 19:03:52 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:52 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:52 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:52.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:53 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:53 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 19:03:53 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:53.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 19:03:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:53 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:53 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:53 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:53 compute-1 ceph-mon[75484]: pgmap v2575: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:53 compute-1 ovs-vsctl[321841]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Sep 30 19:03:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:54 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:54 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:54 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:54 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:03:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:03:54.467 144543 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:03:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:03:54.468 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:03:54 compute-1 ovn_metadata_agent[144538]: 2025-09-30 19:03:54.468 144543 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 19:03:54 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:54 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:54 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:54.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:55 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:55 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:55 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:55.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:55 compute-1 virtqemud[239124]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Sep 30 19:03:55 compute-1 virtqemud[239124]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Sep 30 19:03:55 compute-1 virtqemud[239124]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 19:03:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:55 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:55 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:55 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:55 compute-1 ceph-mon[75484]: pgmap v2576: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 766 B/s rd, 0 op/s
Sep 30 19:03:55 compute-1 ceph-mon[75484]: from='client.19768 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:03:55 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/535653558' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Sep 30 19:03:55 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1829078619' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 19:03:55 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: cache status {prefix=cache status} (starting...)
Sep 30 19:03:55 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 19:03:56 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: client ls {prefix=client ls} (starting...)
Sep 30 19:03:56 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 19:03:56 compute-1 lvm[322159]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Sep 30 19:03:56 compute-1 lvm[322159]: VG ceph_vg0 finished
Sep 30 19:03:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:56 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:56 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:56 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:56 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: damage ls {prefix=damage ls} (starting...)
Sep 30 19:03:56 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 19:03:56 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: dump loads {prefix=dump loads} (starting...)
Sep 30 19:03:56 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 19:03:56 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:56 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:56 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:56.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:56 compute-1 ceph-mon[75484]: from='client.19776 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:03:56 compute-1 ceph-mon[75484]: from='client.19784 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:03:56 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3934291758' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Sep 30 19:03:56 compute-1 ceph-mon[75484]: from='client.19792 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:03:56 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1028912149' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Sep 30 19:03:56 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2666808499' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Sep 30 19:03:57 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 19:03:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "report"} v 0)
Sep 30 19:03:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/183509289' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Sep 30 19:03:57 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:57 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:03:57 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:57.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:03:57 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Sep 30 19:03:57 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 19:03:57 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Sep 30 19:03:57 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 19:03:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:57 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:57 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:57 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Sep 30 19:03:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/588273531' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Sep 30 19:03:57 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 19:03:57 compute-1 nova_compute[238822]: 2025-09-30 19:03:57.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:03:57 compute-1 nova_compute[238822]: 2025-09-30 19:03:57.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:03:57 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Sep 30 19:03:57 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 19:03:57 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "config log"} v 0)
Sep 30 19:03:57 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/297771280' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: get subtrees {prefix=get subtrees} (starting...)
Sep 30 19:03:57 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 19:03:57 compute-1 ceph-mon[75484]: pgmap v2577: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:03:57 compute-1 ceph-mon[75484]: from='client.19806 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1985771736' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mon[75484]: from='client.28039 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/183509289' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mon[75484]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3598764719' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/588273531' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4105764110' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2619271459' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.10:0/2619271459' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3990649454' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Sep 30 19:03:57 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/297771280' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Sep 30 19:03:58 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: ops {prefix=ops} (starting...)
Sep 30 19:03:58 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 19:03:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "config-key dump"} v 0)
Sep 30 19:03:58 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/315444950' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Sep 30 19:03:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:58 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:58 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:58 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:58 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Sep 30 19:03:58 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2590978549' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Sep 30 19:03:58 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:58 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:58 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:03:58.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:58 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: session ls {prefix=session ls} (starting...)
Sep 30 19:03:58 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub Can't run that command on an inactive MDS!
Sep 30 19:03:59 compute-1 ceph-mon[75484]: from='client.19814 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:03:59 compute-1 ceph-mon[75484]: from='client.28049 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:03:59 compute-1 ceph-mon[75484]: from='client.28057 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:03:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1145171778' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Sep 30 19:03:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2693945981' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Sep 30 19:03:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/315444950' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Sep 30 19:03:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3612254954' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Sep 30 19:03:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2590978549' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Sep 30 19:03:59 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2731985441' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Sep 30 19:03:59 compute-1 ceph-mds[85725]: mds.cephfs.compute-1.wibdub asok_command: status {prefix=status} (starting...)
Sep 30 19:03:59 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:03:59 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:03:59 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:03:59.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:03:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr dump"} v 0)
Sep 30 19:03:59 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3830742536' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Sep 30 19:03:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:03:59 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:59 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:03:59 2025: (VI_0) received an invalid passwd!
Sep 30 19:03:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:03:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "features"} v 0)
Sep 30 19:03:59 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3405605842' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Sep 30 19:03:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr dump"} v 0)
Sep 30 19:03:59 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/268009520' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Sep 30 19:03:59 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Sep 30 19:03:59 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2113007865' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.28071 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: pgmap v2578: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.19852 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.28083 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3405482568' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.28091 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3830742536' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3409006296' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.19874 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1973545497' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3405605842' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.19882 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/268009520' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2113007865' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Sep 30 19:04:00 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/112825725' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Sep 30 19:04:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:00 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:00 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:00 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:00 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr stat"} v 0)
Sep 30 19:04:00 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2610471172' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Sep 30 19:04:00 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Sep 30 19:04:00 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4273556960' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Sep 30 19:04:00 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:00 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:00 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:04:00.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/112825725' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Sep 30 19:04:01 compute-1 ceph-mon[75484]: from='client.19890 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/71466818' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Sep 30 19:04:01 compute-1 ceph-mon[75484]: pgmap v2579: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 766 B/s rd, 0 op/s
Sep 30 19:04:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3851135064' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Sep 30 19:04:01 compute-1 ceph-mon[75484]: from='client.28115 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:01 compute-1 ceph-mon[75484]: from='client.19902 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1994101095' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Sep 30 19:04:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2610471172' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Sep 30 19:04:01 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4273556960' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Sep 30 19:04:01 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:01 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:04:01 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:04:01.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:04:01 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr versions"} v 0)
Sep 30 19:04:01 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2147346475' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Sep 30 19:04:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:01 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:01 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:01 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:01 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Sep 30 19:04:01 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1199056701' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Sep 30 19:04:01 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr dump"} v 0)
Sep 30 19:04:01 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3264826547' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Sep 30 19:04:02 compute-1 ceph-mon[75484]: from='client.19914 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2147346475' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Sep 30 19:04:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4251933188' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Sep 30 19:04:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1199056701' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Sep 30 19:04:02 compute-1 ceph-mon[75484]: from='client.19924 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:02 compute-1 ceph-mon[75484]: from='client.28135 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3700030127' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Sep 30 19:04:02 compute-1 ceph-mon[75484]: from='client.19932 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:02 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3264826547' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Sep 30 19:04:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:02 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Sep 30 19:04:02 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2996930861' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Sep 30 19:04:02 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:02 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:02 compute-1 nova_compute[238822]: 2025-09-30 19:04:02.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:04:02 compute-1 nova_compute[238822]: 2025-09-30 19:04:02.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:04:02 compute-1 nova_compute[238822]: 2025-09-30 19:04:02.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:04:02 compute-1 nova_compute[238822]: 2025-09-30 19:04:02.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:04:02 compute-1 nova_compute[238822]: 2025-09-30 19:04:02.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:04:02 compute-1 nova_compute[238822]: 2025-09-30 19:04:02.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1779908 data_alloc: 218103808 data_used: 2658304
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f62e1000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:47.740133+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:48.740409+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f62e1000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:49.740735+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:50.740956+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:51.741156+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1779908 data_alloc: 218103808 data_used: 2658304
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:52.741365+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:53.741596+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f32f530e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f34c8e5a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5400 session 0x556f339ccf00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169623552 unmapped: 63692800 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5800 session 0x556f35cd05a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3ec00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.986249924s of 24.147270203s, submitted: 60
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3ec00 session 0x556f35cd01e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f3594ba40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f35b48b40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5400 session 0x556f32e043c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5800 session 0x556f3595a1e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:54.741859+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5df1000/0x0/0x4ffc00000, data 0x26ba4e3/0x278b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:55.742055+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:56.742225+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1826835 data_alloc: 218103808 data_used: 2658304
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:57.742484+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5df1000/0x0/0x4ffc00000, data 0x26ba4e3/0x278b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:58.742692+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5c00 session 0x556f3660c1e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:31:59.742902+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:00.743077+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:01.743296+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1826835 data_alloc: 218103808 data_used: 2658304
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f3660c780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5df1000/0x0/0x4ffc00000, data 0x26ba4e3/0x278b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:02.743505+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169541632 unmapped: 63774720 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f3660cb40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5400 session 0x556f3595ad20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:03.743702+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169877504 unmapped: 63438848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.096412659s of 10.222342491s, submitted: 26
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:04.743953+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 169877504 unmapped: 63438848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:05.744208+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:06.744413+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1866076 data_alloc: 218103808 data_used: 7753728
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:07.744717+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5dca000/0x0/0x4ffc00000, data 0x26e14e3/0x27b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:08.744891+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:09.745101+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:10.745310+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5dca000/0x0/0x4ffc00000, data 0x26e14e3/0x27b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x767f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:11.745516+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1866076 data_alloc: 218103808 data_used: 7753728
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:12.745762+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:13.745972+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:14.746168+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 170041344 unmapped: 63275008 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.062009811s of 11.069645882s, submitted: 2
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:15.746338+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174768128 unmapped: 58548224 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:16.746542+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175046656 unmapped: 58269696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f359c5800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1995228 data_alloc: 218103808 data_used: 8646656
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f359c5800 session 0x556f35cd03c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3bfc000/0x0/0x4ffc00000, data 0x37074e3/0x37d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:17.746748+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175046656 unmapped: 58269696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:18.746915+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175046656 unmapped: 58269696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:19.747075+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175046656 unmapped: 58269696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:20.747261+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175046656 unmapped: 58269696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:21.747445+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175046656 unmapped: 58269696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2011778 data_alloc: 218103808 data_used: 8646656
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3bfc000/0x0/0x4ffc00000, data 0x37074e3/0x37d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:22.747658+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174923776 unmapped: 58392576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:23.747868+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174923776 unmapped: 58392576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:24.748030+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174923776 unmapped: 58392576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:25.748221+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174923776 unmapped: 58392576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:26.748385+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3c01000/0x0/0x4ffc00000, data 0x370a4e3/0x37db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f359c5000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.202202797s of 11.535426140s, submitted: 133
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f359c5000 session 0x556f359883c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175259648 unmapped: 58056704 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2009643 data_alloc: 218103808 data_used: 8646656
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f359c5800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:27.748599+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175267840 unmapped: 58048512 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3bda000/0x0/0x4ffc00000, data 0x37314e3/0x3802000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:28.748819+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175267840 unmapped: 58048512 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:29.749004+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175267840 unmapped: 58048512 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:30.749226+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175267840 unmapped: 58048512 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:31.749394+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175267840 unmapped: 58048512 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2013271 data_alloc: 218103808 data_used: 9179136
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3bda000/0x0/0x4ffc00000, data 0x37314e3/0x3802000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:32.749730+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3bda000/0x0/0x4ffc00000, data 0x37314e3/0x3802000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175267840 unmapped: 58048512 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:33.749949+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175276032 unmapped: 58040320 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:34.750142+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175276032 unmapped: 58040320 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:35.750303+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175276032 unmapped: 58040320 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:36.750506+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175276032 unmapped: 58040320 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2013271 data_alloc: 218103808 data_used: 9179136
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:37.750804+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175276032 unmapped: 58040320 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3bda000/0x0/0x4ffc00000, data 0x37314e3/0x3802000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.478668213s of 11.507327080s, submitted: 7
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:38.750958+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177995776 unmapped: 55320576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:39.751128+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178151424 unmapped: 55164928 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:40.751310+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178151424 unmapped: 55164928 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3553000/0x0/0x4ffc00000, data 0x3db74e3/0x3e88000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:41.751529+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178151424 unmapped: 55164928 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2079641 data_alloc: 218103808 data_used: 9338880
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:42.751718+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178151424 unmapped: 55164928 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:43.751936+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178151424 unmapped: 55164928 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:44.778856+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:45.779019+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3551000/0x0/0x4ffc00000, data 0x3dba4e3/0x3e8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3551000/0x0/0x4ffc00000, data 0x3dba4e3/0x3e8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:46.779170+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3551000/0x0/0x4ffc00000, data 0x3dba4e3/0x3e8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076729 data_alloc: 218103808 data_used: 9338880
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:47.779377+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:48.779539+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:49.779735+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:50.779956+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbb4e3/0x3e8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:51.780190+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076865 data_alloc: 218103808 data_used: 9338880
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:52.780463+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbb4e3/0x3e8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:53.780709+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:54.780889+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:55.781049+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:56.781216+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076865 data_alloc: 218103808 data_used: 9338880
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:57.781377+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbb4e3/0x3e8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:58.781549+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:32:59.781703+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:00.781836+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbb4e3/0x3e8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:01.782008+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076865 data_alloc: 218103808 data_used: 9338880
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:02.782175+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 24.340782166s of 24.475608826s, submitted: 50
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177979392 unmapped: 55336960 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:03.782332+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177979392 unmapped: 55336960 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3550000/0x0/0x4ffc00000, data 0x3dbb4e3/0x3e8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:04.782517+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:05.782672+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:06.782824+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2079033 data_alloc: 218103808 data_used: 9326592
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:07.783018+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:08.783130+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:09.783291+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3550000/0x0/0x4ffc00000, data 0x3dbb4e3/0x3e8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:10.783453+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177987584 unmapped: 55328768 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:11.783713+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354e000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177995776 unmapped: 55320576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2077153 data_alloc: 218103808 data_used: 9326592
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:12.783919+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177995776 unmapped: 55320576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:13.784097+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177995776 unmapped: 55320576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:14.784289+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 177995776 unmapped: 55320576 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354e000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:15.784454+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:16.784678+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2077153 data_alloc: 218103808 data_used: 9326592
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:17.784956+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178020352 unmapped: 55296000 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:18.785177+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178028544 unmapped: 55287808 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354e000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:19.785361+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c34800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.368265152s of 17.390548706s, submitted: 16
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3afe5400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:20.785494+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5800 session 0x556f3660de00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5000 session 0x556f35cd14a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:21.785684+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076725 data_alloc: 218103808 data_used: 9326592
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:22.785880+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:23.786041+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:24.786264+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:25.786454+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:26.786828+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178036736 unmapped: 55279616 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076725 data_alloc: 218103808 data_used: 9326592
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:27.787038+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:28.787268+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:29.787470+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:30.787680+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:31.787800+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076725 data_alloc: 218103808 data_used: 9326592
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:32.788179+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:33.788376+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178044928 unmapped: 55271424 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:34.788592+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178053120 unmapped: 55263232 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:35.788843+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:36.788999+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076725 data_alloc: 218103808 data_used: 9326592
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:37.789207+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:38.789415+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:39.789766+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:40.789962+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:41.790170+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076725 data_alloc: 218103808 data_used: 9326592
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:42.790348+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178061312 unmapped: 55255040 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:43.790547+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:44.790751+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:45.790954+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:46.791179+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076725 data_alloc: 218103808 data_used: 9326592
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38303000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3f800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 27.392583847s of 27.433944702s, submitted: 7
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:47.791460+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:48.791740+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35ef1680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f359c5800 session 0x556f35c370e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:49.791945+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178069504 unmapped: 55246848 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:50.792091+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:51.792234+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076877 data_alloc: 218103808 data_used: 9330688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:52.792469+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:53.792692+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:54.792912+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:55.793126+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:56.793303+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076877 data_alloc: 218103808 data_used: 9330688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:57.793499+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:58.793698+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178077696 unmapped: 55238656 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:33:59.793834+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178085888 unmapped: 55230464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:00.794012+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178085888 unmapped: 55230464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:01.794176+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178085888 unmapped: 55230464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2076877 data_alloc: 218103808 data_used: 9330688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:02.794342+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178085888 unmapped: 55230464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:03.794508+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f38303000 session 0x556f33e23e00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.728868484s of 16.749923706s, submitted: 3
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3f800 session 0x556f32f4f4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178085888 unmapped: 55230464 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:04.794766+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f354f000/0x0/0x4ffc00000, data 0x3dbc4e3/0x3e8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38303000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178094080 unmapped: 55222272 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:05.794920+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f38303000 session 0x556f35b56b40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178118656 unmapped: 55197696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:06.795142+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178118656 unmapped: 55197696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2000073 data_alloc: 218103808 data_used: 8634368
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:07.795371+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178118656 unmapped: 55197696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:08.795560+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3d97000/0x0/0x4ffc00000, data 0x35744e3/0x3645000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178118656 unmapped: 55197696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:09.795735+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178118656 unmapped: 55197696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:10.795907+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178118656 unmapped: 55197696 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c34800 session 0x556f35968780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3afe5400 session 0x556f3579de00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:11.796068+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3d97000/0x0/0x4ffc00000, data 0x35744e3/0x3645000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178126848 unmapped: 55189504 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1814213 data_alloc: 218103808 data_used: 2777088
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35cba000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:12.796210+0000)
Sep 30 19:04:02 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3d98000/0x0/0x4ffc00000, data 0x35744d3/0x3644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:13.796353+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:14.796515+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5142000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:15.796710+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:16.796877+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1807005 data_alloc: 218103808 data_used: 2658304
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:17.797143+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5142000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5142000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:18.797308+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:19.797495+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:20.797712+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:21.797885+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1807005 data_alloc: 218103808 data_used: 2658304
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:22.798085+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:23.798257+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5142000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:24.798437+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:25.798577+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5142000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:26.798749+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f5142000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1807005 data_alloc: 218103808 data_used: 2658304
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:27.798946+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:28.799105+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:29.799257+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:30.799414+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174120960 unmapped: 59195392 heap: 233316352 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35ef1c20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3f800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3f800 session 0x556f361fb680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f36632f00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f35ef05a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:31.799520+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35815400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 27.056009293s of 27.376161575s, submitted: 74
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35815400 session 0x556f35b56960
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4610000/0x0/0x4ffc00000, data 0x2cfc4d3/0x2dcc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1890859 data_alloc: 218103808 data_used: 2658304
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:32.799672+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4610000/0x0/0x4ffc00000, data 0x2cfc4d3/0x2dcc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:33.799798+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:34.799954+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:35.800113+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4610000/0x0/0x4ffc00000, data 0x2cfc4d3/0x2dcc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:36.800307+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1891011 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:37.800569+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:38.800754+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:39.800927+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4610000/0x0/0x4ffc00000, data 0x2cfc4d3/0x2dcc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:40.801080+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:41.801255+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174424064 unmapped: 66764800 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:42.801421+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1891011 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.036623001s of 11.114352226s, submitted: 13
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f32f550e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174587904 unmapped: 66600960 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:43.801647+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174587904 unmapped: 66600960 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f45e9000/0x0/0x4ffc00000, data 0x2d234d3/0x2df3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:44.801838+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c3f800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 174628864 unmapped: 66560000 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:45.801993+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:46.802164+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:47.802398+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1972299 data_alloc: 234881024 data_used: 14307328
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:48.802581+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:49.802771+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f45e9000/0x0/0x4ffc00000, data 0x2d234d3/0x2df3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:50.802918+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:51.803141+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:52.803290+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1972299 data_alloc: 234881024 data_used: 14307328
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:53.803515+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175874048 unmapped: 65314816 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:54.803720+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.263117790s of 12.270321846s, submitted: 1
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 182788096 unmapped: 58400768 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:55.803876+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4122000/0x0/0x4ffc00000, data 0x31ea4d3/0x32ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [0,0,0,0,0,0,0,11])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178642944 unmapped: 62545920 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:56.804060+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f35cd01e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d36800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d36800 session 0x556f3594b4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c000 session 0x556f3595bc20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f35b49a40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 191184896 unmapped: 50003968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:57.804275+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2151535 data_alloc: 234881024 data_used: 15806464
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35969a40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:58.804493+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:34:59.804721+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:00.804923+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:01.805094+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e7a000/0x0/0x4ffc00000, data 0x44924d3/0x4562000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:02.805278+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2159251 data_alloc: 234881024 data_used: 15892480
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:03.805466+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:04.805699+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e7a000/0x0/0x4ffc00000, data 0x44924d3/0x4562000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:05.805857+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:06.806064+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:07.806296+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2159251 data_alloc: 234881024 data_used: 15892480
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e7a000/0x0/0x4ffc00000, data 0x44924d3/0x4562000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179240960 unmapped: 61947904 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:08.806415+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d36800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.111789703s of 13.742190361s, submitted: 80
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d36800 session 0x556f35b48000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179208192 unmapped: 61980672 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:09.806568+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179208192 unmapped: 61980672 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:10.806772+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e52000/0x0/0x4ffc00000, data 0x44b94f6/0x458a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e52000/0x0/0x4ffc00000, data 0x44b94f6/0x458a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178962432 unmapped: 62226432 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:11.806955+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 185188352 unmapped: 56000512 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:12.807131+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2246248 data_alloc: 234881024 data_used: 27807744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186089472 unmapped: 55099392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:13.807362+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186089472 unmapped: 55099392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:14.807516+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186089472 unmapped: 55099392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:15.807687+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186089472 unmapped: 55099392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:16.807836+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e52000/0x0/0x4ffc00000, data 0x44b94f6/0x458a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186105856 unmapped: 55083008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:17.808011+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2246248 data_alloc: 234881024 data_used: 27807744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186105856 unmapped: 55083008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:18.808229+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2e52000/0x0/0x4ffc00000, data 0x44b94f6/0x458a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x881f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186105856 unmapped: 55083008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:19.808433+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186105856 unmapped: 55083008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:20.808663+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186105856 unmapped: 55083008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:21.808807+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.740138054s of 12.930875778s, submitted: 5
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 191135744 unmapped: 50053120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:22.808955+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2327738 data_alloc: 234881024 data_used: 29638656
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193118208 unmapped: 48070656 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:23.809097+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215b000/0x0/0x4ffc00000, data 0x4d984f6/0x4e69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:24.809299+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193118208 unmapped: 48070656 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:25.809468+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193118208 unmapped: 48070656 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:26.809659+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193118208 unmapped: 48070656 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:27.809844+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193118208 unmapped: 48070656 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2340190 data_alloc: 251658240 data_used: 29822976
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:28.810074+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193118208 unmapped: 48070656 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2159000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:29.810253+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:30.810382+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:31.810562+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:32.810757+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2334846 data_alloc: 251658240 data_used: 29822976
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:33.810912+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:34.811129+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:35.811337+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:36.811562+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:37.811896+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2334846 data_alloc: 251658240 data_used: 29822976
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:38.812164+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:39.812377+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:40.812692+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:41.812855+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193142784 unmapped: 48046080 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:42.813079+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193150976 unmapped: 48037888 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2334846 data_alloc: 251658240 data_used: 29822976
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:43.813273+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193150976 unmapped: 48037888 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:44.813409+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193150976 unmapped: 48037888 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:45.813605+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193159168 unmapped: 48029696 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:46.813838+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193159168 unmapped: 48029696 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:47.814060+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193159168 unmapped: 48029696 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2334846 data_alloc: 251658240 data_used: 29822976
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:48.814253+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193159168 unmapped: 48029696 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:49.814458+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193159168 unmapped: 48029696 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:50.814694+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:51.814884+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2161000/0x0/0x4ffc00000, data 0x4d9a4f6/0x4e6b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:52.815082+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2334846 data_alloc: 251658240 data_used: 29822976
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:53.815310+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 31.951963425s of 32.214107513s, submitted: 106
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:54.815501+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215f000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:55.815661+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215f000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:56.815914+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:57.816227+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193167360 unmapped: 48021504 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2335150 data_alloc: 251658240 data_used: 29822976
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:58.816377+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f33c9e5a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f35b574a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:35:59.816581+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215f000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:00.816952+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:01.817187+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:02.817728+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2335150 data_alloc: 251658240 data_used: 29822976
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:03.818290+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215f000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:04.818487+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193183744 unmapped: 48005120 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215f000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f215f000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:05.818829+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193200128 unmapped: 47988736 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:06.819144+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193200128 unmapped: 47988736 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:07.819343+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193200128 unmapped: 47988736 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2335150 data_alloc: 251658240 data_used: 29822976
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:08.819542+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193200128 unmapped: 47988736 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.455149651s of 15.468193054s, submitted: 3
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:09.819732+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193232896 unmapped: 47955968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:10.819937+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193232896 unmapped: 47955968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:11.820210+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193232896 unmapped: 47955968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:12.820429+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193232896 unmapped: 47955968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:13.820790+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193249280 unmapped: 47939584 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:14.821018+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193249280 unmapped: 47939584 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:15.821204+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193249280 unmapped: 47939584 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:16.821353+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193257472 unmapped: 47931392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:17.821656+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193257472 unmapped: 47931392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:18.821885+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193257472 unmapped: 47931392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:19.822107+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193257472 unmapped: 47931392 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:20.822319+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193273856 unmapped: 47915008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:21.822529+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193273856 unmapped: 47915008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:22.822718+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193273856 unmapped: 47915008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:23.822875+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193273856 unmapped: 47915008 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:24.823104+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193290240 unmapped: 47898624 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:25.823290+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193306624 unmapped: 47882240 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:26.823572+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35040000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193323008 unmapped: 47865856 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:27.823920+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193323008 unmapped: 47865856 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:28.824146+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c3f800 session 0x556f35968d20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f35e39a40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193323008 unmapped: 47865856 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:29.824344+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193323008 unmapped: 47865856 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:30.824534+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193323008 unmapped: 47865856 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:31.824724+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:32.824888+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:33.825098+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:34.825559+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:35.825754+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:36.826716+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:37.827133+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:38.827718+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193331200 unmapped: 47857664 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:39.828236+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193339392 unmapped: 47849472 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:40.828712+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2160000/0x0/0x4ffc00000, data 0x4d9b4f6/0x4e6c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193339392 unmapped: 47849472 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:41.829134+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193339392 unmapped: 47849472 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:42.829393+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f35cba5a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c000 session 0x556f32604f00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193339392 unmapped: 47849472 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2337150 data_alloc: 234881024 data_used: 29810688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:43.829571+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 34.486000061s of 34.496078491s, submitted: 13
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187080704 unmapped: 54108160 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f32f481e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:44.829819+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:45.830149+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f36a8000/0x0/0x4ffc00000, data 0x38534d3/0x3923000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:46.830327+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f36a8000/0x0/0x4ffc00000, data 0x38534d3/0x3923000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:47.830689+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2078550 data_alloc: 234881024 data_used: 15892480
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:48.830948+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:49.831156+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f36a8000/0x0/0x4ffc00000, data 0x38534d3/0x3923000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:50.831389+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:51.831553+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:52.831706+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35040000 session 0x556f356094a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187088896 unmapped: 54099968 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2078550 data_alloc: 234881024 data_used: 15892480
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35040000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:53.832041+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.035298347s of 10.127894402s, submitted: 34
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35040000 session 0x556f343910e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:54.832335+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:55.832572+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:56.832898+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:57.833201+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:58.833530+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:36:59.833823+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:00.834070+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:01.834429+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:02.834790+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:03.835003+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:04.835222+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:05.835431+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:06.835716+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:07.835956+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:08.836106+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:09.836332+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:10.836495+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:11.836706+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:12.836865+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:13.836987+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:14.837178+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:15.837325+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:16.837492+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:17.837664+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:18.837815+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:19.838025+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:20.838251+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:21.838407+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:22.838645+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:23.838837+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:24.838985+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:25.839142+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:26.839373+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:27.839665+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:28.839858+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:29.840017+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:30.840152+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:31.840404+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:32.840572+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175644672 unmapped: 65544192 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:33.840745+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175661056 unmapped: 65527808 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:34.840926+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175661056 unmapped: 65527808 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:35.841107+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175661056 unmapped: 65527808 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:36.841318+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175661056 unmapped: 65527808 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:37.841605+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175661056 unmapped: 65527808 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831594 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:38.841785+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35cd0780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f361fb4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f3594a960
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 175661056 unmapped: 65527808 heap: 241188864 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c000 session 0x556f35b48780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 45.478294373s of 45.484821320s, submitted: 2
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:39.841990+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35cbb4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f326050e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f36633a40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35040000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4136000/0x0/0x4ffc00000, data 0x2dc450c/0x2e96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [0,0,1])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35040000 session 0x556f33e23e00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d36800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d36800 session 0x556f32f4f4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:40.842179+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4136000/0x0/0x4ffc00000, data 0x2dc4545/0x2e96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:41.842372+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:42.842562+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1926394 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:43.842796+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35b80d20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:44.843002+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:45.843169+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4136000/0x0/0x4ffc00000, data 0x2dc4545/0x2e96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:46.843448+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f35609680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176922624 unmapped: 67944448 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:47.843706+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f35608d20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35040000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176930816 unmapped: 67936256 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35040000 session 0x556f35609a40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1928373 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:48.843871+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176939008 unmapped: 67928064 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:49.844035+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 176939008 unmapped: 67928064 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:50.844180+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4134000/0x0/0x4ffc00000, data 0x2dc4578/0x2e98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:51.844378+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4134000/0x0/0x4ffc00000, data 0x2dc4578/0x2e98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 3600.1 total, 600.0 interval
                                           Cumulative writes: 18K writes, 73K keys, 18K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.02 MB/s
                                           Cumulative WAL: 18K writes, 5879 syncs, 3.22 writes per sync, written: 0.07 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2910 writes, 11K keys, 2910 commit groups, 1.0 writes per commit group, ingest: 12.65 MB, 0.02 MB/s
                                           Interval WAL: 2910 writes, 1118 syncs, 2.60 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread fragmentation_score=0.000897 took=0.000048s
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:52.844589+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2006025 data_alloc: 234881024 data_used: 14098432
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:53.844926+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4134000/0x0/0x4ffc00000, data 0x2dc4578/0x2e98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:54.845141+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4134000/0x0/0x4ffc00000, data 0x2dc4578/0x2e98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:55.845348+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:56.845570+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:57.845804+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2006025 data_alloc: 234881024 data_used: 14098432
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:58.846057+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 178421760 unmapped: 66445312 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:37:59.846277+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.143117905s of 20.674848557s, submitted: 50
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 183222272 unmapped: 61644800 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:00.846464+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4134000/0x0/0x4ffc00000, data 0x2dc4578/0x2e98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [0,0,0,0,0,0,0,7,29,13])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186425344 unmapped: 58441728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:01.846677+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d01c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32d01c00 session 0x556f35e392c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35cd1e00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d01c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32d01c00 session 0x556f32604000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186499072 unmapped: 58368000 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f34c8f0e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:02.846948+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f35968b40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35040000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35040000 session 0x556f366334a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f3580cd20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d01c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32d01c00 session 0x556f3660c000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186433536 unmapped: 58433536 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f36632d20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34ca000/0x0/0x4ffc00000, data 0x3a2c5ea/0x3b02000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2112184 data_alloc: 234881024 data_used: 14172160
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:03.847149+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186433536 unmapped: 58433536 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:04.847304+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186433536 unmapped: 58433536 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:05.847491+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f3594be00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186433536 unmapped: 58433536 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:06.847678+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186433536 unmapped: 58433536 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:07.847930+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34ca000/0x0/0x4ffc00000, data 0x3a2c5ea/0x3b02000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186433536 unmapped: 58433536 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2111048 data_alloc: 234881024 data_used: 14172160
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:08.848146+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c43c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c43c00 session 0x556f32b1f0e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186449920 unmapped: 58417152 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:09.848476+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f34391860
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d01c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32d01c00 session 0x556f32b1ed20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e09800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:10.848686+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:11.848913+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34a1000/0x0/0x4ffc00000, data 0x3a555ea/0x3b2b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:12.849086+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:13.849273+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2118024 data_alloc: 234881024 data_used: 14798848
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.214136124s of 13.708144188s, submitted: 134
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:14.849463+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34a0000/0x0/0x4ffc00000, data 0x3a565ea/0x3b2c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:15.849632+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:16.849849+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:17.850079+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:18.850238+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2118200 data_alloc: 234881024 data_used: 14798848
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34a0000/0x0/0x4ffc00000, data 0x3a565ea/0x3b2c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:19.850424+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34a0000/0x0/0x4ffc00000, data 0x3a565ea/0x3b2c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:20.850550+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f34a0000/0x0/0x4ffc00000, data 0x3a565ea/0x3b2c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:21.850673+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186277888 unmapped: 58589184 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:22.850825+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187023360 unmapped: 57843712 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:23.851046+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2177924 data_alloc: 234881024 data_used: 15036416
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.751893997s of 10.002759933s, submitted: 63
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:24.851209+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:25.851398+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3f000/0x0/0x4ffc00000, data 0x41b75ea/0x428d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:26.851584+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:27.851892+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:28.852030+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2190656 data_alloc: 234881024 data_used: 15032320
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:29.852160+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3f000/0x0/0x4ffc00000, data 0x41b75ea/0x428d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:30.852313+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:31.852531+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d1b000/0x0/0x4ffc00000, data 0x41db5ea/0x42b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:32.852747+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:33.852966+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189432 data_alloc: 234881024 data_used: 15036416
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:34.853167+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:35.853379+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:36.853538+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:37.853774+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d1b000/0x0/0x4ffc00000, data 0x41db5ea/0x42b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:38.853955+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189432 data_alloc: 234881024 data_used: 15036416
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:39.854157+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:40.854306+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:41.854547+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.831987381s of 17.884922028s, submitted: 24
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:42.854708+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:43.854886+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189440 data_alloc: 234881024 data_used: 15036416
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:44.855045+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:45.855205+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:46.855382+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:47.855603+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:48.855995+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189440 data_alloc: 234881024 data_used: 15036416
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:49.856191+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:50.856362+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:51.856553+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:52.856725+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:53.856911+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189440 data_alloc: 234881024 data_used: 15036416
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.778059959s of 12.799883842s, submitted: 3
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:54.857069+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:55.857283+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:56.857520+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:57.857739+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:58.857879+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f3662be00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f34c8e780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189076 data_alloc: 234881024 data_used: 15044608
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:38:59.858078+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187760640 unmapped: 57106432 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:00.858244+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:01.858389+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:02.858540+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:03.858748+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189076 data_alloc: 234881024 data_used: 15044608
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:04.858925+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:05.859137+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:06.859356+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:07.860796+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187768832 unmapped: 57098240 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:08.861124+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187777024 unmapped: 57090048 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189076 data_alloc: 234881024 data_used: 15044608
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:09.861265+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187777024 unmapped: 57090048 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:10.861448+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187777024 unmapped: 57090048 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:11.861658+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.156997681s of 17.189851761s, submitted: 8
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e09800 session 0x556f356092c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f33d37400 session 0x556f3580cd20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187785216 unmapped: 57081856 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d18000/0x0/0x4ffc00000, data 0x41de5ea/0x42b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:12.861799+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32529000 session 0x556f35074b40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:13.861990+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3531000/0x0/0x4ffc00000, data 0x3964578/0x3a38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2108730 data_alloc: 234881024 data_used: 14180352
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:14.862164+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:15.862382+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:16.862593+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3531000/0x0/0x4ffc00000, data 0x3964578/0x3a38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:17.862917+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3531000/0x0/0x4ffc00000, data 0x3964578/0x3a38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:18.863085+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2108730 data_alloc: 234881024 data_used: 14180352
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:19.863260+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:20.863449+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3531000/0x0/0x4ffc00000, data 0x3964578/0x3a38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:21.863700+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:22.863868+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.129077911s of 11.197218895s, submitted: 23
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41c00 session 0x556f3595ad20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f3660dc20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:23.864112+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d01c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 187817984 unmapped: 57049088 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1864004 data_alloc: 218103808 data_used: 2670592
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32d01c00 session 0x556f35c363c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:24.864329+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:25.864511+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:26.864783+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:27.865135+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:28.865412+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1860199 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:29.865679+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:30.865941+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:31.866241+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489000 session 0x556f3662b0e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d01c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489400 session 0x556f33e234a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32529000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:32.866569+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:33.866713+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489800 session 0x556f33e23680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33d37400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1860199 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:34.866919+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:35.867179+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:36.867367+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:37.867592+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:38.867793+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1860199 data_alloc: 218103808 data_used: 2662400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:39.868010+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180035584 unmapped: 64831488 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d2f000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.397924423s of 17.578964233s, submitted: 48
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:40.868160+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180068352 unmapped: 64798720 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:41.868411+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180092928 unmapped: 64774144 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:42.868755+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180092928 unmapped: 64774144 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:43.868929+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180199424 unmapped: 64667648 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:44.869114+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180264960 unmapped: 64602112 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:45.869370+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180264960 unmapped: 64602112 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:46.869572+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180264960 unmapped: 64602112 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:47.869843+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:48.870022+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:49.870221+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:50.870464+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:51.870642+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:52.870849+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:53.871046+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:54.871386+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:55.871592+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:56.871783+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:57.872033+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:58.872173+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:39:59.872366+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:00.872717+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:01.872964+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:02.873115+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:03.873345+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:04.873556+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:05.873785+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180273152 unmapped: 64593920 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:06.874038+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:07.874339+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:08.874611+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:09.874908+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:10.875188+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:11.875520+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:12.875813+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4d32000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:13.876018+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 180281344 unmapped: 64585728 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1861743 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41c00 session 0x556f34390000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f34390780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f33e221e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:14.876227+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f3518c1e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35982c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 32.987571716s of 34.327899933s, submitted: 323
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35982c00 session 0x556f3660de00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41c00 session 0x556f34391c20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179527680 unmapped: 65339392 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f35e38d20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f35609680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f32e043c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:15.876399+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4879000/0x0/0x4ffc00000, data 0x2681544/0x2753000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:16.876714+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:17.876996+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:18.877153+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1906251 data_alloc: 218103808 data_used: 2723840
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:19.877350+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4879000/0x0/0x4ffc00000, data 0x2681544/0x2753000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:20.877539+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:21.877770+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179412992 unmapped: 65454080 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:22.877937+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063400 session 0x556f35e394a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4879000/0x0/0x4ffc00000, data 0x2681544/0x2753000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179748864 unmapped: 65118208 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:23.878173+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179765248 unmapped: 65101824 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1912553 data_alloc: 218103808 data_used: 2723840
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:24.878366+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:25.878572+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:26.878689+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:27.878877+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4851000/0x0/0x4ffc00000, data 0x26a8567/0x277b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:28.878988+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1944777 data_alloc: 218103808 data_used: 7385088
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:29.879198+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:30.879412+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4851000/0x0/0x4ffc00000, data 0x26a8567/0x277b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:31.879582+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:32.879773+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 179716096 unmapped: 65150976 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4851000/0x0/0x4ffc00000, data 0x26a8567/0x277b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:33.879925+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.128231049s of 19.337366104s, submitted: 54
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 181485568 unmapped: 63381504 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1964585 data_alloc: 218103808 data_used: 7409664
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:34.880072+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186572800 unmapped: 58294272 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:35.880256+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186621952 unmapped: 58245120 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:36.880491+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186982400 unmapped: 57884672 heap: 244867072 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f32f4e000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f35b56d20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f35e38780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f366321e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:37.880745+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f3579d4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: mgrc ms_handle_reset ms_handle_reset con 0x556f377dd000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/2285351161
Sep 30 19:04:02 compute-1 ceph-osd[78006]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/2285351161,v1:192.168.122.100:6801/2285351161]
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: get_auth_request con 0x556f35982c00 auth_method 0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: mgrc handle_mgr_configure stats_period=5
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f33c9f0e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f32fc5c20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f32fc41e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f359690e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:38.880975+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2190182 data_alloc: 218103808 data_used: 9437184
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2c0b000/0x0/0x4ffc00000, data 0x42ed576/0x43c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:39.881189+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2c0b000/0x0/0x4ffc00000, data 0x42ed576/0x43c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:40.881379+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:41.881562+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:42.881750+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:43.881952+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2bea000/0x0/0x4ffc00000, data 0x430e576/0x43e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2186966 data_alloc: 218103808 data_used: 9441280
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:44.882161+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:45.882389+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2bea000/0x0/0x4ffc00000, data 0x430e576/0x43e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:46.882687+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:47.882919+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186056704 unmapped: 63004672 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:48.883109+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c21c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.168335915s of 14.641261101s, submitted: 161
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c21c00 session 0x556f3595ba40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2bea000/0x0/0x4ffc00000, data 0x430e576/0x43e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186392576 unmapped: 62668800 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2189702 data_alloc: 218103808 data_used: 9441280
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:49.883322+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2bc3000/0x0/0x4ffc00000, data 0x4335576/0x4409000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186400768 unmapped: 62660608 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:50.883528+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192626688 unmapped: 56434688 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:51.883750+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:52.883940+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:53.884108+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2290090 data_alloc: 234881024 data_used: 24068096
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:54.884241+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2bc0000/0x0/0x4ffc00000, data 0x4338576/0x440c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:55.884405+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:56.884685+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:57.884875+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:58.885059+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193396736 unmapped: 55664640 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2290090 data_alloc: 234881024 data_used: 24068096
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:40:59.885198+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2bc0000/0x0/0x4ffc00000, data 0x4338576/0x440c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193404928 unmapped: 55656448 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.603348732s of 11.623636246s, submitted: 4
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:00.885314+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200278016 unmapped: 48783360 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:01.885502+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198844416 unmapped: 50216960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1de5000/0x0/0x4ffc00000, data 0x5113576/0x51e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:02.885674+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198844416 unmapped: 50216960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:03.885876+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198852608 unmapped: 50208768 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2405546 data_alloc: 234881024 data_used: 25587712
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:04.886084+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198852608 unmapped: 50208768 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:05.886252+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198860800 unmapped: 50200576 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:06.886476+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1de5000/0x0/0x4ffc00000, data 0x5113576/0x51e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198860800 unmapped: 50200576 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:07.886740+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198860800 unmapped: 50200576 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:08.886929+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198860800 unmapped: 50200576 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2408242 data_alloc: 234881024 data_used: 25587712
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:09.887075+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbf000/0x0/0x4ffc00000, data 0x5138576/0x520c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198860800 unmapped: 50200576 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:10.887288+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbf000/0x0/0x4ffc00000, data 0x5138576/0x520c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198860800 unmapped: 50200576 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:11.887490+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198868992 unmapped: 50192384 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:12.887705+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198868992 unmapped: 50192384 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:13.887967+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbf000/0x0/0x4ffc00000, data 0x5138576/0x520c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198877184 unmapped: 50184192 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2408242 data_alloc: 234881024 data_used: 25587712
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:14.889046+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198877184 unmapped: 50184192 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:15.889236+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198877184 unmapped: 50184192 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:16.889408+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198877184 unmapped: 50184192 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:17.889663+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbf000/0x0/0x4ffc00000, data 0x5138576/0x520c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198877184 unmapped: 50184192 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:18.889898+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198885376 unmapped: 50176000 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2408242 data_alloc: 234881024 data_used: 25587712
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:19.890116+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.138525009s of 19.493839264s, submitted: 153
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbf000/0x0/0x4ffc00000, data 0x5138576/0x520c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198885376 unmapped: 50176000 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:20.900548+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198885376 unmapped: 50176000 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:21.900691+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198901760 unmapped: 50159616 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:22.905022+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198901760 unmapped: 50159616 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063400 session 0x556f35c365a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41c00 session 0x556f359681e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:23.906220+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198909952 unmapped: 50151424 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2406742 data_alloc: 234881024 data_used: 25587712
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:24.906739+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198909952 unmapped: 50151424 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:25.906892+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198909952 unmapped: 50151424 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:26.907067+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198909952 unmapped: 50151424 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:27.907323+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198918144 unmapped: 50143232 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:28.907456+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198918144 unmapped: 50143232 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2406742 data_alloc: 234881024 data_used: 25587712
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:29.907675+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198926336 unmapped: 50135040 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:30.907816+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198926336 unmapped: 50135040 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:31.907951+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.510564804s of 12.544851303s, submitted: 10
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198934528 unmapped: 50126848 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:32.908128+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198934528 unmapped: 50126848 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:33.908293+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198942720 unmapped: 50118656 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2407582 data_alloc: 234881024 data_used: 25587712
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:34.908494+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198942720 unmapped: 50118656 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:35.908799+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198950912 unmapped: 50110464 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:36.908946+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198950912 unmapped: 50110464 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:37.909191+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198959104 unmapped: 50102272 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:38.909306+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198959104 unmapped: 50102272 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2407582 data_alloc: 234881024 data_used: 25587712
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:39.909475+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198959104 unmapped: 50102272 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:40.909709+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198959104 unmapped: 50102272 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:41.909890+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f35e39680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f35e394a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198967296 unmapped: 50094080 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:42.910054+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1dbd000/0x0/0x4ffc00000, data 0x513b576/0x520f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.669682503s of 10.687623978s, submitted: 7
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f35cd01e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:43.910209+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2089802 data_alloc: 218103808 data_used: 9441280
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:44.910361+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:45.910560+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3a37000/0x0/0x4ffc00000, data 0x34c2567/0x3595000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:46.910754+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:47.910991+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:48.911164+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2089802 data_alloc: 218103808 data_used: 9441280
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:49.911287+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3a37000/0x0/0x4ffc00000, data 0x34c2567/0x3595000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f32605860
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 188604416 unmapped: 60456960 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f3660d2c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:50.911513+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184254464 unmapped: 64806912 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:51.911665+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f32e05860
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:52.911827+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:53.911959+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1897626 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:54.913264+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4a00000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:55.913497+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4a00000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:56.913685+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:57.913893+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:58.914046+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4a00000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1897626 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:41:59.914259+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4a00000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:00.914758+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:01.914940+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:02.915129+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:03.915356+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1897626 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:04.915555+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4a00000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184262656 unmapped: 64798720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:05.915838+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f35c363c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41c00 session 0x556f356094a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f339ccf00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f35e38960
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 22.721506119s of 23.028820038s, submitted: 97
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f3660de00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:06.915996+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:07.916216+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:08.916410+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:09.916681+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1995012 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:10.916899+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f413b000/0x0/0x4ffc00000, data 0x2dc14d3/0x2e91000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:11.917137+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:12.917295+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:13.917491+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:14.917714+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1995012 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f413b000/0x0/0x4ffc00000, data 0x2dc14d3/0x2e91000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:15.917932+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:16.918136+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184713216 unmapped: 64348160 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.657385826s of 10.751065254s, submitted: 22
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f35b561e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:17.918360+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184647680 unmapped: 64413696 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:18.918497+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 184811520 unmapped: 64249856 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:19.918701+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2072750 data_alloc: 234881024 data_used: 13926400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f413a000/0x0/0x4ffc00000, data 0x2dc14f6/0x2e92000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:20.918914+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f413a000/0x0/0x4ffc00000, data 0x2dc14f6/0x2e92000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:21.919115+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:22.919290+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:23.919487+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f413a000/0x0/0x4ffc00000, data 0x2dc14f6/0x2e92000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:24.919708+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2072750 data_alloc: 234881024 data_used: 13926400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:25.919952+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f413a000/0x0/0x4ffc00000, data 0x2dc14f6/0x2e92000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:26.920152+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:27.920399+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 186875904 unmapped: 62185472 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.540970802s of 11.580094337s, submitted: 10
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:28.920544+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 191766528 unmapped: 57294848 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:29.920684+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192118784 unmapped: 56942592 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2133880 data_alloc: 234881024 data_used: 14221312
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3958000/0x0/0x4ffc00000, data 0x359d4f6/0x366e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:30.920857+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192118784 unmapped: 56942592 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:31.921020+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192495616 unmapped: 56565760 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:32.921138+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192495616 unmapped: 56565760 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f38ca000/0x0/0x4ffc00000, data 0x36284f6/0x36f9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3509c800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3509c800 session 0x556f3660cb40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:33.921231+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f35cd1c20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f33c9f4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192503808 unmapped: 56557568 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f3662ad20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f35b57680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2c00 session 0x556f33c9ed20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f358885a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f3580d2c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f34c8eb40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:34.921397+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192954368 unmapped: 56107008 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2238722 data_alloc: 234881024 data_used: 14036992
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:35.921659+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192954368 unmapped: 56107008 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d60000/0x0/0x4ffc00000, data 0x4199568/0x426c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:36.921849+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192954368 unmapped: 56107008 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:37.922081+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192954368 unmapped: 56107008 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:38.922174+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f3595a960
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192962560 unmapped: 56098816 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:39.922380+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192962560 unmapped: 56098816 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2237130 data_alloc: 234881024 data_used: 14036992
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3f000/0x0/0x4ffc00000, data 0x41ba568/0x428d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:40.922564+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192962560 unmapped: 56098816 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35812c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35812c00 session 0x556f32f485a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:41.922715+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192970752 unmapped: 56090624 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:42.922898+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192970752 unmapped: 56090624 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f3579dc20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.918482780s of 14.369065285s, submitted: 147
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f3518cf00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:43.923072+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192970752 unmapped: 56090624 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35810000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:44.923220+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 192962560 unmapped: 56098816 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2239212 data_alloc: 234881024 data_used: 14045184
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3e000/0x0/0x4ffc00000, data 0x41ba578/0x428e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3e000/0x0/0x4ffc00000, data 0x41ba578/0x428e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:45.923386+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197615616 unmapped: 51445760 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3e000/0x0/0x4ffc00000, data 0x41ba578/0x428e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3e000/0x0/0x4ffc00000, data 0x41ba578/0x428e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:46.923606+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198246400 unmapped: 50814976 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:47.923877+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198246400 unmapped: 50814976 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:48.924074+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198246400 unmapped: 50814976 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:49.924238+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197574656 unmapped: 51486720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2312092 data_alloc: 234881024 data_used: 24698880
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:50.924439+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197574656 unmapped: 51486720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2d3b000/0x0/0x4ffc00000, data 0x41bd578/0x4291000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:51.924611+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197574656 unmapped: 51486720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:52.924834+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197574656 unmapped: 51486720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:53.925041+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197574656 unmapped: 51486720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:54.925218+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197574656 unmapped: 51486720 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2312548 data_alloc: 234881024 data_used: 24711168
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.180825233s of 12.196196556s, submitted: 4
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:55.925417+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206544896 unmapped: 42516480 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a26000/0x0/0x4ffc00000, data 0x54cc578/0x55a0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:56.925569+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205701120 unmapped: 43360256 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:57.925789+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205701120 unmapped: 43360256 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:58.925935+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f196d000/0x0/0x4ffc00000, data 0x557c578/0x5650000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:42:59.926142+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2470156 data_alloc: 234881024 data_used: 24805376
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f196d000/0x0/0x4ffc00000, data 0x557c578/0x5650000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:00.926338+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:01.926536+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:02.926748+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1958000/0x0/0x4ffc00000, data 0x55a0578/0x5674000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:03.926929+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:04.927123+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462620 data_alloc: 234881024 data_used: 24809472
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1958000/0x0/0x4ffc00000, data 0x55a0578/0x5674000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:05.927369+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1958000/0x0/0x4ffc00000, data 0x55a0578/0x5674000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:06.927559+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:07.927873+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1958000/0x0/0x4ffc00000, data 0x55a0578/0x5674000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:08.928064+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:09.928263+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462620 data_alloc: 234881024 data_used: 24809472
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:10.928508+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205824000 unmapped: 43237376 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1958000/0x0/0x4ffc00000, data 0x55a0578/0x5674000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:11.928717+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.743667603s of 17.184471130s, submitted: 187
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:12.928938+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:13.929185+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1958000/0x0/0x4ffc00000, data 0x55a0578/0x5674000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:14.929349+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462780 data_alloc: 234881024 data_used: 24797184
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:15.929511+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:16.929698+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:17.929924+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205832192 unmapped: 43229184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:18.930093+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205840384 unmapped: 43220992 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:19.930262+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205840384 unmapped: 43220992 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462780 data_alloc: 234881024 data_used: 24797184
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3bd45800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:20.930420+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205856768 unmapped: 43204608 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:21.930662+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205864960 unmapped: 43196416 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063400 session 0x556f35889c20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063c00 session 0x556f35969860
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:22.930829+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205864960 unmapped: 43196416 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:23.931021+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205864960 unmapped: 43196416 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:24.931219+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205864960 unmapped: 43196416 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462428 data_alloc: 234881024 data_used: 24801280
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:25.931452+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205864960 unmapped: 43196416 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:26.931601+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205873152 unmapped: 43188224 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.004239082s of 15.045738220s, submitted: 18
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:27.931817+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205881344 unmapped: 43180032 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:28.931990+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205881344 unmapped: 43180032 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:29.932135+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205881344 unmapped: 43180032 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462932 data_alloc: 234881024 data_used: 24801280
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:30.932296+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205889536 unmapped: 43171840 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:31.932478+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205889536 unmapped: 43171840 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:32.932673+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205905920 unmapped: 43155456 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:33.932844+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205905920 unmapped: 43155456 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:34.933065+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205905920 unmapped: 43155456 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2462932 data_alloc: 234881024 data_used: 24801280
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1955000/0x0/0x4ffc00000, data 0x55a3578/0x5677000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x8c2f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:35.933231+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205914112 unmapped: 43147264 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:36.933374+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205914112 unmapped: 43147264 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.050238609s of 10.060816765s, submitted: 3
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35810000 session 0x556f3579cd20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f32b1f680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:37.933551+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 205922304 unmapped: 43139072 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:38.933720+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f32fc5e00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:39.933873+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f349a000/0x0/0x4ffc00000, data 0x364c4f6/0x371d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2161623 data_alloc: 234881024 data_used: 14028800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f349a000/0x0/0x4ffc00000, data 0x364c4f6/0x371d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:40.934057+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:41.934221+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:42.934380+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:43.934503+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:44.934706+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2161623 data_alloc: 234881024 data_used: 14028800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f34390780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3bd45800 session 0x556f35075680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:45.935575+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f349a000/0x0/0x4ffc00000, data 0x364c4f6/0x371d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 50569216 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c41000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:46.935692+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c41000 session 0x556f361fb4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:47.935894+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:48.936099+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:49.936251+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:50.936393+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:51.936537+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:52.936710+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:53.936924+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:54.937079+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:55.937222+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:56.937499+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:57.937796+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:58.938384+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:43:59.938736+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:00.938907+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:01.939114+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:02.939271+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:03.939434+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:04.939654+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:05.939857+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:06.940091+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:07.940289+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:08.940478+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:09.940666+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:10.940839+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:11.941020+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:12.941190+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:13.941355+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:14.941504+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:15.941695+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:16.941849+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:17.942066+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:18.942233+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:19.942467+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:20.942761+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:21.943012+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:22.943266+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:23.943468+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:24.943599+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194355200 unmapped: 54706176 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:25.943814+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:26.943939+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:27.944124+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:28.944305+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:29.944527+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:30.944705+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:31.944921+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:32.945281+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f4921000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:33.945418+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194363392 unmapped: 54697984 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:34.945609+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f35c363c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f32e05860
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f3660d2c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 194371584 unmapped: 54689792 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1933157 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3bd45800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3bd45800 session 0x556f35cd01e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 57.805438995s of 58.061119080s, submitted: 87
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063400 session 0x556f35e394a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35063400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35063400 session 0x556f3595ba40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f32fc41e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:35.945828+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f32fc5c20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35816000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35816000 session 0x556f35969680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:36.945940+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:37.946132+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:38.946271+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46de000/0x0/0x4ffc00000, data 0x240d4e3/0x24de000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:39.946437+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46de000/0x0/0x4ffc00000, data 0x240d4e3/0x24de000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1965949 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46de000/0x0/0x4ffc00000, data 0x240d4e3/0x24de000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:40.946575+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46de000/0x0/0x4ffc00000, data 0x240d4e3/0x24de000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:41.946772+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193544192 unmapped: 55517184 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:42.946912+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3bd45800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193806336 unmapped: 55255040 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3bd45800 session 0x556f3662a5a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:43.947075+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193814528 unmapped: 55246848 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:44.947208+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1985095 data_alloc: 218103808 data_used: 4546560
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46b6000/0x0/0x4ffc00000, data 0x2434506/0x2506000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:45.947333+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:46.947513+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:47.947710+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46b6000/0x0/0x4ffc00000, data 0x2434506/0x2506000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:48.947922+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f46b6000/0x0/0x4ffc00000, data 0x2434506/0x2506000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:49.948154+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1986311 data_alloc: 218103808 data_used: 4722688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:50.948350+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193855488 unmapped: 55205888 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:51.948558+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193863680 unmapped: 55197696 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:52.948715+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193863680 unmapped: 55197696 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:53.948890+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 193863680 unmapped: 55197696 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:54.949047+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.013702393s of 19.293170929s, submitted: 27
Sep 30 19:04:02 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 195780608 unmapped: 53280768 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2070877 data_alloc: 218103808 data_used: 6144000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3c9f000/0x0/0x4ffc00000, data 0x2e3d506/0x2f0f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x903f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:55.949174+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f339cf800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f339cf800 session 0x556f32f55c20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f35b81c20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489400 session 0x556f35988960
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35175800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35175800 session 0x556f35b57c20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35814400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 203431936 unmapped: 45629440 heap: 249061376 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35814400 session 0x556f35e39c20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f34390000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f339cf800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f339cf800 session 0x556f32e05c20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35175800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35175800 session 0x556f32fc4d20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489400 session 0x556f33c9e5a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:56.949371+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197599232 unmapped: 55140352 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:57.949606+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197599232 unmapped: 55140352 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:58.949868+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197599232 unmapped: 55140352 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2260000/0x0/0x4ffc00000, data 0x36e3516/0x37b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:44:59.950081+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197599232 unmapped: 55140352 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2146787 data_alloc: 218103808 data_used: 6115328
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:00.950280+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2260000/0x0/0x4ffc00000, data 0x36e3516/0x37b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197599232 unmapped: 55140352 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:01.950684+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197607424 unmapped: 55132160 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:02.950920+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35811800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35811800 session 0x556f3580c3c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197607424 unmapped: 55132160 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:03.951136+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197607424 unmapped: 55132160 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:04.951326+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f32f53680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197607424 unmapped: 55132160 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2143907 data_alloc: 218103808 data_used: 6115328
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:05.951499+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2242000/0x0/0x4ffc00000, data 0x3707516/0x37da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197607424 unmapped: 55132160 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2242000/0x0/0x4ffc00000, data 0x3707516/0x37da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:06.951683+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f339cf800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f339cf800 session 0x556f32604000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35175800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.983986855s of 12.415661812s, submitted: 164
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35175800 session 0x556f32b1f4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197779456 unmapped: 54960128 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:07.951903+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197779456 unmapped: 54960128 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:08.952087+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2219000/0x0/0x4ffc00000, data 0x372e549/0x3803000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 197779456 unmapped: 54960128 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:09.952232+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2205390 data_alloc: 234881024 data_used: 14196736
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:10.952382+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:11.952612+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:12.952878+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2219000/0x0/0x4ffc00000, data 0x372e549/0x3803000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:13.953077+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:14.953245+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2216000/0x0/0x4ffc00000, data 0x3731549/0x3806000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2206022 data_alloc: 234881024 data_used: 14200832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:15.953505+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:16.953698+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:17.953935+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2216000/0x0/0x4ffc00000, data 0x3731549/0x3806000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198156288 unmapped: 54583296 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:18.954129+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.719206810s of 11.756697655s, submitted: 8
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202473472 unmapped: 50266112 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:19.954296+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202506240 unmapped: 50233344 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2311960 data_alloc: 234881024 data_used: 15716352
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:20.954464+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1516000/0x0/0x4ffc00000, data 0x4431549/0x4506000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 203333632 unmapped: 49405952 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1516000/0x0/0x4ffc00000, data 0x4431549/0x4506000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:21.954773+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489400 session 0x556f3662a1e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f35b48780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 203341824 unmapped: 49397760 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:22.954954+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f35609860
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198361088 unmapped: 54378496 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:23.955112+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198361088 unmapped: 54378496 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:24.955294+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198361088 unmapped: 54378496 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2098061 data_alloc: 218103808 data_used: 6115328
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:25.955486+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f15a6000/0x0/0x4ffc00000, data 0x2f0e506/0x2fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198361088 unmapped: 54378496 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:26.955732+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198393856 unmapped: 54345728 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:27.955957+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2a3c000/0x0/0x4ffc00000, data 0x2f0e506/0x2fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198393856 unmapped: 54345728 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:28.956157+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198393856 unmapped: 54345728 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f2a3c000/0x0/0x4ffc00000, data 0x2f0e506/0x2fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:29.956327+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198393856 unmapped: 54345728 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2088449 data_alloc: 218103808 data_used: 6115328
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:30.956516+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f361fad20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.614155769s of 12.000268936s, submitted: 163
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f3594a960
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198393856 unmapped: 54345728 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:31.956740+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f34c8f4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:32.956891+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:33.957056+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:34.957246+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:35.957456+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:36.957603+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:37.957840+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:38.958009+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:39.958250+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:40.958474+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:41.958683+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:42.958910+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196616192 unmapped: 56123392 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:43.959128+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196624384 unmapped: 56115200 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:44.959324+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196624384 unmapped: 56115200 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:45.959565+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196624384 unmapped: 56115200 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:46.959727+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196632576 unmapped: 56107008 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:47.959924+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196632576 unmapped: 56107008 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:48.960104+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196632576 unmapped: 56107008 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:49.960290+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196632576 unmapped: 56107008 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:50.960526+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196632576 unmapped: 56107008 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:51.960798+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196640768 unmapped: 56098816 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:52.961045+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196640768 unmapped: 56098816 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:53.961276+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196640768 unmapped: 56098816 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:54.961490+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196640768 unmapped: 56098816 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:55.961751+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196640768 unmapped: 56098816 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:56.961985+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196640768 unmapped: 56098816 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:57.962170+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:58.962358+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:45:59.962561+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:00.962766+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:01.962974+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:02.963161+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:03.963336+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:04.963534+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:05.963685+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:06.963856+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:07.964098+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:08.964359+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:09.964523+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:10.964704+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:11.964935+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:12.965176+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:13.965380+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:14.965611+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:15.965915+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:16.966130+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196648960 unmapped: 56090624 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:17.966401+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:18.966605+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:19.966856+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:20.967142+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:21.967371+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:22.967677+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:23.967912+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:24.968112+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196657152 unmapped: 56082432 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:25.968289+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:26.968452+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:27.968704+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:28.968869+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:29.969067+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:30.969241+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:31.969464+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f3781000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa1df9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:32.969754+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:33.969993+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196665344 unmapped: 56074240 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:34.970173+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196673536 unmapped: 56066048 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:35.970409+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 196673536 unmapped: 56066048 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1959085 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f339cf800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 65.057006836s of 65.169685364s, submitted: 38
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f339cf800 session 0x556f3580cb40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:36.970710+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198975488 unmapped: 53764096 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:37.970958+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198975488 unmapped: 53764096 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1ebb000/0x0/0x4ffc00000, data 0x28f14d3/0x29c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:38.971125+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198975488 unmapped: 53764096 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:39.971328+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198975488 unmapped: 53764096 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f34390960
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:40.971552+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198983680 unmapped: 53755904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2016021 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:41.971727+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198983680 unmapped: 53755904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1ebb000/0x0/0x4ffc00000, data 0x28f14d3/0x29c1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f35b57e00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:42.971909+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198983680 unmapped: 53755904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f35074780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f361fa000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:43.972219+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 199319552 unmapped: 53420032 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35175800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3a489400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:44.972415+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 199319552 unmapped: 53420032 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:45.972599+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 199319552 unmapped: 53420032 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2046651 data_alloc: 218103808 data_used: 6647808
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:46.972900+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:47.973117+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1e93000/0x0/0x4ffc00000, data 0x29184e3/0x29e9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:48.973388+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:49.973551+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:50.973835+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2062763 data_alloc: 218103808 data_used: 9084928
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:51.974014+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:52.974273+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200007680 unmapped: 52731904 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1e93000/0x0/0x4ffc00000, data 0x29184e3/0x29e9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:53.974487+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200015872 unmapped: 52723712 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:54.974685+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200015872 unmapped: 52723712 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.052993774s of 19.121305466s, submitted: 15
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:55.974927+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202653696 unmapped: 50085888 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2145111 data_alloc: 218103808 data_used: 9428992
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:56.975222+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202661888 unmapped: 50077696 heap: 252739584 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e08c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e08c00 session 0x556f3518c960
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f34391c20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f35b49e00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f3595b680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f366325a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1411000/0x0/0x4ffc00000, data 0x339851c/0x346b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32d00400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32d00400 session 0x556f3518d4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f33e23e00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f35e38780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f3662b4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:57.976116+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:58.976407+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:46:59.976569+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c5b000/0x0/0x4ffc00000, data 0x3b4e555/0x3c21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:00.976766+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2216636 data_alloc: 218103808 data_used: 9961472
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:01.976987+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f34390780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:02.977223+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c5b000/0x0/0x4ffc00000, data 0x3b4e555/0x3c21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:03.977498+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c5b000/0x0/0x4ffc00000, data 0x3b4e555/0x3c21000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:04.977764+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e0bc00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e0bc00 session 0x556f3660d680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:05.977999+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 202899456 unmapped: 57188352 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2216636 data_alloc: 218103808 data_used: 9961472
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f35969860
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.743741989s of 11.034352303s, submitted: 93
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f361fb4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:06.978145+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 203079680 unmapped: 57008128 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:07.978343+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 203079680 unmapped: 57008128 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:08.978509+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 203931648 unmapped: 56156160 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c33000/0x0/0x4ffc00000, data 0x3b75565/0x3c49000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:09.978724+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206069760 unmapped: 54018048 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c33000/0x0/0x4ffc00000, data 0x3b75565/0x3c49000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:10.978891+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206069760 unmapped: 54018048 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2265890 data_alloc: 234881024 data_used: 16728064
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c33000/0x0/0x4ffc00000, data 0x3b75565/0x3c49000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:11.979096+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206077952 unmapped: 54009856 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:12.979311+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206077952 unmapped: 54009856 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:13.979513+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206077952 unmapped: 54009856 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:14.979725+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206077952 unmapped: 54009856 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c33000/0x0/0x4ffc00000, data 0x3b75565/0x3c49000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:15.979980+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206086144 unmapped: 54001664 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2265890 data_alloc: 234881024 data_used: 16728064
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:16.980166+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206086144 unmapped: 54001664 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c33000/0x0/0x4ffc00000, data 0x3b75565/0x3c49000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:17.980383+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206086144 unmapped: 54001664 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.923741341s of 11.933808327s, submitted: 2
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:18.980552+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 207986688 unmapped: 52101120 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:19.980689+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f01c8000/0x0/0x4ffc00000, data 0x45df565/0x46b3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:20.980843+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2360498 data_alloc: 234881024 data_used: 17391616
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0153000/0x0/0x4ffc00000, data 0x4654565/0x4728000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:21.981006+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:22.981145+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:23.981354+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:24.981511+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:25.981781+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2357026 data_alloc: 234881024 data_used: 17395712
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0135000/0x0/0x4ffc00000, data 0x4673565/0x4747000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:26.982002+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0135000/0x0/0x4ffc00000, data 0x4673565/0x4747000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:27.982257+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:28.982475+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:29.982754+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209838080 unmapped: 50249728 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:30.982989+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.518098831s of 12.787870407s, submitted: 84
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0133000/0x0/0x4ffc00000, data 0x4674565/0x4748000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2357930 data_alloc: 234881024 data_used: 17395712
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:31.983138+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:32.983281+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:33.983503+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:34.983737+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:35.984012+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2357930 data_alloc: 234881024 data_used: 17395712
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:36.984171+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f012a000/0x0/0x4ffc00000, data 0x467e565/0x4752000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210935808 unmapped: 49152000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:37.984362+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f012a000/0x0/0x4ffc00000, data 0x467e565/0x4752000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209903616 unmapped: 50184192 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0127000/0x0/0x4ffc00000, data 0x4681565/0x4755000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:38.984544+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209911808 unmapped: 50176000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35175800 session 0x556f32f530e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3a489400 session 0x556f35cd01e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:39.984689+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209911808 unmapped: 50176000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:40.984851+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2358122 data_alloc: 234881024 data_used: 17403904
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209911808 unmapped: 50176000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0127000/0x0/0x4ffc00000, data 0x4681565/0x4755000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:41.985018+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209911808 unmapped: 50176000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0127000/0x0/0x4ffc00000, data 0x4681565/0x4755000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:42.985221+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209911808 unmapped: 50176000 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:43.985401+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209920000 unmapped: 50167808 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:44.985582+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0127000/0x0/0x4ffc00000, data 0x4681565/0x4755000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209920000 unmapped: 50167808 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:45.985764+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2358122 data_alloc: 234881024 data_used: 17403904
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209920000 unmapped: 50167808 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:46.985933+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209928192 unmapped: 50159616 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:47.986481+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209928192 unmapped: 50159616 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:48.986932+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209936384 unmapped: 50151424 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:49.987096+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0127000/0x0/0x4ffc00000, data 0x4681565/0x4755000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209936384 unmapped: 50151424 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.184453964s of 19.200132370s, submitted: 4
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:50.987590+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2358122 data_alloc: 234881024 data_used: 17403904
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209936384 unmapped: 50151424 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:51.987809+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0127000/0x0/0x4ffc00000, data 0x4681565/0x4755000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209936384 unmapped: 50151424 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 4200.1 total, 600.0 interval
                                           Cumulative writes: 22K writes, 87K keys, 22K commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.02 MB/s
                                           Cumulative WAL: 22K writes, 7266 syncs, 3.10 writes per sync, written: 0.08 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 3577 writes, 13K keys, 3577 commit groups, 1.0 writes per commit group, ingest: 15.54 MB, 0.03 MB/s
                                           Interval WAL: 3577 writes, 1387 syncs, 2.58 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:52.988167+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209936384 unmapped: 50151424 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:53.988523+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209936384 unmapped: 50151424 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:54.988826+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209944576 unmapped: 50143232 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:55.989075+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f3660de00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f35b56d20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2358050 data_alloc: 234881024 data_used: 17403904
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209944576 unmapped: 50143232 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:56.989242+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f3662a1e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f140f000/0x0/0x4ffc00000, data 0x33994e3/0x346a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:57.989470+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:58.989708+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:47:59.989894+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:00.990032+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2166067 data_alloc: 218103808 data_used: 9969664
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f140f000/0x0/0x4ffc00000, data 0x33994e3/0x346a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:01.990182+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f140f000/0x0/0x4ffc00000, data 0x33994e3/0x346a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:02.990337+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f140f000/0x0/0x4ffc00000, data 0x33994e3/0x346a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:03.990526+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f140f000/0x0/0x4ffc00000, data 0x33994e3/0x346a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:04.990710+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c800 session 0x556f32f4f4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 206503936 unmapped: 53583872 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40000 session 0x556f3660dc20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f140f000/0x0/0x4ffc00000, data 0x33994e3/0x346a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:05.990842+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2166067 data_alloc: 218103808 data_used: 9969664
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.943166733s of 16.025777817s, submitted: 27
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200450048 unmapped: 59637760 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f35b56780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:06.991021+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:07.991232+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:08.991419+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:09.991595+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:10.991801+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1984106 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:11.991954+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:12.992146+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:13.992302+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:14.992604+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:15.992853+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1984106 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:16.993114+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:17.993365+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:18.993566+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:19.993707+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:20.993876+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1984106 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:21.994121+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:22.994364+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:23.994516+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:24.994716+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:25.994918+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1984106 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Sep 30 19:04:02 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/209043629' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:26.995126+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:27.995363+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:28.995537+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:29.995705+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:30.995849+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1984106 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:31.996032+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:32.996199+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25e2000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:33.996404+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:34.996688+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f3518d860
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f32f46d20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40000 session 0x556f35c363c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f35cd03c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198492160 unmapped: 61595648 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3b21c800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 29.204212189s of 29.241395950s, submitted: 14
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3b21c800 session 0x556f35b81e00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f3518d0e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f32fe7e00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40000 session 0x556f32337e00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:35.996866+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34cd2000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34cd2000 session 0x556f35609680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2073535 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:36.997050+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:37.997271+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:38.997496+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9f000/0x0/0x4ffc00000, data 0x2d0b545/0x2ddd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:39.997714+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:40.997902+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2073535 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:41.998078+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198729728 unmapped: 61358080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:42.998249+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198737920 unmapped: 61349888 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:43.998466+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34c40800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f34c40800 session 0x556f35b80b40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9f000/0x0/0x4ffc00000, data 0x2d0b545/0x2ddd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198746112 unmapped: 61341696 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:44.998739+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 198754304 unmapped: 61333504 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:45.998910+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2130217 data_alloc: 218103808 data_used: 10846208
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 200564736 unmapped: 59523072 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9e000/0x0/0x4ffc00000, data 0x2d0b568/0x2dde000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:46.999075+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:47.999268+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9e000/0x0/0x4ffc00000, data 0x2d0b568/0x2dde000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:48.999494+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:49.999771+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9e000/0x0/0x4ffc00000, data 0x2d0b568/0x2dde000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:50.999954+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2154233 data_alloc: 234881024 data_used: 14422016
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:52.000158+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:53.000394+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9e000/0x0/0x4ffc00000, data 0x2d0b568/0x2dde000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:54.000727+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201662464 unmapped: 58425344 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:55.000942+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f1a9e000/0x0/0x4ffc00000, data 0x2d0b568/0x2dde000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.785074234s of 19.939979553s, submitted: 43
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 201998336 unmapped: 58089472 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:56.001122+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2276211 data_alloc: 234881024 data_used: 15269888
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 209100800 unmapped: 50987008 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:57.001291+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 208961536 unmapped: 51126272 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:58.001524+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 208969728 unmapped: 51118080 heap: 260087808 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c1f000/0x0/0x4ffc00000, data 0x3b8a568/0x3c5d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:48:59.001728+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c43400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c43400 session 0x556f3660d0e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210247680 unmapped: 58818560 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:00.001889+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb93000/0x0/0x4ffc00000, data 0x4c16568/0x4ce9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210247680 unmapped: 58818560 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:01.002065+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2409753 data_alloc: 234881024 data_used: 15323136
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210247680 unmapped: 58818560 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:02.002307+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210247680 unmapped: 58818560 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:03.002549+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb93000/0x0/0x4ffc00000, data 0x4c16568/0x4ce9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210264064 unmapped: 58802176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:04.002722+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210264064 unmapped: 58802176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:05.002917+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210272256 unmapped: 58793984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:06.003096+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2409769 data_alloc: 234881024 data_used: 15323136
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210272256 unmapped: 58793984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:07.003344+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.529235840s of 11.966050148s, submitted: 150
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f3580f000 session 0x556f35b56b40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210444288 unmapped: 58621952 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:08.003610+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210444288 unmapped: 58621952 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35102c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:09.003852+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 210444288 unmapped: 58621952 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:10.004040+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32e08400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 216334336 unmapped: 52731904 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:11.004194+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2527582 data_alloc: 251658240 data_used: 32378880
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:12.004353+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:13.004576+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:14.004748+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:15.005119+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:16.005322+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2527582 data_alloc: 251658240 data_used: 32378880
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:17.005482+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222273536 unmapped: 46792704 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:18.005742+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222289920 unmapped: 46776320 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:19.005871+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222289920 unmapped: 46776320 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:20.006064+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4efb6b000/0x0/0x4ffc00000, data 0x4c3d58b/0x4d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222314496 unmapped: 46751744 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:21.006281+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.418137550s of 13.440409660s, submitted: 5
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2630538 data_alloc: 251658240 data_used: 33140736
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 224985088 unmapped: 44081152 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:22.006559+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 225026048 unmapped: 44040192 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:23.006715+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226516992 unmapped: 42549248 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:24.006892+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226516992 unmapped: 42549248 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:25.007064+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226516992 unmapped: 42549248 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:26.007258+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eeddc000/0x0/0x4ffc00000, data 0x59b458b/0x5a88000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2640042 data_alloc: 251658240 data_used: 33394688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226516992 unmapped: 42549248 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:27.007413+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226516992 unmapped: 42549248 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:28.007609+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226557952 unmapped: 42508288 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:29.007819+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedd1000/0x0/0x4ffc00000, data 0x59d658b/0x5aaa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226557952 unmapped: 42508288 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:30.008917+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226557952 unmapped: 42508288 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:31.009117+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2635202 data_alloc: 251658240 data_used: 33394688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226557952 unmapped: 42508288 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:32.009335+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226557952 unmapped: 42508288 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:33.009543+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedd1000/0x0/0x4ffc00000, data 0x59d658b/0x5aaa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedd1000/0x0/0x4ffc00000, data 0x59d658b/0x5aaa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226566144 unmapped: 42500096 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:34.009725+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226566144 unmapped: 42500096 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:35.009862+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38302400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.498695374s of 14.815695763s, submitted: 151
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226566144 unmapped: 42500096 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:36.010012+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2635334 data_alloc: 251658240 data_used: 33394688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226574336 unmapped: 42491904 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:37.010144+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f35074960
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f35b57a40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226574336 unmapped: 42491904 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:38.010344+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedd2000/0x0/0x4ffc00000, data 0x59d658b/0x5aaa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226574336 unmapped: 42491904 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:39.010508+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226574336 unmapped: 42491904 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:40.010712+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226574336 unmapped: 42491904 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:41.010856+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2635642 data_alloc: 251658240 data_used: 33394688
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226590720 unmapped: 42475520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:42.011026+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226590720 unmapped: 42475520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:43.011223+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedcf000/0x0/0x4ffc00000, data 0x59d958b/0x5aad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [0,0,0,13])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226631680 unmapped: 42434560 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:44.011391+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226754560 unmapped: 42311680 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:45.011533+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226811904 unmapped: 42254336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:46.011682+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2637714 data_alloc: 251658240 data_used: 33382400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226811904 unmapped: 42254336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:47.011823+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226811904 unmapped: 42254336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:48.012021+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedcf000/0x0/0x4ffc00000, data 0x59d958b/0x5aad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226811904 unmapped: 42254336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:49.012241+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226811904 unmapped: 42254336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedcf000/0x0/0x4ffc00000, data 0x59d958b/0x5aad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:50.012464+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4eedcf000/0x0/0x4ffc00000, data 0x59d958b/0x5aad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32e08400 session 0x556f35b57e00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35102c00 session 0x556f35cba000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 226820096 unmapped: 42246144 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:51.012608+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.823111534s of 15.860321999s, submitted: 287
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2302666 data_alloc: 234881024 data_used: 15433728
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219635712 unmapped: 49430528 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f32619c00 session 0x556f35b490e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:52.012912+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219643904 unmapped: 49422336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:53.013077+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219643904 unmapped: 49422336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c1e000/0x0/0x4ffc00000, data 0x3b8b568/0x3c5e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:54.013219+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f0c1e000/0x0/0x4ffc00000, data 0x3b8b568/0x3c5e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219643904 unmapped: 49422336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:55.013371+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219643904 unmapped: 49422336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:56.013525+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2295426 data_alloc: 234881024 data_used: 15310848
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219643904 unmapped: 49422336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:57.013716+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219643904 unmapped: 49422336 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:58.013911+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f35c35c00 session 0x556f33c9f0e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f38302400 session 0x556f361fbe00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219586560 unmapped: 49479680 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:49:59.014062+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f330cac00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211197952 unmapped: 57868288 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:00.014214+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 ms_handle_reset con 0x556f330cac00 session 0x556f34c8eb40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4f6/0x229b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:01.014367+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:02.014531+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:03.014705+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:04.014914+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:05.015132+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:06.015345+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:07.015552+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:08.015813+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:09.016050+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:10.016265+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:11.016478+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:12.016732+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:13.016926+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:14.017208+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:15.017452+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:16.017847+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211222528 unmapped: 57843712 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:17.018002+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:18.018219+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:19.018414+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:20.018578+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:21.018789+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:22.019031+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:23.019237+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:24.019425+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211230720 unmapped: 57835520 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:25.019605+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:26.019860+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:27.020023+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:28.020257+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:29.020517+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:30.020712+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:31.020905+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:32.021165+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:33.021314+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211238912 unmapped: 57827328 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:34.021523+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:35.021741+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:36.021907+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:37.022117+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:38.022374+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:39.022551+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:40.022772+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:41.022961+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:42.023202+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:43.023414+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:44.023669+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:45.023834+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:46.024006+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 heartbeat osd_stat(store_statfs(0x4f25dc000/0x0/0x4ffc00000, data 0x21ca4d3/0x229a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:47.024211+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018980 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:48.024504+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211247104 unmapped: 57819136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _renew_subs
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 57.031288147s of 57.246501923s, submitted: 78
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:49.024675+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211271680 unmapped: 57794560 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 145 ms_handle_reset con 0x556f32619c00 session 0x556f35b483c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 145 heartbeat osd_stat(store_statfs(0x4f25de000/0x0/0x4ffc00000, data 0x21cc3db/0x229d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:50.024862+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:51.025048+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:52.025239+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2026892 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:53.025451+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 145 heartbeat osd_stat(store_statfs(0x4f25da000/0x0/0x4ffc00000, data 0x21ce347/0x22a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:54.025686+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 145 heartbeat osd_stat(store_statfs(0x4f25da000/0x0/0x4ffc00000, data 0x21ce347/0x22a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:55.025910+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211296256 unmapped: 57769984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:56.026110+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211296256 unmapped: 57769984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: get_auth_request con 0x556f32374800 auth_method 0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f25da000/0x0/0x4ffc00000, data 0x21ce347/0x22a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:57.026277+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2029650 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:58.026466+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 211288064 unmapped: 57778176 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35102c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35102c00 session 0x556f35e39680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c35c00 session 0x556f32fc5e00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38302400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38302400 session 0x556f32fc41e0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f35608f00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.830822945s of 10.008426666s, submitted: 80
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f35ef1a40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:50:59.026650+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f35e392c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35102c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35102c00 session 0x556f32f55860
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213008384 unmapped: 56057856 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c35c00 session 0x556f35ef0b40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38302400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38302400 session 0x556f3594b680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a6e000/0x0/0x4ffc00000, data 0x2d3816e/0x2e0e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:00.026803+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a6e000/0x0/0x4ffc00000, data 0x2d381a7/0x2e0e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:01.027032+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a6e000/0x0/0x4ffc00000, data 0x2d381a7/0x2e0e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:02.027217+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2131244 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38302400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38302400 session 0x556f35cbaf00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:03.027424+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:04.027727+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:05.027939+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f3580d2c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:06.028185+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 56049664 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35102c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35102c00 session 0x556f34390960
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f3579cf00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:07.028348+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2134979 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213295104 unmapped: 55771136 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a46000/0x0/0x4ffc00000, data 0x2d5f1b7/0x2e36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c43400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:08.028529+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a46000/0x0/0x4ffc00000, data 0x2d5f1b7/0x2e36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:09.028724+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a46000/0x0/0x4ffc00000, data 0x2d5f1b7/0x2e36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:10.028919+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:11.029078+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:12.029269+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f34380800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.129595757s of 13.232490540s, submitted: 27
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f34380800 session 0x556f33e23680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2207837 data_alloc: 234881024 data_used: 13172736
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a46000/0x0/0x4ffc00000, data 0x2d5f1b7/0x2e36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:13.029422+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a46000/0x0/0x4ffc00000, data 0x2d5f1b7/0x2e36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:14.029611+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1a46000/0x0/0x4ffc00000, data 0x2d5f1b7/0x2e36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb37f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:15.029858+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:16.030053+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:17.030202+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2207837 data_alloc: 234881024 data_used: 13172736
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:18.030444+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213303296 unmapped: 55762944 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:19.030643+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 217145344 unmapped: 51920896 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:20.030800+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f03fa000/0x0/0x4ffc00000, data 0x3f8d1b7/0x4064000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 221618176 unmapped: 47448064 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:21.030963+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:22.031169+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364737 data_alloc: 234881024 data_used: 15335424
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:23.031394+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:24.031538+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f0373000/0x0/0x4ffc00000, data 0x401c1b7/0x40f3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:25.031742+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:26.031923+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.050240517s of 14.504535675s, submitted: 178
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:27.032103+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2362633 data_alloc: 234881024 data_used: 15339520
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f035a000/0x0/0x4ffc00000, data 0x403b1b7/0x4112000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:28.032311+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35102c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35102c00 session 0x556f32f48000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:29.032449+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222863360 unmapped: 46202880 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:30.032675+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222871552 unmapped: 46194688 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f035a000/0x0/0x4ffc00000, data 0x403b1b7/0x4112000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:31.032848+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222871552 unmapped: 46194688 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:32.033003+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364107 data_alloc: 234881024 data_used: 15339520
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222871552 unmapped: 46194688 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:33.033144+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f0350000/0x0/0x4ffc00000, data 0x40451b7/0x411c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222871552 unmapped: 46194688 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:34.033346+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222871552 unmapped: 46194688 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:35.033567+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222887936 unmapped: 46178304 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f35b57c20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:36.033746+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222896128 unmapped: 46170112 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f0350000/0x0/0x4ffc00000, data 0x40451b7/0x411c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:37.033920+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364341 data_alloc: 234881024 data_used: 15339520
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222896128 unmapped: 46170112 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38302400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.793055534s of 10.837422371s, submitted: 11
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38302400 session 0x556f33ec2000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38303800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38303800 session 0x556f35c363c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:38.034124+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222912512 unmapped: 46153728 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:39.034318+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222912512 unmapped: 46153728 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:40.034498+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222912512 unmapped: 46153728 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:41.034752+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222912512 unmapped: 46153728 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:42.035006+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364141 data_alloc: 234881024 data_used: 15339520
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222912512 unmapped: 46153728 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:43.035194+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222920704 unmapped: 46145536 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:44.035391+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222928896 unmapped: 46137344 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:45.035592+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222928896 unmapped: 46137344 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:46.035850+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222928896 unmapped: 46137344 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:47.036017+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364141 data_alloc: 234881024 data_used: 15339520
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222928896 unmapped: 46137344 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:48.036223+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222937088 unmapped: 46129152 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:49.036407+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222937088 unmapped: 46129152 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:50.036560+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222937088 unmapped: 46129152 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:51.036776+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222937088 unmapped: 46129152 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.185568810s of 14.251283646s, submitted: 19
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [0,0,0,1])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:52.037008+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364477 data_alloc: 234881024 data_used: 15339520
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222945280 unmapped: 46120960 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:53.037214+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222945280 unmapped: 46120960 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f3594ba40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:54.037430+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222945280 unmapped: 46120960 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:55.037713+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222945280 unmapped: 46120960 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:56.037888+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222953472 unmapped: 46112768 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:57.038071+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364253 data_alloc: 234881024 data_used: 15339520
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222953472 unmapped: 46112768 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:58.038340+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222953472 unmapped: 46112768 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:51:59.038553+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222953472 unmapped: 46112768 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:00.038844+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:01.039076+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:02.039285+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2364253 data_alloc: 234881024 data_used: 15339520
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:03.039388+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:04.039577+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:05.039739+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:06.039983+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222961664 unmapped: 46104576 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:07.040198+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.336604118s of 15.361575127s, submitted: 6
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2366085 data_alloc: 234881024 data_used: 15327232
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:08.040456+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:09.040659+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:10.040862+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:11.041088+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:12.041252+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2366085 data_alloc: 234881024 data_used: 15327232
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:13.041446+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222969856 unmapped: 46096384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:14.041688+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222978048 unmapped: 46088192 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:15.041879+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 222978048 unmapped: 46088192 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f33e22960
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:16.042017+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f35e38780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 223002624 unmapped: 46063616 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f034d000/0x0/0x4ffc00000, data 0x40481b7/0x411f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:17.042209+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2363141 data_alloc: 234881024 data_used: 15327232
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 223002624 unmapped: 46063616 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:18.042455+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.912950516s of 10.993530273s, submitted: 31
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c35c00 session 0x556f34c8e5a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c43400 session 0x556f3660d680
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 223002624 unmapped: 46063616 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35102c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:19.042679+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35102c00 session 0x556f3518d4a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214646784 unmapped: 54419456 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:20.042927+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214646784 unmapped: 54419456 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f21c7000/0x0/0x4ffc00000, data 0x21d0135/0x22a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:21.043101+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214646784 unmapped: 54419456 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:22.043337+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2052178 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214646784 unmapped: 54419456 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:02 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:02 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:04:02.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:23.043479+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214646784 unmapped: 54419456 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:24.043734+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214646784 unmapped: 54419456 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f3595b860
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:25.043885+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f35ef0960
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:26.044096+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f21c9000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:27.044397+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2051182 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:28.044606+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:29.044760+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:30.044923+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:31.045091+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:32.045296+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f21c9000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2051182 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:33.045487+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:34.045712+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:35.045913+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:36.046131+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:37.046313+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2051182 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f21c9000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:38.046537+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:39.046763+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:40.046964+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f21c9000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:41.047154+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f21c9000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 214654976 unmapped: 54411264 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:42.047338+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 24.189771652s of 24.371664047s, submitted: 64
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c35c00 session 0x556f3662ab40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2138052 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:43.047516+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1684000/0x0/0x4ffc00000, data 0x2d15125/0x2de8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:44.047699+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:45.047920+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:46.048121+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1684000/0x0/0x4ffc00000, data 0x2d15125/0x2de8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:47.048314+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c43400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c43400 session 0x556f32f46000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2138052 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:48.048585+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1684000/0x0/0x4ffc00000, data 0x2d15125/0x2de8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:49.048815+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:50.049043+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38302400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38302400 session 0x556f35ef0780
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:51.049250+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1684000/0x0/0x4ffc00000, data 0x2d15125/0x2de8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1684000/0x0/0x4ffc00000, data 0x2d15125/0x2de8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f3662a000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215228416 unmapped: 53837824 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f35b56f00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:52.049576+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c35c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f35c43400
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.939822197s of 10.048395157s, submitted: 25
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2140807 data_alloc: 218103808 data_used: 2723840
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:53.049741+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1683000/0x0/0x4ffc00000, data 0x2d15135/0x2de9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:54.049859+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:55.049982+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:56.050108+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f38303800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f38303800 session 0x556f33c9f2c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:57.050266+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2214633 data_alloc: 234881024 data_used: 13324288
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:58.050477+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1683000/0x0/0x4ffc00000, data 0x2d15135/0x2de9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:52:59.050668+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1683000/0x0/0x4ffc00000, data 0x2d15135/0x2de9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xb78f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215187456 unmapped: 53878784 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:00.050848+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215195648 unmapped: 53870592 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:01.051020+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215195648 unmapped: 53870592 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:02.051177+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2214633 data_alloc: 234881024 data_used: 13324288
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215195648 unmapped: 53870592 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33efb800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:03.051315+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.502856255s of 10.511325836s, submitted: 2
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 215195648 unmapped: 53870592 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:04.051443+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219193344 unmapped: 49872896 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:05.051592+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f19a9000/0x0/0x4ffc00000, data 0x3a2f135/0x3b03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 219193344 unmapped: 49872896 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:06.051839+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:07.052031+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2321329 data_alloc: 234881024 data_used: 14573568
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:08.052261+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:09.052454+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:10.052736+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:11.052915+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:12.053127+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2321329 data_alloc: 234881024 data_used: 14573568
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:13.053333+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:14.053500+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:15.053672+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:16.053853+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:17.054027+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2321329 data_alloc: 234881024 data_used: 14573568
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:18.054284+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:19.054476+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220086272 unmapped: 48979968 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:20.054694+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220094464 unmapped: 48971776 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:21.054863+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220094464 unmapped: 48971776 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:22.055029+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2321329 data_alloc: 234881024 data_used: 14573568
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220094464 unmapped: 48971776 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:23.055225+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220094464 unmapped: 48971776 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:24.055411+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220094464 unmapped: 48971776 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:25.055581+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220094464 unmapped: 48971776 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:26.055728+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.446918488s of 23.703081131s, submitted: 110
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f33efb800 session 0x556f35ef1a40
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:27.055944+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2321197 data_alloc: 234881024 data_used: 14573568
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:28.056155+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:29.056352+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:30.056520+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:31.056666+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:32.056817+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2321197 data_alloc: 234881024 data_used: 14573568
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220102656 unmapped: 48963584 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:33.056966+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1992000/0x0/0x4ffc00000, data 0x3a45135/0x3b19000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220110848 unmapped: 48955392 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:34.057114+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220119040 unmapped: 48947200 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:35.057329+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 220119040 unmapped: 48947200 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:36.057570+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.092041016s of 10.097191811s, submitted: 1
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218873856 unmapped: 50192384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:37.057773+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2318125 data_alloc: 234881024 data_used: 14573568
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218873856 unmapped: 50192384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:38.058039+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1991000/0x0/0x4ffc00000, data 0x3a46135/0x3b1a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218873856 unmapped: 50192384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:39.058262+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218873856 unmapped: 50192384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:40.058386+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218873856 unmapped: 50192384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:41.058600+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1991000/0x0/0x4ffc00000, data 0x3a46135/0x3b1a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218873856 unmapped: 50192384 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:42.058881+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2318125 data_alloc: 234881024 data_used: 14573568
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218882048 unmapped: 50184192 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:43.059046+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218882048 unmapped: 50184192 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:44.059229+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218890240 unmapped: 50176000 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:45.059395+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1991000/0x0/0x4ffc00000, data 0x3a46135/0x3b1a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:46.059595+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218890240 unmapped: 50176000 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:47.059823+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218890240 unmapped: 50176000 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f1991000/0x0/0x4ffc00000, data 0x3a46135/0x3b1a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2318125 data_alloc: 234881024 data_used: 14573568
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:48.060020+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218890240 unmapped: 50176000 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3509ec00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.590891838s of 11.599750519s, submitted: 3
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3509ec00 session 0x556f3662b860
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32619c00 session 0x556f343914a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:49.060167+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218906624 unmapped: 50159616 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:50.060366+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218906624 unmapped: 50159616 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c35c00 session 0x556f32f4ed20
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f35c43400 session 0x556f342f9860
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:51.060534+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 218906624 unmapped: 50159616 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33efb800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3208000/0x0/0x4ffc00000, data 0x21d0135/0x22a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f33efb800 session 0x556f3660c960
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:52.060714+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:53.060898+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:54.061117+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:55.061312+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:56.061467+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:57.061741+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:58.061966+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:53:59.062182+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.613150597s of 10.763535500s, submitted: 47
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f3580f000 session 0x556f36632f00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:00.062424+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:01.062669+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:02.062841+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:03.063005+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:04.063230+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:05.063459+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:06.063702+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:07.063949+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:08.064179+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:09.064383+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:10.064533+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:11.064746+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:12.064906+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:13.065059+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:14.065289+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:15.065525+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213344256 unmapped: 55721984 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:16.065701+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:17.065857+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:18.066045+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:19.066215+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:20.066339+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:21.066524+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:22.066696+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:23.066842+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213352448 unmapped: 55713792 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:24.067011+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:25.067118+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:26.067324+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:27.067498+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:28.067696+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:29.067866+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:30.068065+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 55705600 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:31.068263+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213368832 unmapped: 55697408 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:32.068433+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213368832 unmapped: 55697408 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32d01c00 session 0x556f32f545a0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f3580f000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f32529000 session 0x556f35b48000
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f32619c00
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:33.068598+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213368832 unmapped: 55697408 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:34.068824+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 ms_handle_reset con 0x556f33d37400 session 0x556f3595b2c0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: handle_auth_request added challenge on 0x556f33efb800
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213368832 unmapped: 55697408 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:35.069015+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213368832 unmapped: 55697408 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:36.069217+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213368832 unmapped: 55697408 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:37.069404+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:38.069757+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:39.070002+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:40.070174+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:41.070251+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:42.070428+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:43.070576+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213377024 unmapped: 55689216 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:44.070773+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213385216 unmapped: 55681024 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:45.071038+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213385216 unmapped: 55681024 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:46.071227+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213385216 unmapped: 55681024 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:47.071395+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213385216 unmapped: 55681024 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:48.071613+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213385216 unmapped: 55681024 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:49.071834+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213385216 unmapped: 55681024 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:50.072025+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:51.072224+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:52.072466+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:53.072692+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:54.072923+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:55.073118+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:56.073338+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:57.073532+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213393408 unmapped: 55672832 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:58.073749+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213401600 unmapped: 55664640 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:54:59.073913+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213401600 unmapped: 55664640 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:00.074081+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213401600 unmapped: 55664640 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:01.074284+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213401600 unmapped: 55664640 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:02.074506+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213401600 unmapped: 55664640 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:03.074706+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213409792 unmapped: 55656448 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:04.074884+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213409792 unmapped: 55656448 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:05.075080+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213409792 unmapped: 55656448 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:06.075308+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213417984 unmapped: 55648256 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:07.075472+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213417984 unmapped: 55648256 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:08.075722+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213417984 unmapped: 55648256 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:09.075889+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213417984 unmapped: 55648256 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:10.076062+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213417984 unmapped: 55648256 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:11.076221+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'config diff' '{prefix=config diff}'
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'config show' '{prefix=config show}'
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213696512 unmapped: 55369728 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'counter dump' '{prefix=counter dump}'
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'counter schema' '{prefix=counter schema}'
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:12.076367+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212942848 unmapped: 56123392 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:13.076500+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212967424 unmapped: 56098816 heap: 269066240 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'log dump' '{prefix=log dump}'
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:14.076696+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212967424 unmapped: 67141632 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'perf dump' '{prefix=perf dump}'
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'perf schema' '{prefix=perf schema}'
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:15.076864+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212934656 unmapped: 67174400 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:16.077019+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212934656 unmapped: 67174400 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:17.077186+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212934656 unmapped: 67174400 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:18.077358+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212934656 unmapped: 67174400 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:19.077489+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212934656 unmapped: 67174400 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:20.077644+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212942848 unmapped: 67166208 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:21.077754+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212942848 unmapped: 67166208 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:22.077917+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212951040 unmapped: 67158016 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:23.078075+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212951040 unmapped: 67158016 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:24.078254+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212951040 unmapped: 67158016 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:25.078395+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212951040 unmapped: 67158016 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:26.078553+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212951040 unmapped: 67158016 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:27.078694+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212951040 unmapped: 67158016 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:28.078855+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212959232 unmapped: 67149824 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:29.079009+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212959232 unmapped: 67149824 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:30.079137+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212959232 unmapped: 67149824 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:31.079309+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212959232 unmapped: 67149824 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:32.079461+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212959232 unmapped: 67149824 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:33.079658+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212959232 unmapped: 67149824 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:34.079785+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212959232 unmapped: 67149824 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:35.079943+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212959232 unmapped: 67149824 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:36.080088+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212967424 unmapped: 67141632 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:37.080241+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212967424 unmapped: 67141632 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:38.080421+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212967424 unmapped: 67141632 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:39.080598+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212967424 unmapped: 67141632 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:40.080780+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212967424 unmapped: 67141632 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:41.080930+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212967424 unmapped: 67141632 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:42.081111+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212967424 unmapped: 67141632 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:43.081254+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212967424 unmapped: 67141632 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:44.081392+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212967424 unmapped: 67141632 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:45.081556+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212975616 unmapped: 67133440 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:46.081704+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212975616 unmapped: 67133440 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:47.081866+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212975616 unmapped: 67133440 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:48.082073+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212975616 unmapped: 67133440 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:49.082177+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212975616 unmapped: 67133440 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:50.082309+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212975616 unmapped: 67133440 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:51.082475+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212983808 unmapped: 67125248 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:52.082689+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212983808 unmapped: 67125248 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:53.082868+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212983808 unmapped: 67125248 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:54.083015+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212992000 unmapped: 67117056 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:55.083174+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212992000 unmapped: 67117056 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:56.083399+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212992000 unmapped: 67117056 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:57.083656+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212992000 unmapped: 67117056 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:58.083873+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212992000 unmapped: 67117056 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:55:59.084080+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212992000 unmapped: 67117056 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:00.084244+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212992000 unmapped: 67117056 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:01.084407+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 212992000 unmapped: 67117056 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:02.084674+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213000192 unmapped: 67108864 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:03.084875+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213000192 unmapped: 67108864 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:04.085073+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213000192 unmapped: 67108864 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:05.085274+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213000192 unmapped: 67108864 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:06.085490+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213000192 unmapped: 67108864 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:07.085691+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213008384 unmapped: 67100672 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:08.085937+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213008384 unmapped: 67100672 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:09.086110+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213008384 unmapped: 67100672 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:10.086247+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213008384 unmapped: 67100672 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:11.086412+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213008384 unmapped: 67100672 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:12.086613+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213008384 unmapped: 67100672 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:13.086867+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 67092480 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:14.087112+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 67092480 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:15.087279+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 67092480 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:16.087506+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 67092480 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:17.087723+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 67092480 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:18.087914+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213016576 unmapped: 67092480 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:19.088128+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213024768 unmapped: 67084288 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:20.088323+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213024768 unmapped: 67084288 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:21.088522+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213024768 unmapped: 67084288 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:22.088722+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213024768 unmapped: 67084288 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:23.088924+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213024768 unmapped: 67084288 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:24.089094+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213024768 unmapped: 67084288 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:25.089276+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213032960 unmapped: 67076096 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:26.089449+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213032960 unmapped: 67076096 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:27.089611+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213032960 unmapped: 67076096 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:28.089910+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213032960 unmapped: 67076096 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:29.090118+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213032960 unmapped: 67076096 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:30.090350+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213032960 unmapped: 67076096 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:31.090543+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213032960 unmapped: 67076096 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:32.090760+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213041152 unmapped: 67067904 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:33.090977+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213049344 unmapped: 67059712 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:34.091161+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213049344 unmapped: 67059712 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:35.091346+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213049344 unmapped: 67059712 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:36.091542+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213049344 unmapped: 67059712 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:37.091749+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213049344 unmapped: 67059712 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:38.091995+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213057536 unmapped: 67051520 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:39.092144+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213057536 unmapped: 67051520 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:40.092326+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213057536 unmapped: 67051520 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:41.092511+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213057536 unmapped: 67051520 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:42.092675+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213057536 unmapped: 67051520 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:43.092840+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213057536 unmapped: 67051520 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:44.092993+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213057536 unmapped: 67051520 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:45.093147+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213057536 unmapped: 67051520 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:46.093301+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213057536 unmapped: 67051520 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:47.093467+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213057536 unmapped: 67051520 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:48.093693+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213065728 unmapped: 67043328 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:49.094232+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213073920 unmapped: 67035136 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:50.094531+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213073920 unmapped: 67035136 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:51.094724+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213073920 unmapped: 67035136 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:52.094893+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213073920 unmapped: 67035136 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:53.095054+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213073920 unmapped: 67035136 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:54.095212+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213082112 unmapped: 67026944 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:55.095350+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213082112 unmapped: 67026944 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:56.095580+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213082112 unmapped: 67026944 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:57.095781+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213082112 unmapped: 67026944 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:58.095964+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213082112 unmapped: 67026944 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:56:59.096175+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213082112 unmapped: 67026944 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:00.096384+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213082112 unmapped: 67026944 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:01.096541+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213082112 unmapped: 67026944 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:02.096740+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213082112 unmapped: 67026944 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:03.096986+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213082112 unmapped: 67026944 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:04.097132+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213082112 unmapped: 67026944 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:05.097333+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213082112 unmapped: 67026944 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:06.097548+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213090304 unmapped: 67018752 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:07.097718+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213090304 unmapped: 67018752 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:08.097933+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213090304 unmapped: 67018752 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:09.098108+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213090304 unmapped: 67018752 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:10.098330+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213090304 unmapped: 67018752 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:11.098524+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213090304 unmapped: 67018752 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:12.098676+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213090304 unmapped: 67018752 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:13.098916+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213090304 unmapped: 67018752 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:14.099116+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213090304 unmapped: 67018752 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:15.099327+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213090304 unmapped: 67018752 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:16.099531+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213090304 unmapped: 67018752 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:17.099720+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213098496 unmapped: 67010560 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:18.099888+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213106688 unmapped: 67002368 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:19.100054+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213106688 unmapped: 67002368 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:20.100218+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213106688 unmapped: 67002368 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:21.100415+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213106688 unmapped: 67002368 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:22.100679+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213106688 unmapped: 67002368 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:23.101659+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213106688 unmapped: 67002368 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:24.101899+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213106688 unmapped: 67002368 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:25.102058+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213106688 unmapped: 67002368 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:26.102266+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213106688 unmapped: 67002368 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:27.102400+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213106688 unmapped: 67002368 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:28.102588+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213114880 unmapped: 66994176 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:29.102764+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213114880 unmapped: 66994176 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:30.102916+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213114880 unmapped: 66994176 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:31.103081+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213114880 unmapped: 66994176 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:32.103236+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213114880 unmapped: 66994176 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:33.103388+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213114880 unmapped: 66994176 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:34.103587+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213123072 unmapped: 66985984 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:35.103855+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213123072 unmapped: 66985984 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:36.104006+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213123072 unmapped: 66985984 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:37.104200+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213123072 unmapped: 66985984 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:38.104415+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213123072 unmapped: 66985984 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:39.104570+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213123072 unmapped: 66985984 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:40.104745+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213123072 unmapped: 66985984 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:41.104957+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213123072 unmapped: 66985984 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:42.105172+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213123072 unmapped: 66985984 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:43.105335+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213131264 unmapped: 66977792 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:44.105593+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213131264 unmapped: 66977792 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:45.105848+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213139456 unmapped: 66969600 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:46.106025+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213139456 unmapped: 66969600 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:47.106239+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213139456 unmapped: 66969600 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:48.106501+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213139456 unmapped: 66969600 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:49.106695+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213139456 unmapped: 66969600 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:50.106872+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213147648 unmapped: 66961408 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:51.107067+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213147648 unmapped: 66961408 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:52.107290+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 4800.1 total, 600.0 interval
                                           Cumulative writes: 24K writes, 95K keys, 24K commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.02 MB/s
                                           Cumulative WAL: 24K writes, 8145 syncs, 3.04 writes per sync, written: 0.09 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2237 writes, 8429 keys, 2237 commit groups, 1.0 writes per commit group, ingest: 9.64 MB, 0.02 MB/s
                                           Interval WAL: 2237 writes, 879 syncs, 2.54 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213147648 unmapped: 66961408 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:53.107513+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213147648 unmapped: 66961408 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:54.107710+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213147648 unmapped: 66961408 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:55.107906+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213147648 unmapped: 66961408 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:56.108167+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213147648 unmapped: 66961408 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:57.108383+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213147648 unmapped: 66961408 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:58.108825+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213147648 unmapped: 66961408 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:57:59.109215+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213147648 unmapped: 66961408 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:00.109556+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213147648 unmapped: 66961408 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:01.109872+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213155840 unmapped: 66953216 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:02.110084+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213155840 unmapped: 66953216 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:03.110325+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213155840 unmapped: 66953216 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:04.111289+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213155840 unmapped: 66953216 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:05.111898+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213155840 unmapped: 66953216 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:06.112408+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213155840 unmapped: 66953216 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:07.112800+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213155840 unmapped: 66953216 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:08.113216+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213155840 unmapped: 66953216 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:09.113549+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213164032 unmapped: 66945024 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:10.113858+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213172224 unmapped: 66936832 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:11.114104+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213172224 unmapped: 66936832 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:12.114678+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213172224 unmapped: 66936832 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:13.114920+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:14.115236+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213172224 unmapped: 66936832 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:15.115503+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213172224 unmapped: 66936832 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:16.115731+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213172224 unmapped: 66936832 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:17.115943+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213172224 unmapped: 66936832 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:18.116224+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213172224 unmapped: 66936832 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:19.116390+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213180416 unmapped: 66928640 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:20.116576+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213180416 unmapped: 66928640 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:21.116743+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213180416 unmapped: 66928640 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:22.116913+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213180416 unmapped: 66928640 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:23.117113+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213180416 unmapped: 66928640 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:24.117299+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213180416 unmapped: 66928640 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:25.117471+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213180416 unmapped: 66928640 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:26.117717+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213188608 unmapped: 66920448 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:27.118441+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213196800 unmapped: 66912256 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:28.119237+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213196800 unmapped: 66912256 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:29.119511+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213196800 unmapped: 66912256 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:30.119802+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213196800 unmapped: 66912256 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:31.120397+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213196800 unmapped: 66912256 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:32.120869+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213196800 unmapped: 66912256 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:33.121239+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213196800 unmapped: 66912256 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:34.121581+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213204992 unmapped: 66904064 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:35.121901+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213204992 unmapped: 66904064 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:36.122131+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213204992 unmapped: 66904064 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:37.122531+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213204992 unmapped: 66904064 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:38.122912+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213204992 unmapped: 66904064 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:39.123155+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213204992 unmapped: 66904064 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:40.123308+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213204992 unmapped: 66904064 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:41.123779+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213204992 unmapped: 66904064 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:42.124057+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213213184 unmapped: 66895872 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:43.124326+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213213184 unmapped: 66895872 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:44.124525+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213213184 unmapped: 66895872 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:45.125118+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213213184 unmapped: 66895872 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:46.125440+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213213184 unmapped: 66895872 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:47.125749+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213213184 unmapped: 66895872 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:48.126054+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213213184 unmapped: 66895872 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:49.126221+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213213184 unmapped: 66895872 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:50.126463+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213213184 unmapped: 66895872 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:51.126706+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213213184 unmapped: 66895872 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:52.126922+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213213184 unmapped: 66895872 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:53.127152+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213221376 unmapped: 66887680 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:54.127441+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213221376 unmapped: 66887680 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:55.127935+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213221376 unmapped: 66887680 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:56.128167+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213221376 unmapped: 66887680 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:57.128369+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213221376 unmapped: 66887680 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:58.128733+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213229568 unmapped: 66879488 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:58:59.129081+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:00.129352+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:01.129734+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:02.130714+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:03.130875+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:04.131174+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:05.131395+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:06.131781+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:07.132265+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:08.132558+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:09.132758+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:10.132910+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:11.133052+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:12.133263+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:13.133463+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213237760 unmapped: 66871296 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:14.133723+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213245952 unmapped: 66863104 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:15.133898+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213245952 unmapped: 66863104 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:16.134092+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213245952 unmapped: 66863104 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:17.134286+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213245952 unmapped: 66863104 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:18.134527+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213254144 unmapped: 66854912 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:19.134718+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213254144 unmapped: 66854912 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:20.134895+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213262336 unmapped: 66846720 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:21.135058+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213262336 unmapped: 66846720 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:22.135295+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213262336 unmapped: 66846720 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:23.135479+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213262336 unmapped: 66846720 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:24.135696+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213262336 unmapped: 66846720 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:25.135903+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213262336 unmapped: 66846720 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:26.136131+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213262336 unmapped: 66846720 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:27.136278+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213262336 unmapped: 66846720 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:28.136461+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213270528 unmapped: 66838528 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:29.136698+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213270528 unmapped: 66838528 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:30.136943+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213270528 unmapped: 66838528 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:31.137132+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213270528 unmapped: 66838528 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:32.137303+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213270528 unmapped: 66838528 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:33.137514+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213270528 unmapped: 66838528 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:34.137701+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213278720 unmapped: 66830336 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:35.137918+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213278720 unmapped: 66830336 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:36.139516+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213286912 unmapped: 66822144 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:37.139707+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213286912 unmapped: 66822144 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:38.140027+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213286912 unmapped: 66822144 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:39.140934+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213286912 unmapped: 66822144 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:40.141195+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213286912 unmapped: 66822144 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:41.141371+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 342.442565918s of 342.446807861s, submitted: 1
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213311488 unmapped: 66797568 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:42.141584+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 66748416 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:43.141880+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213360640 unmapped: 66748416 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:44.142750+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213417984 unmapped: 66691072 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:45.142929+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213508096 unmapped: 66600960 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:46.143117+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:47.143399+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:48.143726+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:49.144005+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:50.144202+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:51.144455+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:52.144747+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:53.145003+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:54.145317+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:55.145559+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:56.145862+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:57.146178+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:58.146554+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T18:59:59.146832+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:00.147031+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:01.147359+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213540864 unmapped: 66568192 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:02.148136+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213549056 unmapped: 66560000 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:03.148407+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213549056 unmapped: 66560000 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:04.148764+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213549056 unmapped: 66560000 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:05.148953+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213549056 unmapped: 66560000 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:06.149162+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213549056 unmapped: 66560000 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:07.149710+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213549056 unmapped: 66560000 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:08.150484+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213549056 unmapped: 66560000 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:09.150938+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213549056 unmapped: 66560000 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:10.151564+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213557248 unmapped: 66551808 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:11.152058+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213557248 unmapped: 66551808 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:12.152478+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213557248 unmapped: 66551808 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:13.152693+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213557248 unmapped: 66551808 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:14.153026+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213565440 unmapped: 66543616 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:15.153393+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213565440 unmapped: 66543616 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:16.153696+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213565440 unmapped: 66543616 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:17.153903+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213565440 unmapped: 66543616 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:18.154162+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213565440 unmapped: 66543616 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:19.154361+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213565440 unmapped: 66543616 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:20.154651+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213565440 unmapped: 66543616 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:21.154897+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213565440 unmapped: 66543616 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:22.155117+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213565440 unmapped: 66543616 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:23.155321+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213565440 unmapped: 66543616 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:24.155599+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:25.155951+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:26.156118+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:27.156361+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:28.156601+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:29.156798+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:30.156999+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:31.157206+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:32.157416+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:33.157602+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:34.157893+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:35.158090+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:36.158278+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:37.158493+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:38.158771+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:39.158939+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:40.159140+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:41.159326+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:42.159475+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:43.159697+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:44.159843+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:45.160037+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:46.160270+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:47.160535+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:48.160767+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:49.160993+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:50.161174+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:51.161365+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:52.161581+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:53.161807+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213573632 unmapped: 66535424 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:54.162024+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213581824 unmapped: 66527232 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:55.162307+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213581824 unmapped: 66527232 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:56.162526+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213581824 unmapped: 66527232 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:57.162761+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213581824 unmapped: 66527232 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:58.163021+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213581824 unmapped: 66527232 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:00:59.163246+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213581824 unmapped: 66527232 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:00.163410+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213581824 unmapped: 66527232 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:01.163713+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213581824 unmapped: 66527232 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:02.163891+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213590016 unmapped: 66519040 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:03.164140+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213590016 unmapped: 66519040 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:04.164354+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213590016 unmapped: 66519040 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:05.164849+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213590016 unmapped: 66519040 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:06.165162+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213590016 unmapped: 66519040 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:07.165418+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213590016 unmapped: 66519040 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:08.165736+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213590016 unmapped: 66519040 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:09.165963+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213590016 unmapped: 66519040 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:10.166135+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213598208 unmapped: 66510848 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:11.166300+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213598208 unmapped: 66510848 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:12.166530+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213598208 unmapped: 66510848 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:13.166726+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213598208 unmapped: 66510848 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:14.166880+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213598208 unmapped: 66510848 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:15.167062+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:16.167210+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213598208 unmapped: 66510848 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:17.167351+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213598208 unmapped: 66510848 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:18.167576+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213598208 unmapped: 66510848 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:19.167780+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213606400 unmapped: 66502656 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:20.167964+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213606400 unmapped: 66502656 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:21.168144+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213606400 unmapped: 66502656 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:22.168349+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213606400 unmapped: 66502656 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:23.168526+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213606400 unmapped: 66502656 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:24.168735+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213614592 unmapped: 66494464 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:25.168955+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213614592 unmapped: 66494464 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:26.169169+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213614592 unmapped: 66494464 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:27.169361+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213614592 unmapped: 66494464 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:28.169599+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213614592 unmapped: 66494464 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:29.169873+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213614592 unmapped: 66494464 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:30.170049+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213614592 unmapped: 66494464 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:31.170229+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213614592 unmapped: 66494464 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:32.170465+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213614592 unmapped: 66494464 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:33.170719+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213614592 unmapped: 66494464 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:34.170942+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213622784 unmapped: 66486272 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:35.171178+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213622784 unmapped: 66486272 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:36.171478+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213622784 unmapped: 66486272 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:37.171716+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213622784 unmapped: 66486272 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:38.171957+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213622784 unmapped: 66486272 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:39.172173+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213622784 unmapped: 66486272 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:40.172339+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213622784 unmapped: 66486272 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:41.172546+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213630976 unmapped: 66478080 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:42.172727+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213630976 unmapped: 66478080 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:43.172912+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213630976 unmapped: 66478080 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:44.173082+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213630976 unmapped: 66478080 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:45.173297+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213630976 unmapped: 66478080 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:46.173562+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213630976 unmapped: 66478080 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:47.173764+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213630976 unmapped: 66478080 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:48.174106+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213630976 unmapped: 66478080 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:49.174374+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213639168 unmapped: 66469888 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:50.174608+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213639168 unmapped: 66469888 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:51.174905+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213639168 unmapped: 66469888 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:52.175097+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213639168 unmapped: 66469888 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:53.175324+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213639168 unmapped: 66469888 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:54.175599+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213639168 unmapped: 66469888 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:55.175918+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213647360 unmapped: 66461696 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:56.176148+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213647360 unmapped: 66461696 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:57.176419+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213647360 unmapped: 66461696 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:58.176679+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213647360 unmapped: 66461696 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:01:59.176872+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213647360 unmapped: 66461696 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:00.177032+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213647360 unmapped: 66461696 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:01.177277+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213647360 unmapped: 66461696 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:02.177506+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213647360 unmapped: 66461696 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:03.177723+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213647360 unmapped: 66461696 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:04.177908+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213647360 unmapped: 66461696 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:05.178131+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213647360 unmapped: 66461696 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:06.178334+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213647360 unmapped: 66461696 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:07.178499+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213655552 unmapped: 66453504 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:08.178748+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213655552 unmapped: 66453504 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:09.178901+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213655552 unmapped: 66453504 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:10.179283+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213655552 unmapped: 66453504 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:11.179458+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213655552 unmapped: 66453504 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:12.179756+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213655552 unmapped: 66453504 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:13.179928+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213655552 unmapped: 66453504 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:14.180146+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213655552 unmapped: 66453504 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:15.180313+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213663744 unmapped: 66445312 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:16.182292+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213663744 unmapped: 66445312 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:17.182943+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213663744 unmapped: 66445312 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:18.184161+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213663744 unmapped: 66445312 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:19.184684+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213663744 unmapped: 66445312 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:20.185015+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213663744 unmapped: 66445312 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:21.185305+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213671936 unmapped: 66437120 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:22.185663+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213671936 unmapped: 66437120 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:23.185857+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213671936 unmapped: 66437120 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:24.186015+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213671936 unmapped: 66437120 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:25.186272+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213671936 unmapped: 66437120 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:26.186536+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213671936 unmapped: 66437120 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:27.186788+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213671936 unmapped: 66437120 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:28.187032+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213680128 unmapped: 66428928 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:29.187264+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213680128 unmapped: 66428928 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:30.187607+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213680128 unmapped: 66428928 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:31.188038+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213680128 unmapped: 66428928 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:32.188219+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213680128 unmapped: 66428928 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:33.188415+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213680128 unmapped: 66428928 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:34.188603+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213680128 unmapped: 66428928 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:35.188887+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213680128 unmapped: 66428928 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:36.189057+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213680128 unmapped: 66428928 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:37.189218+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213680128 unmapped: 66428928 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:38.189434+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213696512 unmapped: 66412544 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:39.189686+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213696512 unmapped: 66412544 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:40.189883+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213696512 unmapped: 66412544 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:41.189990+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213696512 unmapped: 66412544 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:42.190246+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213696512 unmapped: 66412544 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:43.190471+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213696512 unmapped: 66412544 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:44.190736+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213696512 unmapped: 66412544 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:45.191047+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213696512 unmapped: 66412544 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:46.192113+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213704704 unmapped: 66404352 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:47.192428+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213704704 unmapped: 66404352 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:48.192690+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213704704 unmapped: 66404352 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:49.192869+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213704704 unmapped: 66404352 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:50.193041+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213704704 unmapped: 66404352 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:51.193268+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213704704 unmapped: 66404352 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:52.193483+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213704704 unmapped: 66404352 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:53.193736+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213704704 unmapped: 66404352 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:54.193945+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213712896 unmapped: 66396160 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:55.194133+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213712896 unmapped: 66396160 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:56.194250+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213712896 unmapped: 66396160 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:57.194457+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213712896 unmapped: 66396160 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:58.194721+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213721088 unmapped: 66387968 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:02:59.194883+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213721088 unmapped: 66387968 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:00.195156+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213721088 unmapped: 66387968 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:01.195361+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213721088 unmapped: 66387968 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:02.195595+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213721088 unmapped: 66387968 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:03.195960+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213721088 unmapped: 66387968 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:04.196113+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213721088 unmapped: 66387968 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:05.196303+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213721088 unmapped: 66387968 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:06.196502+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213721088 unmapped: 66387968 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:07.196701+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213721088 unmapped: 66387968 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:08.196963+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213721088 unmapped: 66387968 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:09.197124+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213721088 unmapped: 66387968 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:10.197306+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213729280 unmapped: 66379776 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:11.197544+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213729280 unmapped: 66379776 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:12.197709+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213729280 unmapped: 66379776 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:13.197890+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213729280 unmapped: 66379776 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:14.198055+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213729280 unmapped: 66379776 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:15.198246+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213729280 unmapped: 66379776 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:16.198430+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213729280 unmapped: 66379776 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:17.198723+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213729280 unmapped: 66379776 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:18.199016+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213737472 unmapped: 66371584 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:19.199222+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213737472 unmapped: 66371584 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:20.199564+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213737472 unmapped: 66371584 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:21.199727+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213737472 unmapped: 66371584 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:22.200159+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213737472 unmapped: 66371584 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:23.200817+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213737472 unmapped: 66371584 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:24.201349+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213737472 unmapped: 66371584 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _send_mon_message to mon.compute-1 at v2:192.168.122.101:3300/0
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:25.201657+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213737472 unmapped: 66371584 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:26.201842+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213737472 unmapped: 66371584 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:27.202007+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213737472 unmapped: 66371584 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:28.202193+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213745664 unmapped: 66363392 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:29.202337+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213762048 unmapped: 66347008 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'config diff' '{prefix=config diff}'
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:30.202497+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: osd.1 146 heartbeat osd_stat(store_statfs(0x4f3209000/0x0/0x4ffc00000, data 0x21d0125/0x22a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0xa74f9c5), peers [0] op hist [])
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'config show' '{prefix=config show}'
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'counter dump' '{prefix=counter dump}'
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'counter schema' '{prefix=counter schema}'
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Sep 30 19:04:02 compute-1 ceph-osd[78006]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Sep 30 19:04:02 compute-1 ceph-osd[78006]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2065881 data_alloc: 218103808 data_used: 2719744
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213401600 unmapped: 66707456 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:31.202665+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: prioritycache tune_memory target: 4294967296 mapped: 213770240 unmapped: 66338816 heap: 280109056 old mem: 2845415832 new mem: 2845415832
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: tick
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_tickets
Sep 30 19:04:02 compute-1 ceph-osd[78006]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-09-30T19:03:32.202851+0000)
Sep 30 19:04:02 compute-1 ceph-osd[78006]: do_command 'log dump' '{prefix=log dump}'
Sep 30 19:04:03 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:03 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:03 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:04:03.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:03 compute-1 ceph-mon[75484]: from='client.28143 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2916086542' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Sep 30 19:04:03 compute-1 ceph-mon[75484]: from='client.19942 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:03 compute-1 ceph-mon[75484]: pgmap v2580: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:04:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2996930861' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Sep 30 19:04:03 compute-1 ceph-mon[75484]: from='client.28151 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:03 compute-1 ceph-mon[75484]: from='client.19950 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:03 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/209043629' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Sep 30 19:04:03 compute-1 ceph-mon[75484]: from='client.28159 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:03 compute-1 ceph-mon[75484]: from='client.19954 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr services"} v 0)
Sep 30 19:04:03 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/244187955' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Sep 30 19:04:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:03 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:03 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:03 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:03 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr versions"} v 0)
Sep 30 19:04:03 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1004522143' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Sep 30 19:04:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/914316547' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Sep 30 19:04:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/244187955' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Sep 30 19:04:04 compute-1 ceph-mon[75484]: from='client.19964 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:04 compute-1 ceph-mon[75484]: from='client.28169 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2935998326' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Sep 30 19:04:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1004522143' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Sep 30 19:04:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/400747515' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Sep 30 19:04:04 compute-1 ceph-mon[75484]: from='client.28177 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:04 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1099182053' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Sep 30 19:04:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon stat"} v 0)
Sep 30 19:04:04 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3676917808' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Sep 30 19:04:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:04 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:04 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:04 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:04 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:04:04 compute-1 crontab[323537]: (root) LIST (root)
Sep 30 19:04:04 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:04 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:04 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:04:04.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:05 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:05 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:05 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:04:05.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:05 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "node ls"} v 0)
Sep 30 19:04:05 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/280760085' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Sep 30 19:04:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3676917808' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Sep 30 19:04:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1857942967' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Sep 30 19:04:05 compute-1 ceph-mon[75484]: from='client.28185 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2065453627' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Sep 30 19:04:05 compute-1 ceph-mon[75484]: pgmap v2581: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:04:05 compute-1 ceph-mon[75484]: from='client.28193 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1987872530' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Sep 30 19:04:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1862213668' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Sep 30 19:04:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1109097292' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Sep 30 19:04:05 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/280760085' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Sep 30 19:04:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:05 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:05 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:05 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:05 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Sep 30 19:04:05 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2196663918' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Sep 30 19:04:05 compute-1 podman[249638]: time="2025-09-30T19:04:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 19:04:05 compute-1 podman[249638]: @ - - [30/Sep/2025:19:04:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 36769 "" "Go-http-client/1.1"
Sep 30 19:04:05 compute-1 podman[249638]: @ - - [30/Sep/2025:19:04:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 8368 "" "Go-http-client/1.1"
Sep 30 19:04:06 compute-1 ceph-mon[75484]: from='client.28201 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2060393709' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Sep 30 19:04:06 compute-1 ceph-mon[75484]: from='client.28207 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2196663918' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Sep 30 19:04:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2067289253' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Sep 30 19:04:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1866048963' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Sep 30 19:04:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/783253440' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Sep 30 19:04:06 compute-1 ceph-mon[75484]: from='client.28217 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1713804743' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Sep 30 19:04:06 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1941053519' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Sep 30 19:04:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:06 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:06 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Sep 30 19:04:06 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3466285778' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Sep 30 19:04:06 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:06 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #142. Immutable memtables: 0.
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.576272) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:856] [default] [JOB 89] Flushing memtable with next log file: 142
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759259046576308, "job": 89, "event": "flush_started", "num_memtables": 1, "num_entries": 1911, "num_deletes": 257, "total_data_size": 4390580, "memory_usage": 4461016, "flush_reason": "Manual Compaction"}
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:885] [default] [JOB 89] Level-0 flush table #143: started
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759259046588348, "cf_name": "default", "job": 89, "event": "table_file_creation", "file_number": 143, "file_size": 2857143, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 67960, "largest_seqno": 69866, "table_properties": {"data_size": 2849148, "index_size": 4680, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 18239, "raw_average_key_size": 20, "raw_value_size": 2832508, "raw_average_value_size": 3207, "num_data_blocks": 204, "num_entries": 883, "num_filter_entries": 883, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759258891, "oldest_key_time": 1759258891, "file_creation_time": 1759259046, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 143, "seqno_to_time_mapping": "N/A"}}
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 89] Flush lasted 12118 microseconds, and 5749 cpu microseconds.
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.588387) [db/flush_job.cc:967] [default] [JOB 89] Level-0 flush table #143: 2857143 bytes OK
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.588406) [db/memtable_list.cc:519] [default] Level-0 commit table #143 started
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.590163) [db/memtable_list.cc:722] [default] Level-0 commit table #143: memtable #1 done
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.590174) EVENT_LOG_v1 {"time_micros": 1759259046590171, "job": 89, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.590192) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 89] Try to delete WAL files size 4381650, prev total WAL file size 4381650, number of live WAL files 2.
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000139.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.591183) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032353131' seq:72057594037927935, type:22 .. '6C6F676D0032373634' seq:0, type:0; will stop at (end)
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 90] Compacting 1@0 + 1@6 files to L6, score -1.00
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 89 Base level 0, inputs: [143(2790KB)], [141(10MB)]
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759259046591295, "job": 90, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [143], "files_L6": [141], "score": -1, "input_data_size": 13605233, "oldest_snapshot_seqno": -1}
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 90] Generated table #144: 8802 keys, 13457809 bytes, temperature: kUnknown
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759259046676504, "cf_name": "default", "job": 90, "event": "table_file_creation", "file_number": 144, "file_size": 13457809, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13404900, "index_size": 29820, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22021, "raw_key_size": 233649, "raw_average_key_size": 26, "raw_value_size": 13253581, "raw_average_value_size": 1505, "num_data_blocks": 1153, "num_entries": 8802, "num_filter_entries": 8802, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1759253879, "oldest_key_time": 0, "file_creation_time": 1759259046, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2bf561c2-71cd-475c-b1c0-9f13ad2b054d", "db_session_id": "EOVNASF3CCDBD5EL5S5F", "orig_file_number": 144, "seqno_to_time_mapping": "N/A"}}
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.676861) [db/compaction/compaction_job.cc:1663] [default] [JOB 90] Compacted 1@0 + 1@6 files to L6 => 13457809 bytes
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.678351) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 159.4 rd, 157.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 10.3 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(9.5) write-amplify(4.7) OK, records in: 9332, records dropped: 530 output_compression: NoCompression
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.678406) EVENT_LOG_v1 {"time_micros": 1759259046678368, "job": 90, "event": "compaction_finished", "compaction_time_micros": 85344, "compaction_time_cpu_micros": 51421, "output_level": 6, "num_output_files": 1, "total_output_size": 13457809, "num_input_records": 9332, "num_output_records": 8802, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000143.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759259046679398, "job": 90, "event": "table_file_deletion", "file_number": 143}
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000141.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: EVENT_LOG_v1 {"time_micros": 1759259046682648, "job": 90, "event": "table_file_deletion", "file_number": 141}
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.591058) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.682733) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.682740) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.682742) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.682744) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:04:06 compute-1 ceph-mon[75484]: rocksdb: (Original Log Time 2025/09/30-19:04:06.682745) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Sep 30 19:04:06 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Sep 30 19:04:06 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1769819485' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Sep 30 19:04:06 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:06 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:06 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:04:06.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:06 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Sep 30 19:04:06 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/795006569' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Sep 30 19:04:07 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:07 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:07 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:04:07.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Sep 30 19:04:07 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2502588670' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Sep 30 19:04:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4168390125' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Sep 30 19:04:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4248394770' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Sep 30 19:04:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3466285778' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Sep 30 19:04:07 compute-1 ceph-mon[75484]: pgmap v2582: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:04:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4087842539' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Sep 30 19:04:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1769819485' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Sep 30 19:04:07 compute-1 ceph-mon[75484]: from='client.20040 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:07 compute-1 ceph-mon[75484]: from='client.20044 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/795006569' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Sep 30 19:04:07 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2502588670' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Sep 30 19:04:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:07 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:07 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:07 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Sep 30 19:04:07 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4052718143' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Sep 30 19:04:07 compute-1 sudo[323924]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Sep 30 19:04:07 compute-1 sudo[323924]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:04:07 compute-1 sudo[323924]: pam_unix(sudo:session): session closed for user root
Sep 30 19:04:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Sep 30 19:04:07 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2145157587' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Sep 30 19:04:07 compute-1 nova_compute[238822]: 2025-09-30 19:04:07.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:04:07 compute-1 nova_compute[238822]: 2025-09-30 19:04:07.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:04:07 compute-1 nova_compute[238822]: 2025-09-30 19:04:07.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:04:07 compute-1 nova_compute[238822]: 2025-09-30 19:04:07.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:04:07 compute-1 nova_compute[238822]: 2025-09-30 19:04:07.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:04:07 compute-1 nova_compute[238822]: 2025-09-30 19:04:07.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:04:07 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Sep 30 19:04:07 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3814407398' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Sep 30 19:04:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd metadata"} v 0)
Sep 30 19:04:08 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2726757586' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Sep 30 19:04:08 compute-1 ceph-mon[75484]: from='client.20048 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:08 compute-1 ceph-mon[75484]: from='client.20052 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:08 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Sep 30 19:04:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/4052718143' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Sep 30 19:04:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2145157587' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Sep 30 19:04:08 compute-1 ceph-mon[75484]: from='client.20062 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3814407398' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Sep 30 19:04:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2221413528' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Sep 30 19:04:08 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2726757586' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Sep 30 19:04:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Sep 30 19:04:08 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/950578681' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Sep 30 19:04:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:08 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:08 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:08 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:08 compute-1 systemd[1]: Starting Hostname Service...
Sep 30 19:04:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd utilization"} v 0)
Sep 30 19:04:08 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2325417661' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Sep 30 19:04:08 compute-1 systemd[1]: Started Hostname Service.
Sep 30 19:04:08 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Sep 30 19:04:08 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2981551538' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Sep 30 19:04:08 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:08 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:08 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:04:08.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:09 compute-1 nova_compute[238822]: 2025-09-30 19:04:09.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:04:09 compute-1 nova_compute[238822]: 2025-09-30 19:04:09.057 2 DEBUG nova.compute.manager [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 19:04:09 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:09 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000026s ======
Sep 30 19:04:09 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:04:09.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Sep 30 19:04:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Sep 30 19:04:09 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1952010579' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Sep 30 19:04:09 compute-1 ceph-mon[75484]: from='client.20072 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/950578681' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Sep 30 19:04:09 compute-1 ceph-mon[75484]: pgmap v2583: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:04:09 compute-1 ceph-mon[75484]: from='client.20078 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2325417661' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Sep 30 19:04:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/793667937' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Sep 30 19:04:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2981551538' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Sep 30 19:04:09 compute-1 ceph-mon[75484]: from='client.20088 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/643641839' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Sep 30 19:04:09 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1952010579' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Sep 30 19:04:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:09 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:09 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:09 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:09 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:04:09 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Sep 30 19:04:09 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Sep 30 19:04:10 compute-1 nova_compute[238822]: 2025-09-30 19:04:10.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:04:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:10 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:10 compute-1 ceph-mon[75484]: from='client.28269 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:10 compute-1 ceph-mon[75484]: from='client.20100 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:10 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/767955293' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Sep 30 19:04:10 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Sep 30 19:04:10 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Sep 30 19:04:10 compute-1 ceph-mon[75484]: from='client.28277 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:10 compute-1 ceph-mon[75484]: from='client.28281 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:10 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Sep 30 19:04:10 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Sep 30 19:04:10 compute-1 ceph-mon[75484]: from='client.28291 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:10 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3040008997' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Sep 30 19:04:10 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:10 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "quorum_status"} v 0)
Sep 30 19:04:10 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3004934123' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Sep 30 19:04:10 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:10 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:10 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:04:10.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:10 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "versions"} v 0)
Sep 30 19:04:10 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/715233239' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Sep 30 19:04:11 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:11 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:11 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:04:11.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:11 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:11 compute-1 ceph-mon[75484]: pgmap v2584: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:04:11 compute-1 ceph-mon[75484]: from='client.28297 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:11 compute-1 ceph-mon[75484]: from='client.20128 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/3004934123' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Sep 30 19:04:11 compute-1 ceph-mon[75484]: from='client.28305 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3076856318' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Sep 30 19:04:11 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/715233239' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Sep 30 19:04:11 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:11 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:11 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Sep 30 19:04:11 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2986511073' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Sep 30 19:04:11 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Sep 30 19:04:11 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1746201692' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Sep 30 19:04:12 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Sep 30 19:04:12 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Sep 30 19:04:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:12 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:12 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:12 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:12 compute-1 ceph-mon[75484]: from='client.28313 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:12 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2986511073' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Sep 30 19:04:12 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/412535743' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Sep 30 19:04:12 compute-1 ceph-mon[75484]: from='client.28321 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:12 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1746201692' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Sep 30 19:04:12 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3684697327' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Sep 30 19:04:12 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Sep 30 19:04:12 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Sep 30 19:04:12 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Sep 30 19:04:12 compute-1 ceph-mon[75484]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Sep 30 19:04:12 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/450248131' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Sep 30 19:04:12 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "config dump"} v 0)
Sep 30 19:04:12 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/18398888' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Sep 30 19:04:12 compute-1 nova_compute[238822]: 2025-09-30 19:04:12.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:04:12 compute-1 nova_compute[238822]: 2025-09-30 19:04:12.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:04:12 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:12 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:12 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:04:12.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:13 compute-1 nova_compute[238822]: 2025-09-30 19:04:13.057 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:04:13 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:13 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.001000027s ======
Sep 30 19:04:13 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:04:13.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Sep 30 19:04:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:13 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Sep 30 19:04:13 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2489630526' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Sep 30 19:04:13 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:13 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:13 compute-1 ceph-mon[75484]: from='client.28329 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Sep 30 19:04:13 compute-1 ceph-mon[75484]: pgmap v2585: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 511 B/s rd, 0 op/s
Sep 30 19:04:13 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/18398888' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Sep 30 19:04:13 compute-1 ceph-mon[75484]: from='client.20156 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:13 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/2357266147' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Sep 30 19:04:13 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2489630526' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Sep 30 19:04:13 compute-1 podman[324651]: 2025-09-30 19:04:13.538443645 +0000 UTC m=+0.075957638 container health_status fad4cee20567e69fc36cb1f5078f4aa449d2897763f57c361a67ee5cef4b26d0 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 19:04:13 compute-1 podman[324648]: 2025-09-30 19:04:13.603442187 +0000 UTC m=+0.139445009 container health_status 93e95faf2fa432e956f75d01cdf6e6b76601272e12cccc6133dd68116a068dcc (image=38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 19:04:13 compute-1 sudo[324678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Sep 30 19:04:13 compute-1 sudo[324678]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:04:13 compute-1 sudo[324678]: pam_unix(sudo:session): session closed for user root
Sep 30 19:04:13 compute-1 sudo[324744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/63d32c6a-fa18-54ed-8711-9a3915cc367b/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Sep 30 19:04:13 compute-1 sudo[324744]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Sep 30 19:04:13 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df"} v 0)
Sep 30 19:04:13 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2642681482' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Sep 30 19:04:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "fs dump"} v 0)
Sep 30 19:04:14 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2968927765' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Sep 30 19:04:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:14 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:14 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:14 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Sep 30 19:04:14 compute-1 ceph-mon[75484]: from='client.28349 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:14 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/1192387500' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Sep 30 19:04:14 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2642681482' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Sep 30 19:04:14 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2968927765' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Sep 30 19:04:14 compute-1 sudo[324744]: pam_unix(sudo:session): session closed for user root
Sep 30 19:04:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Sep 30 19:04:14 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/412147261' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Sep 30 19:04:14 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "fs ls"} v 0)
Sep 30 19:04:14 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2257068152' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Sep 30 19:04:14 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:14 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:14 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:04:14.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:15 compute-1 nova_compute[238822]: 2025-09-30 19:04:15.053 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:04:15 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:15 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:15 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:04:15.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:15 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:15 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:15 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:15 compute-1 ceph-mon[75484]: from='client.20172 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:15 compute-1 ceph-mon[75484]: pgmap v2586: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 767 B/s rd, 0 op/s
Sep 30 19:04:15 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/412147261' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Sep 30 19:04:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 19:04:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Sep 30 19:04:15 compute-1 ceph-mon[75484]: pgmap v2587: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 604 B/s rd, 0 op/s
Sep 30 19:04:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:04:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' 
Sep 30 19:04:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Sep 30 19:04:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Sep 30 19:04:15 compute-1 ceph-mon[75484]: from='mgr.14526 192.168.122.100:0/3226362960' entity='mgr.compute-0.efvthf' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Sep 30 19:04:15 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2257068152' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Sep 30 19:04:15 compute-1 ceph-mon[75484]: from='client.20178 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:15 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mds stat"} v 0)
Sep 30 19:04:15 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/494616576' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Sep 30 19:04:16 compute-1 nova_compute[238822]: 2025-09-30 19:04:16.056 2 DEBUG oslo_service.periodic_task [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 19:04:16 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "mon dump"} v 0)
Sep 30 19:04:16 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/6242083' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Sep 30 19:04:16 compute-1 sshd-session[324900]: Invalid user debian from 192.210.160.141 port 55462
Sep 30 19:04:16 compute-1 sshd-session[324900]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 19:04:16 compute-1 sshd-session[324900]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.210.160.141
Sep 30 19:04:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:16 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:16 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:16 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:16 compute-1 podman[325103]: 2025-09-30 19:04:16.510157445 +0000 UTC m=+0.120904292 container health_status 64a4fdc0a623fe0554226c6077ff73fa60cb151ef80b3b86d399afa387a3ae09 (image=38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.129.56.221:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Sep 30 19:04:16 compute-1 ceph-mon[75484]: from='client.28375 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:16 compute-1 ceph-mon[75484]: from='client.20182 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:16 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/494616576' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Sep 30 19:04:16 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/377434133' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Sep 30 19:04:16 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/4218910979' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Sep 30 19:04:16 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/6242083' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Sep 30 19:04:16 compute-1 nova_compute[238822]: 2025-09-30 19:04:16.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:04:16 compute-1 nova_compute[238822]: 2025-09-30 19:04:16.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:04:16 compute-1 nova_compute[238822]: 2025-09-30 19:04:16.574 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 19:04:16 compute-1 nova_compute[238822]: 2025-09-30 19:04:16.574 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 19:04:16 compute-1 nova_compute[238822]: 2025-09-30 19:04:16.575 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:04:16 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:16 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:16 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.100 - anonymous [30/Sep/2025:19:04:16.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:16 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Sep 30 19:04:16 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2932306974' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Sep 30 19:04:17 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Sep 30 19:04:17 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1447958296' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:04:17 compute-1 nova_compute[238822]: 2025-09-30 19:04:17.095 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:04:17 compute-1 radosgw[84864]: ====== starting new request req=0x7fda5f0115d0 =====
Sep 30 19:04:17 compute-1 radosgw[84864]: ====== req done req=0x7fda5f0115d0 op status=0 http_status=200 latency=0.000000000s ======
Sep 30 19:04:17 compute-1 radosgw[84864]: beast: 0x7fda5f0115d0: 192.168.122.101 - anonymous [30/Sep/2025:19:04:17.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Sep 30 19:04:17 compute-1 nova_compute[238822]: 2025-09-30 19:04:17.294 2 WARNING nova.virt.libvirt.driver [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 19:04:17 compute-1 nova_compute[238822]: 2025-09-30 19:04:17.296 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 19:04:17 compute-1 nova_compute[238822]: 2025-09-30 19:04:17.324 2 DEBUG oslo_concurrency.processutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 19:04:17 compute-1 nova_compute[238822]: 2025-09-30 19:04:17.325 2 DEBUG nova.compute.resource_tracker [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4410MB free_disk=39.992183685302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 19:04:17 compute-1 nova_compute[238822]: 2025-09-30 19:04:17.326 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 19:04:17 compute-1 nova_compute[238822]: 2025-09-30 19:04:17.327 2 DEBUG oslo_concurrency.lockutils [None req-6297af57-0bc4-473c-aa66-a0832ab4b702 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 19:04:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:17 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:17 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:17 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:17 compute-1 ceph-mon[75484]: from='client.20200 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:17 compute-1 ceph-mon[75484]: from='client.28389 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:17 compute-1 ceph-mon[75484]: pgmap v2588: 353 pgs: 353 active+clean; 41 MiB data, 405 MiB used, 40 GiB / 40 GiB avail; 604 B/s rd, 0 op/s
Sep 30 19:04:17 compute-1 ceph-mon[75484]: from='client.20204 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Sep 30 19:04:17 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/2932306974' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Sep 30 19:04:17 compute-1 ceph-mon[75484]: from='client.? 192.168.122.101:0/1447958296' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:04:17 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/946550237' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Sep 30 19:04:17 compute-1 ceph-mon[75484]: from='client.? 192.168.122.100:0/3656320152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Sep 30 19:04:17 compute-1 nova_compute[238822]: 2025-09-30 19:04:17.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:04:17 compute-1 nova_compute[238822]: 2025-09-30 19:04:17.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 19:04:17 compute-1 nova_compute[238822]: 2025-09-30 19:04:17.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 19:04:17 compute-1 nova_compute[238822]: 2025-09-30 19:04:17.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:04:17 compute-1 nova_compute[238822]: 2025-09-30 19:04:17.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 19:04:17 compute-1 nova_compute[238822]: 2025-09-30 19:04:17.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 19:04:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd dump"} v 0)
Sep 30 19:04:18 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/965230135' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Sep 30 19:04:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-rgw-default-compute-1-wuqpyu[87408]: Tue Sep 30 19:04:18 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:18 compute-1 ceph-63d32c6a-fa18-54ed-8711-9a3915cc367b-keepalived-nfs-cephfs-compute-1-zmigik[86832]: Tue Sep 30 19:04:18 2025: (VI_0) received an invalid passwd!
Sep 30 19:04:18 compute-1 ceph-mon[75484]: mon.compute-1@1(peon) e2 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Sep 30 19:04:18 compute-1 ceph-mon[75484]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/96474844' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Sep 30 19:04:18 compute-1 sshd-session[324900]: Failed password for invalid user debian from 192.210.160.141 port 55462 ssh2
